WorldWideScience

Sample records for generation time hypothesis

  1. Physiologic time: A hypothesis

    Science.gov (United States)

    West, Damien; West, Bruce J.

    2013-06-01

    The scaling of respiratory metabolism with body size in animals is considered by many to be a fundamental law of nature. One apparent consequence of this law is the scaling of physiologic time with body size, implying that physiologic time is separate and distinct from clock time. Physiologic time is manifest in allometry relations for lifespans, cardiac cycles, blood volume circulation, respiratory cycle, along with a number of other physiologic phenomena. Herein we present a theory of physiologic time that explains the allometry relation between time and total body mass averages as entailed by the hypothesis that the fluctuations in the total body mass are described by a scaling probability density.
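The allometric scaling of physiologic time with body mass can be illustrated with a simple quarter-power toy model. Everything below is an assumption for illustration: the exponent b = 0.25 is the classic textbook quarter-power value and the function and prefactor are invented here; the paper itself derives the time-mass relation from a scaling probability density, not from this formula.

```python
# Toy quarter-power allometry: physiologic time t = a * M**b.
# ASSUMPTION: b = 0.25 (classic quarter-power exponent); a = 1 arbitrary units.
def physiologic_time(mass_kg, a=1.0, b=0.25):
    """Characteristic physiologic time (arbitrary units) for a body mass in kg."""
    return a * mass_kg ** b

mouse, elephant = 0.025, 5000.0  # illustrative body masses in kg
ratio = physiologic_time(elephant) / physiologic_time(mouse)
# (5000 / 0.025) ** 0.25 ≈ 21: the elephant's characteristic physiologic times
# (heartbeat, breath) come out roughly twenty-fold longer than the mouse's,
# even though its mass is 200,000 times greater.
print(round(ratio, 1))
```

This is the sense in which physiologic time is "separate and distinct from clock time": equal numbers of heartbeats or breaths span very different amounts of clock time across body sizes.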

  2. Testing One Hypothesis Multiple times

    OpenAIRE

    Algeri, Sara; van Dyk, David A.

    2017-01-01

    Hypothesis testing in presence of a nuisance parameter that is only identifiable under the alternative is challenging in part because standard asymptotic results (e.g., Wilks theorem for the generalized likelihood ratio test) do not apply. Several solutions have been proposed in the statistical literature and their practical implementation often reduces the problem into one of Testing One Hypothesis Multiple times (TOHM). Specifically, a fine discretization of the space of the non-identifiabl...
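The TOHM idea sketched above, scanning a grid of values of the non-identifiable nuisance parameter and calibrating the maximum local test statistic globally, can be mocked up with a Monte Carlo null simulation. The setup below is an illustrative assumption, not the paper's method: independent chi-square local statistics (the actual approach accounts for correlation between nearby grid points), a 50-point grid, and an arbitrary observed maximum of 9.0.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

K = 50           # grid over the nuisance parameter (identifiable only under H1)
n_sim = 20_000   # Monte Carlo replicates under the null

# ASSUMPTION: under H0 each local statistic ~ chi2(1), independent across the
# grid; taking the maximum over the grid is the "one hypothesis, multiple
# times" step that must be calibrated globally.
local_stats = rng.chisquare(df=1, size=(n_sim, K))
max_stats = local_stats.max(axis=1)

obs = 9.0                                    # hypothetical observed maximum
p_local = math.erfc(math.sqrt(obs / 2))      # P(chi2_1 >= obs), naive local p
global_p = float((max_stats >= obs).mean())  # Monte Carlo global p-value
```

The simulation shows why the naive local p-value overstates significance: the global p-value is far larger than the local one, while still respecting the Bonferroni bound K * p_local.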

  3. Unaware Memory in Hypothesis Generation Tasks

    Science.gov (United States)

    1986-12-01

distinguished two forms of memory: deliberate recollection of prior events versus the unaware influence of prior events on the performance of a later task ... attempts to remember information. The findings reported here contribute in particular to our understanding of the memory processes involved in hypothesis ... influenced by prior exposure to relevant events. Indeed, since the prior events themselves often cannot be consciously retrieved, this latter form of memory

  4. Learning-Related Changes in Adolescents' Neural Networks during Hypothesis-Generating and Hypothesis-Understanding Training

    Science.gov (United States)

    Lee, Jun-Ki; Kwon, Yongju

    2012-01-01

    Fourteen science high school students participated in this study, which investigated neural-network plasticity associated with hypothesis-generating and hypothesis-understanding in learning. The students were divided into two groups and participated in either hypothesis-generating or hypothesis-understanding type learning programs, which were…

  5. Decision Environment and Heuristics in Individual and Collective Hypothesis Generation

    Science.gov (United States)

    2017-11-01

making analysis. Organizational Behavior and Human Decision Processes, 69, 149-163. Mueller, S. T., & Piper, B. J. (2014). The Psychology... U.S. Army Research Institute for the Behavioral and Social Sciences, Research Report 2013... Decision Environment and Heuristics in Individual and Collective Hypothesis Generation. Drew A. Leins, Jim Leonard, Laura A. Zimmerman, Applied Research Associates

  6. Organization of haemopoietic stem cells: the generation-age hypothesis

    International Nuclear Information System (INIS)

    Rosendaal, M.; Hodgson, G.S.; Bradley, T.R.

    1978-01-01

This paper proposes that the previous division history of each stem cell is one determinant of the functional organisation of the haemopoietic stem cell population. Older stem cells are used to form blood before younger ones. The stem-cell-generating capacity of a lineage is finite; cells are eventually lost to the system by forming two committed precursors of the cell lines, and the next-oldest stem cell takes over. Hence the proposed term 'generation-age hypothesis', supported by experimental evidence. Older stem cells from normal bone marrow and 13-day foetal liver were stripped away with phase-specific drugs, revealing a younger population of stem cells with three- to four-fold greater stem-cell-generating capacity. Normal stem cells aged by continuous irradiation and serial retransplantation had eight-fold reduced generating capacity. That of stem cells in the bloodstream was half to a quarter that of normal bone marrow stem cells. There were some circulating stem cells, identified by reaction to brain-associated antigen (positive for 75% of normal femoral stem cells but not their progeny), whose capacity for stem cell generation was an eighth to one fortieth that of normal cells. (U.K.)

  7. The linear hypothesis - an idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

The linear no-threshold hypothesis is the basis for radiation protection standards in the United States. In the words of the National Council on Radiation Protection and Measurements (NCRP), the hypothesis is: "In the interest of estimating effects in humans conservatively, it is not unreasonable to follow the assumption of a linear relationship between dose and effect in the low dose regions for which direct observational data are not available." The International Commission on Radiological Protection (ICRP) stated the hypothesis in a slightly different manner: "One such basic assumption ... is that ... there is ... a linear relationship without threshold between dose and the probability of an effect." The hypothesis was necessary 50 yr ago when it was first enunciated because the dose-effect curve for ionizing radiation for effects in humans was not known. The ICRP and NCRP needed a model to extrapolate high-dose effects to low-dose effects. So the linear no-threshold hypothesis was born. Certain details of the history of the development and use of the linear hypothesis are presented. In particular, use of the hypothesis by the U.S. regulatory agencies is examined. Over time, the sense of the hypothesis has been corrupted. The corruption of the hypothesis into the current paradigm of "a little radiation, no matter how small, can and will harm you" is presented. The reasons the corruption occurred are proposed. The effects of the corruption are enumerated, specifically, the use of the corruption by the antinuclear forces in the United States and some of the huge costs to U.S. taxpayers due to the corruption. An alternative basis for radiation protection standards to assure public safety, based on the weight of scientific evidence on radiation health effects, is proposed.

  8. Synchronization and phonological skills: precise auditory timing hypothesis (PATH)

    Directory of Open Access Journals (Sweden)

Adam Tierney

    2014-11-01

Phonological skills are enhanced by music training, but the mechanisms enabling this cross-domain enhancement remain unknown. To explain this cross-domain transfer, we propose a precise auditory timing hypothesis (PATH) whereby entrainment practice is the core mechanism underlying enhanced phonological abilities in musicians. Both rhythmic synchronization and language skills such as consonant discrimination, detection of word and phrase boundaries, and conversational turn-taking rely on the perception of extremely fine-grained timing details in sound. Auditory-motor timing is an acoustic feature which meets all five of the pre-conditions necessary for cross-domain enhancement to occur (Patel 2011, 2012, 2014). There is overlap between the neural networks that process timing in the context of both music and language. Entrainment to music demands more precise timing sensitivity than does language processing. Moreover, auditory-motor timing integration captures the emotion of the trainee, is repeatedly practiced, and demands focused attention. The precise auditory timing hypothesis predicts that musical training emphasizing entrainment will be particularly effective in enhancing phonological skills.

  9. Does the stress generation hypothesis apply to eating disorders?: an examination of stress generation in eating, depressive, and anxiety symptoms.

    Science.gov (United States)

    Bodell, Lindsay P; Hames, Jennifer L; Holm-Denoma, Jill M; Smith, April R; Gordon, Kathryn H; Joiner, Thomas E

    2012-12-15

    The stress generation hypothesis posits that individuals actively contribute to stress in their lives. Although stress generation has been studied frequently in the context of depression, few studies have examined whether this stress generation process is unique to depression or whether it occurs in other disorders. Although evidence suggests that stress contributes to the development of eating disorders, it is unclear whether eating disorders contribute to subsequent stress. A prospective design was used to examine the influence of eating disorder symptoms on negative life stressors. Two hundred and ninety female undergraduates completed questionnaires at two time points that examined eating disorder, depressive and anxiety symptoms and the presence of negative life events. Regression analyses found that while eating disorder symptoms (i.e. bulimic symptoms and drive for thinness) were independent, significant predictors of negative life events, they did not predict negative life events above and beyond symptoms of depression. Limitations include the use of self-report measures and a college-based sample, which may limit generalizability of the results. Findings suggest that if stress generation is present in individuals with symptoms of eating disorders, it is likely attributable to symptoms of depression. Thus, it may be important for clinicians to target depressive symptoms in order to reduce the frequency of negative life stressors among individuals with eating disorders. Copyright © 2012 Elsevier B.V. All rights reserved.
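The incremental-prediction question at the heart of this abstract, whether eating-disorder symptoms predict later stressors above and beyond depressive symptoms, amounts to comparing nested regression models. The sketch below uses synthetic data invented purely for illustration (not the study's data); the variable names, sample size, and effect sizes are assumptions chosen so that the eating-disorder variable adds no unique prediction.

```python
import numpy as np

rng = np.random.default_rng(1)

# ASSUMED synthetic data: 290 participants; bulimic symptoms correlate with
# depression, but only depression drives later negative life events.
n = 290
depression = rng.normal(size=n)
bulimic = 0.6 * depression + 0.8 * rng.normal(size=n)
events = 0.5 * depression + rng.normal(size=n)

def r_squared(X, y):
    """R^2 of an OLS fit with intercept (X may be 1-D or 2-D)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_dep = r_squared(depression, events)                              # depression alone
r2_both = r_squared(np.column_stack([depression, bulimic]), events) # add bulimic symptoms
increment = r2_both - r2_dep  # near zero: no prediction beyond depression
```

When the increment in R^2 from adding the eating-disorder predictor is negligible, the nested comparison mirrors the study's conclusion: any apparent stress generation is attributable to the shared depressive component.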

  10. The linear hypothesis: An idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

    This paper attempts to present a clear idea of what the linear (no-threshold) hypothesis (LH) is, how it was corrupted and what happened to the nuclear industry as a result, and one possible solution to this major problem for the nuclear industry. The corruption lies in the change of the LH from ''a little radiation MAY produce harm'' to ''low doses of radiation WILL KILL you.'' The result has been the retardation of the nuclear industry in the United States, although the industry is one of the safest, if not the safest industry. It is suggested to replace the LH with two sets of standards, one having to do with human and environmental health and safety, and the other (more stringent) for protection of manufactured items and premises. The safety standard could be some dose such as 5 rem/year. This would do away with the ALARA concept below the annual limit and with the collective dose at low doses. Benefits of the two-tier radiation standards system would be the alleviation of the public fear of radiation and the health of the nuclear industry

  11. Structuring and extracting knowledge for the support of hypothesis generation in molecular biology

    NARCIS (Netherlands)

    Roos, M.; Marshall, M.S.; Gibson, A.P.; Schuemie, M.; Meij, E.; Katrenko, S.; van Hage, W.R.; Krommydas, K.; Adriaans, P.W.

    2009-01-01

    Background: Hypothesis generation in molecular and cellular biology is an empirical process in which knowledge derived from prior experiments is distilled into a comprehensible model. The requirement of automated support is exemplified by the difficulty of considering all relevant facts that are

  12. Differences in Brain Information Transmission between Gifted and Normal Children during Scientific Hypothesis Generation

    Science.gov (United States)

    Jin, Seung-Hyun; Kwon, Yong-Ju; Jeong, Jin-Su; Kwon, Suk-Won; Shin, Dong-Hoon

    2006-01-01

The purpose of the present study was to investigate differences in neural information transmission between gifted and normal children involved in scientific hypothesis generation. To investigate changes in the amount of information transmission, the children's averaged cross-mutual information (A-CMI) of EEGs was estimated during their generation…

  13. A Reasoning And Hypothesis-Generation Framework Based On Scalable Graph Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Sukumar, Sreenivas Rangan [ORNL

    2016-01-01

Finding actionable insights from data has always been difficult. As the scale and forms of data increase tremendously, the task of finding value becomes even more challenging. Data scientists at Oak Ridge National Laboratory are leveraging unique leadership infrastructure (e.g. Urika-XA and Urika-GD appliances) to develop scalable algorithms for semantic, logical and statistical reasoning with unstructured Big Data. We present the deployment of such a framework called ORiGAMI (Oak Ridge Graph Analytics for Medical Innovations) on the National Library of Medicine's SEMANTIC Medline (archive of medical knowledge since 1994). Medline contains over 70 million knowledge nuggets published in 23.5 million papers in medical literature, with thousands more added daily. ORiGAMI is available as an open-science medical hypothesis generation tool - both as a web-service and an application programming interface (API) at http://hypothesis.ornl.gov. Since becoming an online service, ORiGAMI has enabled clinical subject-matter experts to: (i) discover the relationship between beta-blocker treatment and diabetic retinopathy; (ii) hypothesize that xylene is an environmental carcinogen and (iii) aid doctors with diagnosis of challenging cases when rare diseases manifest with common symptoms. In 2015, ORiGAMI was featured in the Historical Clinical Pathological Conference in Baltimore as a demonstration of artificial intelligence in medicine, at IEEE/ACM Supercomputing, and was recognized as a Centennial Showcase Exhibit at the Radiological Society of North America (RSNA) Conference in Chicago. The final paper will describe the workflow built for the Cray Urika-XA and Urika-GD appliances that is able to reason with the knowledge of every published medical paper every time a clinical researcher uses the tool.

  14. Marriage timing over the generations

    NARCIS (Netherlands)

    van Poppel, F.W.A.; Monden, C.; Mandemakers, K.

    2008-01-01

    Strong relationships have been hypothesized between the timing of marriage and the familial environment of the couple. Sociologists have identified various mechanisms via which the age at marriage in the parental generation might be related to the age at marriage of the children. In our paper we

  15. ePlant: Visualizing and Exploring Multiple Levels of Data for Hypothesis Generation in Plant Biology.

    Science.gov (United States)

    Waese, Jamie; Fan, Jim; Pasha, Asher; Yu, Hans; Fucile, Geoffrey; Shi, Ruian; Cumming, Matthew; Kelley, Lawrence A; Sternberg, Michael J; Krishnakumar, Vivek; Ferlanti, Erik; Miller, Jason; Town, Chris; Stuerzlinger, Wolfgang; Provart, Nicholas J

    2017-08-01

    A big challenge in current systems biology research arises when different types of data must be accessed from separate sources and visualized using separate tools. The high cognitive load required to navigate such a workflow is detrimental to hypothesis generation. Accordingly, there is a need for a robust research platform that incorporates all data and provides integrated search, analysis, and visualization features through a single portal. Here, we present ePlant (http://bar.utoronto.ca/eplant), a visual analytic tool for exploring multiple levels of Arabidopsis thaliana data through a zoomable user interface. ePlant connects to several publicly available web services to download genome, proteome, interactome, transcriptome, and 3D molecular structure data for one or more genes or gene products of interest. Data are displayed with a set of visualization tools that are presented using a conceptual hierarchy from big to small, and many of the tools combine information from more than one data type. We describe the development of ePlant in this article and present several examples illustrating its integrative features for hypothesis generation. We also describe the process of deploying ePlant as an "app" on Araport. Building on readily available web services, the code for ePlant is freely available for any other biological species research. © 2017 American Society of Plant Biologists. All rights reserved.

  16. PRECISION TIME-DELAY GENERATOR

    Science.gov (United States)

    Carr, B.J.; Peckham, V.D.

    1959-06-16

A precision time-delay generator circuit with low jitter is described. The first thyratron has a series resonant circuit and a diode which is connected to the second thyratron. The first thyratron is triggered at the beginning of a time delay and a capacitor is discharged through the first thyratron and the diode, thereby triggering the second thyratron. (T.R.H.)

  17. Real-time scene generator

    Science.gov (United States)

    Lord, Eric; Shand, David J.; Cantle, Allan J.

    1996-05-01

This paper describes the techniques which have been developed for an infra-red (IR) target, countermeasure and background image generation system working in real time for HWIL and trial-proving applications. Operation is in the 3 to 5 and 8 to 14 micron bands. The system may be used to drive a scene projector (otherwise known as a thermal picture synthesizer) or for direct injection into equipment under test. The provision of realistic IR target and countermeasure trajectories and signatures, within representative backgrounds, enables the full performance envelope of a missile system to be evaluated. It also enables an operational weapon system to be proven in a trials environment without compromising safety. The most significant technique developed has been that of line-by-line synthesis. This minimizes the processing delays to the equivalent of 1.5 frames from input of target and sightline positions to the completion of an output image scan. Using this technique a scene generator has been produced for full closed-loop HWIL performance analysis for the development of an air-to-air missile system. Performance of the synthesis system is as follows: 256 * 256 pixels per frame; 350 target polygons per frame; 100 Hz frame rate; and Gouraud shading, simple reflections, variable geometry targets and atmospheric scaling. A system using a similar technique has also been used for direct insertion into the video path of a ground-to-air weapon system in live firing trials. This has provided realistic targets without degrading the closed-loop performance. Delay of the modified video signal has been kept to less than 5 lines. The technique has been developed using a combination of 4 high-speed Intel i860 RISC processors in parallel with the 4000 series XILINX field programmable gate arrays (FPGAs). Start and end conditions for each line of target pixels are prepared and ordered in the i860. The merging with background pixels and output shading and scaling is then carried out in

  18. Testing Munk's hypothesis for submesoscale eddy generation using observations in the North Atlantic

    Science.gov (United States)

    Buckingham, Christian E.; Khaleel, Zammath; Lazar, Ayah; Martin, Adrian P.; Allen, John T.; Naveira Garabato, Alberto C.; Thompson, Andrew F.; Vic, Clément

    2017-08-01

A high-resolution satellite image that reveals a train of coherent, submesoscale (6 km) vortices along the edge of an ocean front is examined in concert with hydrographic measurements in an effort to understand formation mechanisms of the submesoscale eddies. The infrared satellite image consists of ocean surface temperatures at ~390 m resolution over the midlatitude North Atlantic (48.69°N, 16.19°W). Concomitant altimetric observations coupled with regular spacing of the eddies suggest the eddies result from mesoscale stirring, filamentation, and subsequent frontal instability. While horizontal shear or barotropic instability (BTI) is one mechanism for generating such eddies (Munk's hypothesis), we conclude from linear theory coupled with the in situ data that mixed layer or submesoscale baroclinic instability (BCI) is a more plausible explanation for the observed submesoscale vortices. Here we assume that the frontal disturbance remains in its linear growth stage and is accurately described by linear dynamics. This result likely has greater applicability to the open ocean, i.e., regions where the gradient Rossby number is reduced relative to its value along coasts and within strong current systems. Given that such waters comprise an appreciable percentage of the ocean surface and that energy and buoyancy fluxes differ under BTI and BCI, this result has wider implications for open-ocean energy/buoyancy budgets and parameterizations within ocean general circulation models. In summary, this work provides rare observational evidence of submesoscale eddy generation by BCI in the open ocean. Plain Language Summary: Here, we test Munk's theory for small-scale eddy generation using a unique set of satellite- and ship-based observations. We find that for one particular set of observations in the North Atlantic, the mechanism for eddy generation is not pure horizontal shear, as proposed by Munk et al. () and Munk (), but is instead vertical shear, or baroclinic instability.

  19. Determining the Effects of Cognitive Style, Problem Complexity, and Hypothesis Generation on the Problem Solving Ability of School-Based Agricultural Education Students

    Science.gov (United States)

    Blackburn, J. Joey; Robinson, J. Shane

    2016-01-01

    The purpose of this experimental study was to assess the effects of cognitive style, problem complexity, and hypothesis generation on the problem solving ability of school-based agricultural education students. Problem solving ability was defined as time to solution. Kirton's Adaption-Innovation Inventory was employed to assess students' cognitive…

  20. "Paleoseismograms": Testing a Hypothesis of Source-Time Function Recording of Paleoearthquakes

    Science.gov (United States)

    Garrett, A.; Goldfinger, C.; Patton, J. R.; Morey, A. E.

    2011-12-01

profiles of the deposits, reflecting considerable detail of the input source. We successfully simulated input perturbations of single and multiple input energy peaks of varying lengths and flow hydrographs, as well as simulated hyperpycnal flows, with consistent results across a wide range of timing and parameter scaling variations. We conclude that turbidite records may record information about the triggering event. For earthquake sources, we suggest that a crude record of the source-time function of the earthquake may be recorded in the turbidite deposits. This hypothesis may explain the unusual correlation of isolated paleoseismic sites in Cascadia, the NSAF and Sumatra. If correct, turbidites under ideal conditions may record enough source information to be considered "paleoseismograms" recording information about paleo-rupture sequences on large fault systems. Data from turbidites found in land-locked inlets and sub-alpine Cascadia lakes are consistent with this experimental result.

  1. Diversity Generator Mechanisms Are Essential Components of Biological Systems: The Two Queen Hypothesis

    Directory of Open Access Journals (Sweden)

    Eric Muraille

    2018-02-01

Diversity is widely known to fuel adaptation and evolutionary processes and increase robustness at the population, species and ecosystem levels. The Neo-Darwinian paradigm proposes that the diversity of biological entities is the consequence of genetic changes arising spontaneously and randomly, without regard for their usefulness. However, a growing body of evidence demonstrates that the evolutionary process has shaped mechanisms, such as horizontal gene transfer mechanisms, meiosis and the adaptive immune system, which has resulted in the regulated generation of diversity among populations. Though their origins are unrelated, these diversity generator (DG) mechanisms share common functional properties. They (i) contribute to the great unpredictability of the composition and/or behavior of biological systems, (ii) favor robustness and collectivism among populations and (iii) operate mainly by manipulating the systems that control the interaction of living beings with their environment. The definition proposed here for DGs is based on these properties and can be used to identify them according to function. Interestingly, prokaryotic DGs appear to be mainly reactive, as they generate diversity in response to environmental stress. They are involved in the widely described Red Queen/arms race/Cairnsian dynamic. The emergence of multicellular organisms harboring K selection traits (longer reproductive life cycle and smaller population size) has led to the acquisition of a new class of DGs that act anticipatively to stress pressures and generate a distinct dynamic called the “White Queen” here. The existence of DGs leads to the view of evolution as a more “intelligent” and Lamarckian-like process. Their repeated selection during evolution could be a neglected example of convergent evolution and suggests that some parts of the evolutionary process are tightly constrained by ecological factors, such as the population size, the generation time and

  2. Diversity Generator Mechanisms Are Essential Components of Biological Systems: The Two Queen Hypothesis.

    Science.gov (United States)

    Muraille, Eric

    2018-01-01

    Diversity is widely known to fuel adaptation and evolutionary processes and increase robustness at the population, species and ecosystem levels. The Neo-Darwinian paradigm proposes that the diversity of biological entities is the consequence of genetic changes arising spontaneously and randomly, without regard for their usefulness. However, a growing body of evidence demonstrates that the evolutionary process has shaped mechanisms, such as horizontal gene transfer mechanisms, meiosis and the adaptive immune system, which has resulted in the regulated generation of diversity among populations. Though their origins are unrelated, these diversity generator (DG) mechanisms share common functional properties. They (i) contribute to the great unpredictability of the composition and/or behavior of biological systems, (ii) favor robustness and collectivism among populations and (iii) operate mainly by manipulating the systems that control the interaction of living beings with their environment. The definition proposed here for DGs is based on these properties and can be used to identify them according to function. Interestingly, prokaryotic DGs appear to be mainly reactive, as they generate diversity in response to environmental stress. They are involved in the widely described Red Queen/arms race/Cairnsian dynamic. The emergence of multicellular organisms harboring K selection traits (longer reproductive life cycle and smaller population size) has led to the acquisition of a new class of DGs that act anticipatively to stress pressures and generate a distinct dynamic called the "White Queen" here. The existence of DGs leads to the view of evolution as a more "intelligent" and Lamarckian-like process. Their repeated selection during evolution could be a neglected example of convergent evolution and suggests that some parts of the evolutionary process are tightly constrained by ecological factors, such as the population size, the generation time and the intensity of

  3. Probing the Hypothesis of SAR Continuity Restoration by the Removal of Activity Cliffs Generators in QSAR.

    Science.gov (United States)

    Cruz-Monteagudo, Maykel; Medina-Franco, José L; Perera-Sardiña, Yunier; Borges, Fernanda; Tejera, Eduardo; Paz-Y-Miño, Cesar; Pérez-Castillo, Yunierkis; Sánchez-Rodríguez, Aminael; Contreras-Posada, Zuleidys; Cordeiro, M Natália D S

    2016-01-01

In this work we report the first attempt to study the effect of activity cliffs on the generalization ability of machine learning (ML) based QSAR classifiers, using as a study case a previously reported diverse and noisy dataset focused on drug-induced liver injury (DILI) and more than 40 ML classification algorithms. Here, the hypothesis of structure-activity relationship (SAR) continuity restoration by activity cliff removal is tested as a potential solution to overcome such a limitation. Previously, a parallelism was established between activity cliff generators (ACGs) and instances that should be misclassified (ISMs), a related concept from the field of machine learning. Based on this concept we comparatively studied the classification performance of multiple machine learning classifiers, as well as the consensus classifier derived from predictive classifiers obtained from training sets including or excluding ACGs. The influence of the removal of ACGs from the training set on virtual screening performance was also studied for the respective consensus classifiers. In general terms, the removal of the ACGs from the training process slightly decreased the overall accuracy of the ML classifiers and multi-classifiers, improving their sensitivity (the weakest feature of ML classifiers trained with ACGs) but decreasing their specificity. Although these results do not support a positive effect of the removal of ACGs on the classification performance of ML classifiers, the "balancing effect" of ACG removal was shown to positively influence the virtual screening performance of multi-classifiers based on valid base ML classifiers. Notably, the early recognition ability was significantly favored after ACG removal. The results presented and discussed in this work represent the first step towards the application of a remedial solution to the activity cliffs problem in QSAR studies.

  4. [GABAergic hypothesis of epilepsy and clinical experience: controversial actions of the new generation gabamimetic antiepileptic drugs].

    Science.gov (United States)

    Chmielewska, B

    2000-01-01

Gamma-aminobutyric acid (GABA), the major inhibitory neurotransmitter in the CNS, reduces neuronal excitability through hyperpolarization. The GABAergic hypothesis of epileptogenesis influenced the development of a group of gabamimetic antiepileptic drugs (AEDs). Powerful conventional AEDs, barbiturates and benzodiazepines, can directly activate the GABA-A receptor, but their usefulness is limited by the development of dependence and tolerance to antiseizure activity. The second-generation AEDs have been achieved by rational synthesis of compounds that could mimic or augment the activity of endogenous GABA. Vigabatrin (VGB) irreversibly inhibits GABA-T activity, tiagabine (TGB) inhibits the GABA-reuptake system (GAT-1) and gabapentin (GPT) enhances GABA turnover in the CNS. New drugs with selective and specific influence on GABA neurotransmission are non-toxic and well-tolerated, but some side-effects (aggravation of seizures, visual field deficits and psychotic reactions) seem to be strictly connected with their pharmacodynamic properties. Absence and probably myoclonic seizures, noted in about 10% of patients under VGB, seem to be the result of disturbed GABA inhibition in thalamic interneurons and non-controlled hyperactivity of excitatory neocortex-thalamus-neocortex circuits. Perimetric examination might reveal a peripheral, persistent binasal visual field deficit in about 30% of patients treated with VGB. This is probably the effect of the cytotoxic influence of enormous accumulation of GABA in retinal neurons. Barbiturates and benzodiazepines can impair intellectual functioning and behaviour. Some emotional and reactive disturbances are more characteristic of the newer drugs. Serious depressive reactions and psychoses were observed in 12.5% and 2.5% of epileptics under VGB, respectively, and anecdotally after TGB or GPT therapy. Newer selective and specific gabamimetic AEDs play an essential role as add-on therapy for pharmaco-resistant epilepsy, but they did not bring

  5. Real-time hypothesis driven feature extraction on parallel processing architectures

    DEFF Research Database (Denmark)

    Granmo, O.-C.; Jensen, Finn Verner

    2002-01-01

    Feature extraction in content-based indexing of media streams is often computationally intensive. Typically, a parallel processing architecture is necessary for real-time performance when extracting features brute force. On the other hand, Bayesian-network-based systems for hypothesis-driven feature extraction, which selectively extract relevant features one-by-one, have in some cases achieved real-time performance on single-processing-element architectures. In this paper we propose a novel technique which combines the above two approaches. Features are selectively extracted in parallelizable sets … parallelizable feature sets in real time, in a goal-oriented fashion, even when some features are pairwise highly correlated and causally interacting in complex ways…

  6. Scientific Reasoning in Early and Middle Childhood: The Development of Domain-General Evidence Evaluation, Experimentation, and Hypothesis Generation Skills

    Science.gov (United States)

    Piekny, Jeanette; Maehler, Claudia

    2013-01-01

    According to Klahr's (2000, 2005; Klahr & Dunbar, 1988) Scientific Discovery as Dual Search model, inquiry processes require three cognitive components: hypothesis generation, experimentation, and evidence evaluation. The aim of the present study was to investigate (a) when the ability to evaluate perfect covariation, imperfect covariation,…

  7. Diamond's temperature: Unruh effect for bounded trajectories and thermal time hypothesis

    International Nuclear Information System (INIS)

    Martinetti, Pierre; Rovelli, Carlo

    2003-01-01

    We study the Unruh effect for an observer with a finite lifetime, using the thermal time hypothesis. The thermal time hypothesis maintains that: (i) time is the physical quantity determined by the flow defined by a state over an observable algebra and (ii) when this flow is proportional to a geometric flow in spacetime, the temperature is the ratio between flow parameter and proper time. An eternal accelerated Unruh observer has access to the local algebra associated with a Rindler wedge. The flow defined by the Minkowski vacuum of a field theory over this algebra is proportional to a flow in spacetime and the associated temperature is the Unruh temperature. An observer with a finite lifetime has access to the local observable algebra associated with a finite spacetime region called a 'diamond'. The flow defined by the Minkowski vacuum of a (four-dimensional, conformally invariant) quantum field theory over this algebra is also proportional to a flow in spacetime. The associated temperature generalizes the Unruh temperature to finite lifetime observers. Furthermore, this temperature does not vanish even in the limit in which the acceleration is zero. The temperature associated with an inertial observer with lifetime Τ, which we denote the 'diamond's temperature', is T_D = 2ℏ/(π k_B Τ). This temperature is related to the fact that a finite lifetime observer does not have access to all the degrees of freedom of the quantum field theory. However, we do not attempt to provide any physical interpretation of our proposed assignment of a temperature
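The quoted result T_D = 2ℏ/(π k_B Τ) is easy to evaluate numerically. A minimal sketch follows; the function name and the 80-year example lifetime are illustrative choices, not from the paper, and the constants are CODATA values.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
K_B = 1.380649e-23      # Boltzmann constant, J/K

def diamond_temperature(lifetime_s: float) -> float:
    """Diamond temperature (K) for an inertial observer with the given lifetime (s)."""
    return 2.0 * HBAR / (math.pi * K_B * lifetime_s)

# An ~80-year human lifetime gives an utterly negligible temperature,
# consistent with the abstract's remark that no interpretation is attempted.
tau = 80 * 365.25 * 24 * 3600  # lifetime in seconds
print(f"T_D = {diamond_temperature(tau):.2e} K")
```

The inverse dependence on Τ means shorter-lived observers are assigned higher temperatures, reducing to the usual Unruh result only for accelerated trajectories.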

  8. Testing the Binary Hypothesis: Pulsar Timing Constraints on Supermassive Black Hole Binary Candidates

    Science.gov (United States)

    Sesana, Alberto; Haiman, Zoltán; Kocsis, Bence; Kelley, Luke Zoltan

    2018-03-01

    The advent of time-domain astronomy is revolutionizing our understanding of the universe. Programs such as the Catalina Real-time Transient Survey (CRTS) or the Palomar Transient Factory (PTF) surveyed millions of objects for several years, allowing variability studies on large statistical samples. The inspection of ≈250 k quasars in CRTS resulted in a catalog of 111 potentially periodic sources, put forward as supermassive black hole binary (SMBHB) candidates. A similar investigation on PTF data yielded 33 candidates from a sample of ≈35 k quasars. Working under the SMBHB hypothesis, we compute the implied SMBHB merger rate and use it to construct the expected gravitational wave background (GWB) at nano-Hz frequencies, probed by pulsar timing arrays (PTAs). After correcting for incompleteness and assuming virial mass estimates, we find that the GWB implied by the CRTS sample exceeds the current most stringent PTA upper limits by almost an order of magnitude. After further correcting for the implicit bias in virial mass measurements, the implied GWB drops significantly but is still in tension with the most stringent PTA upper limits. Similar results hold for the PTF sample. Bayesian model selection shows that the null hypothesis (whereby the candidates are false positives) is preferred over the binary hypothesis at about 2.3σ and 3.6σ for the CRTS and PTF samples, respectively. Although not decisive, our analysis highlights the potential of PTAs as astrophysical probes of individual SMBHB candidates and indicates that the CRTS and PTF samples are likely contaminated by several false positives.

  9. Detection of Respiratory Pathogens in Parapneumonic Effusions by Hypothesis-free, Next-Generation Sequencing (NGS)

    Science.gov (United States)

    Ampofo, Krow; Pavia, Andrew; Blaschke, Anne J; Schlaberg, Robert

    2017-01-01

    Abstract Background Species-specific polymerase chain reaction (PCR) testing of pleural fluid (PF) from children with parapneumonic effusion (PPE) has increased pathogen identification in pediatric PPE. However, a pathogen is not detected in 25–35% of cases. Hypothesis-free, next-generation sequencing (NGS) provides a more comprehensive alternative and has led to pathogen detection in PCR-negative samples. However, the utility of NGS in the evaluation of PF from children with PPE is unknown. Methods Archived PF specimens (n = 20) from children younger than 18 years with PPE, hospitalized at Primary Children’s Hospital, Utah, in 2015 and previously tested by PCR, were evaluated. Ten PCR-negative and 10 PCR-positive PF specimens were tested using RNA-seq at an average depth of 7.7×10⁶ sequencing reads per sample. NGS data were analyzed with Taxonomer. We compared pathogens detected by blood and PF culture, PCR, and NGS. Results Overall, compared with blood/PF culture, PCR and NGS testing of PF increased bacterial identification from 15% to 50% (P < 0.05) and 65% (P = 0.003), respectively. Pathogen detection in PF by PCR and NGS was comparable (50% vs. 65%, P = NS) (Table). However, compared with PF PCR, NGS significantly increased detection of S. pyogenes (20% vs. 55%; P < 0.05), with 100% concordance when detected by PCR and culture. Detection of Fusobacterium spp. (10% vs. 10%) by NGS and PCR was comparable. In contrast, S. pneumoniae detected by PCR in 15% of specimens was not detected by NGS (15% vs. 0%). Conclusion NGS testing of PF significantly improves bacterial identification, is comparable to PF PCR testing, and can help inform antimicrobial selection; however, there were differences in the detection of S. pneumoniae and S. pyogenes. Further studies of NGS testing of PF from children with PPE are needed to assess its potential in the evaluation of PPE in children.

  10. Sport physiology, dopamine and nitric oxide - Some speculations and hypothesis generation.

    Science.gov (United States)

    Landers, J G; Esch, Tobias

    2015-12-01

    Elite Spanish professional soccer players surprisingly showed a preponderance of an allele coding for nitric oxide synthase (NOS) that results in lower nitric oxide (NO) compared with Spanish endurance and power athletes and sedentary men. The present paper attempts a speculative explanation. Soccer is an "externally-paced" (EP), teamwork-dependent sport requiring "executive function skills". We accept that time-interval estimation skill is, in part, also an executive skill. Dopamine (DA) is prominent among the neurotransmitters with a role in such skills. Polymorphisms affecting dopamine, notably DRD2/ANKK1-Taq1a (which lowers the density of dopamine D2 receptors in the striatum, thereby increasing striatal dopamine synthesis) and COMT val158met (which prolongs the action of dopamine in the cortex), feature in both the time-interval estimation and the executive-skills literatures. Our paper may be a pioneering attempt to stimulate empirical efforts to show how genotypes among soccer players may be connected, via neurotransmitters, to certain cognitive abilities that predict sporting success, perhaps also in some other externally-paced team sports. Graphing DA levels against time-interval estimation accuracy, and also against certain executive skills, reveals an inverted-U relationship. A pathway from DA, via endogenous morphine and mu3 receptors on endothelia, to the generation of NO in tiny quantities has been demonstrated. Exercise up-regulates DA and this pathway. With somewhat excessive exercise, negative feedback from NO down-regulates DA, hypothetically keeping it near the peak of the inverted-U. Other research, not yet done on higher animals or humans, shows NO "fine-tuning" movement.
We speculate that Caucasian men, playing soccer recreationally, would exemplify the above pattern and their nitric oxide synthase (NOS) would reflect the norm of their community, whereas professional players of soccer and perhaps other EP sports, with DA boosted by

  11. From the arrow of time in Badiali's quantum approach to the dynamic meaning of Riemann's hypothesis

    Directory of Open Access Journals (Sweden)

    P. Riot

    2017-09-01

    The novelty of Jean Pierre Badiali's last scientific works stems from a quantum approach based on both (i) a return to the notion of trajectories (Feynman paths) and (ii) an irreversibility of the quantum transitions. These iconoclastic choices recover the Hilbertian and von Neumann algebraic points of view by dealing with statistics over loops. This approach confers an external, thermodynamic origin on the notion of a quantum unit of time (the Rovelli-Connes thermal time). This notion, the basis for quantization, appears herein as a mere criterion separating the quantum regime from the thermodynamic regime. The purpose of this note is to unfold the content of the last five years of scientific exchanges aiming to link, in a coherent scheme, Jean Pierre's choices and works with those of the authors of this note, based on hyperbolic geodesics and the associated role of Riemann zeta functions. While these options unveil no contradictions, they nevertheless give birth to an intrinsic arrow of time different from the thermal time. The question of the physical meaning of the Riemann hypothesis as the basis of quantum mechanics, which was at the heart of our last exchanges, is the backbone of this note.

  12. Toward a new application of real-time electrophysiology: online optimization of cognitive neurosciences hypothesis testing.

    Science.gov (United States)

    Sanchez, Gaëtan; Daunizeau, Jean; Maby, Emmanuel; Bertrand, Olivier; Bompas, Aline; Mattout, Jérémie

    2014-01-23

    Brain-computer interfaces (BCIs) mostly rely on electrophysiological brain signals. Methodological and technical progress has largely solved the challenge of processing these signals online. The main issue that remains, however, is the identification of a reliable mapping between electrophysiological measures and relevant states of mind. This is why BCIs are highly dependent upon advances in cognitive neuroscience and neuroimaging research. Recently, psychological theories became more biologically plausible, leading to more realistic generative models of psychophysiological observations. Such complex interpretations of empirical data call for efficient and robust computational approaches that can deal with statistical model comparison, such as approximate Bayesian inference schemes. Importantly, the latter enable the optimization of a model selection error rate with respect to experimental control variables, yielding maximally powerful designs. In this paper, we use a Bayesian decision theoretic approach to cast model comparison in an online adaptive design optimization procedure. We show how to maximize design efficiency for individual healthy subjects or patients. Using simulated data, we demonstrate the face- and construct-validity of this approach and illustrate its extension to electrophysiology and multiple hypothesis testing based on recent psychophysiological models of perception. Finally, we discuss its implications for basic neuroscience and BCI itself.
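The model-comparison machinery the abstract relies on can be illustrated with a toy sketch (ours, not the authors' code). Two fixed-parameter Gaussian models predict different response means; with no free parameters, the log model evidence reduces to the log likelihood, and the log Bayes factor arbitrates between the hypotheses. The data values and model means here are invented for illustration.

```python
import math

def log_gauss(x: float, mu: float, sigma: float) -> float:
    """Log density of a Gaussian N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def log_evidence(data, mu, sigma):
    # With fixed parameters the evidence is just the data log likelihood.
    return sum(log_gauss(x, mu, sigma) for x in data)

data = [0.9, 1.2, 0.8, 1.1]                      # simulated observations
log_e1 = log_evidence(data, mu=1.0, sigma=0.5)   # model 1: responses centered on 1.0
log_e2 = log_evidence(data, mu=0.0, sigma=0.5)   # model 2: responses centered on 0.0
log_bayes_factor = log_e1 - log_e2               # > 0 favours model 1
print(f"log Bayes factor (M1 vs M2) = {log_bayes_factor:.2f}")
```

In the adaptive-design setting described in the paper, the next stimulus would then be chosen online so as to maximize the expected separation between such evidences.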

  14. A Dual-Process Discrete-Time Survival Analysis Model: Application to the Gateway Drug Hypothesis

    Science.gov (United States)

    Malone, Patrick S.; Lamis, Dorian A.; Masyn, Katherine E.; Northrup, Thomas F.

    2010-01-01

    The gateway drug model is a popular conceptualization of a progression most substance users are hypothesized to follow as they try different legal and illegal drugs. Most forms of the gateway hypothesis hold that "softer" drugs lead to "harder," illicit drugs. However, the gateway hypothesis has been notably difficult to…

  15. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Directory of Open Access Journals (Sweden)

    Chen Szi-Wen

    2007-01-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for the analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured over a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window in steps of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then performed sequentially by the SHT procedure. A preliminary test conducted using the database produced optimal overall predictive accuracy of . The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.
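The windowing scheme described above is straightforward to reproduce. The sketch below is a stand-in, not the authors' implementation: the 250 Hz sampling rate, the mean-threshold binarization, and the use of a plain Lempel-Ziv (1976) phrase count as the complexity measure are all our assumptions. A 10-second record yields six overlapping 5-second windows, each mapped to one scalar CM value.

```python
import random

FS = 250               # assumed sampling rate (Hz); not stated in the abstract
WIN_S, STEP_S = 5, 1   # 5-second window shifted in 1-second steps, as described

def lz_complexity(s: str) -> int:
    """Number of phrases in a simple Lempel-Ziv (1976) parsing of a binary string."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the phrase until it stops occurring in the preceding text
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def windows(signal, fs=FS, win_s=WIN_S, step_s=STEP_S):
    """Yield successive overlapping windows of a sampled signal."""
    w, step = win_s * fs, step_s * fs
    for start in range(0, len(signal) - w + 1, step):
        yield signal[start:start + w]

def complexity_measure(segment) -> int:
    """Binarize the segment around its mean, then take the LZ phrase count."""
    m = sum(segment) / len(segment)
    bits = ''.join('1' if x > m else '0' for x in segment)
    return lz_complexity(bits)

random.seed(0)
ecg = [random.gauss(0, 1) for _ in range(10 * FS)]  # fake 10-second recording
cms = [complexity_measure(w) for w in windows(ecg)]
print(len(cms))  # 6 overlapping windows from a 10-second record
```

In the paper's scheme, the SHT procedure would then consume these per-window CM values one at a time until a VF/VT decision threshold is crossed.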

  16. The brain as a dream state generator: an activation-synthesis hypothesis of the dream process.

    Science.gov (United States)

    Hobson, J A; McCarley, R W

    1977-12-01

    Recent research in the neurobiology of dreaming sleep provides new evidence for possible structural and functional substrates of formal aspects of the dream process. The data suggest that dreaming sleep is physiologically determined and shaped by a brain stem neuronal mechanism that can be modeled physiologically and mathematically. Formal features of the generator processes with strong implications for dream theory include periodicity and automaticity of forebrain activation, suggesting a preprogrammed neural basis for dream mentation in sleep; intense and sporadic activation of brain stem sensorimotor circuits including reticular, oculomotor, and vestibular neurons, possibly determining spatiotemporal aspects of dream imagery; and shifts in transmitter ratios, possibly accounting for dream amnesia. The authors suggest that the automatically activated forebrain synthesizes the dream by comparing information generated in specific brain stem circuits with information stored in memory.

  17. Role for circadian clock genes in seasonal timing: testing the Bünning hypothesis.

    Directory of Open Access Journals (Sweden)

    Mirko Pegoraro

    2014-09-01

    A major question in chronobiology centres on the "Bünning hypothesis", which implicates the circadian clock in photoperiodic (day-length) measurement and is supported in some systems (e.g., plants) but disputed in others. Here, we used the seasonally-regulated thermotolerance of Drosophila melanogaster to test the role of various clock genes in day-length measurement. In Drosophila, freezing temperatures induce reversible chill coma, a narcosis-like state. We have corroborated previous observations that wild-type flies developing under short (winter-like) photoperiods exhibit significantly shorter chill-coma recovery times (CCRt) than flies raised under long (summer-like) photoperiods. Here, we show that the arrhythmic mutant strains per01, tim01 and ClkJrk, as well as variants that speed up or slow down the circadian period, disrupt the photoperiodic component of CCRt. Our results support an underlying circadian function mediating seasonal day-length measurement and indicate that clock genes are tightly involved in photo- and thermo-periodic measurements.

  18. Interactions between risk factors in the prediction of onset of eating disorders: Exploratory hypothesis generating analyses.

    Science.gov (United States)

    Stice, Eric; Desjardins, Christopher D

    2018-06-01

    Because no study has tested for interactions between risk factors in the prediction of future onset of each eating disorder, this exploratory study addressed this lacuna to generate hypotheses to be tested in future confirmatory studies. Data from three prevention trials that targeted young women at high risk for eating disorders due to body dissatisfaction (N = 1271; M age 18.5, SD 4.2) and collected diagnostic interview data over 3-year follow-up were combined to permit sufficient power to predict onset of anorexia nervosa (AN), bulimia nervosa (BN), binge eating disorder (BED), and purging disorder (PD) using classification tree analyses, an analytic technique uniquely suited to detecting interactions. Low BMI was the most potent predictor of AN onset, and body dissatisfaction amplified this relation. Overeating was the most potent predictor of BN onset, and positive expectancies for thinness and body dissatisfaction amplified this relation. Body dissatisfaction was the most potent predictor of BED onset, and overeating, low dieting, and thin-ideal internalization amplified this relation. Dieting was the most potent predictor of PD onset, and negative affect and positive expectancies for thinness amplified this relation. Results provided evidence of amplifying interactions between risk factors suggestive of cumulative risk processes that were distinct for each disorder; future confirmatory studies should test the interactive hypotheses generated by these analyses. If hypotheses are confirmed, results may allow interventionists to target ultra high-risk subpopulations with more intensive prevention programs that are uniquely tailored for each eating disorder, potentially improving the yield of prevention efforts. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Time-varying disaster risk models: An empirical assessment of the Rietz-Barro hypothesis

    DEFF Research Database (Denmark)

    Irarrazabal, Alfonso; Parra-Alvarez, Juan Carlos

    This paper revisits the fit of disaster risk models in which a representative agent has recursive preferences and the probability of a macroeconomic disaster changes over time. We calibrate the model as in Wachter (2013) and perform two sets of tests to assess the empirical performance of the model … and hence to reduce the Sharpe ratio, a lower elasticity of substitution generates a more reasonable level for the equity risk premium and for the volatility of government bond returns without compromising the ability of the price-dividend ratio to predict excess returns.

  20. A new hypothesis for the importance of seed dispersal in time

    Directory of Open Access Journals (Sweden)

    Adriana Guzmán

    2011-12-01

    Most studies of seed dispersal in time have focused on seed dormancy and the physiological triggers for germination. However, seeds dispersed by animals with low metabolic and movement rates and long gut-passage times, such as terrestrial turtles, can be considered another type of dispersal in time. This study tests the hypothesis that seeds dispersed in time may suffer lower predation rates. We predicted that seed deposition below parent trees after fruit fall has finished is advantageous in minimizing seed predation, so such seeds should show higher survival rates. Four Amazonian plant species, Dicranostyles ampla, Oenocarpus bataua, Guatteria atabapensis and Ocotea floribunda, were tested for seed survival probabilities in two periods: during fruiting and 10-21 days after fruiting. Experiments were carried out at two biological stations located in the Colombian Amazon (Caparú and Zafire Biological Stations). Seed predation was high and mainly caused by non-vertebrates. Of the four plant species tested, only Guatteria atabapensis supported the time-escape hypothesis: for this species, seed predation by vertebrates after the fruiting period increased (from 4.1% to 9.2%) while seed predation by non-vertebrates decreased (from 54.0% to 40.2%). In contrast, seed predation on D. ampla by vertebrates and by non-vertebrates after the fruiting period increased (from 7.9% to 22.8% and from 40.4% to 50.6%, respectively), suggesting predator satiation. The results suggest that, for some species, dispersal in time can be advantageous in avoiding some types of seed predator. Escape in time could be an additional dimension in which seeds may reach adequate sites for recruitment; future studies should therefore address the survival advantages conferred by an endozoochorous time-dispersal process. Rev. Biol. Trop. 59 (4): 1795-1803. Epub 2011 December 01.

  1. Model Based Segmentation And Hypothesis Generation For The Recognition Of Printed Documents

    Science.gov (United States)

    Dengel, A.; Luhn, A.; Ueberreiter, B.

    1988-04-01

    The task of document recognition requires the scanning of a paper document and the analysis of its content and structure. The resulting electronic representation has to capture the content as well as the logic and layout structure of the document. The first step in the recognition process is scanning, filtering and binarization of the paper document. Based on the preprocessing results we delineate key areas like address or signature for a letter, or the abstract for a report. This segmentation procedure uses a specific document layout model. The validity of this segmentation can be verified in a second step by using the results of more time-consuming procedures like text/graphic classification, optical character recognition (OCR) and the comparison with more elaborate models for specific document parts. Thus our concept of model driven segmentation allows quick focussing of the analysis on important regions. The segmentation is able to operate directly on the raster image of a document without necessarily requiring CPU-intensive preprocessing steps for the whole document. A test version for the analysis of simple business letters has been implemented.

  2. Timing and causality in the generation of learned eyelid responses

    Directory of Open Access Journals (Sweden)

    Raudel Sánchez-Campusano

    2011-08-01

    The cerebellum-red nucleus-facial motoneuron (Mn) pathway has been reported to be involved in the proper timing of classically conditioned eyelid responses. This special type of associative learning serves as a model of event timing for studying the role of the cerebellum in dynamic motor control. Here, we have re-analyzed the firing activities of cerebellar posterior interpositus (IP) neurons and orbicularis oculi (OO) Mns in alert behaving cats during classical eyeblink conditioning, using a delay paradigm. The aim was to revisit the hypothesis that IP neurons can be considered a neuronal phase-modulating device supporting OO Mn firing, with an emergent timing mechanism and an explicit correlation code during learned eyelid movements. Optimized experimental and computational tools allowed us to determine the different causal relationships (temporal order and correlation code) during and between trials. These intra- and inter-trial timing strategies, extending from the sub-second range (millisecond timing) to longer-lasting ranges (interval timing), expanded the functional domain of cerebellar timing beyond motor control. Interestingly, the results supported the above-mentioned hypothesis. The causal inferences were influenced by the precise motor and premotor spike timing in the cause-effect interval, and, in addition, the timing of the learned responses depended on cerebellar-Mn network causality. Furthermore, the timing of conditioned responses (CRs) depended upon the probability of simulated causal conditions in the cause-effect interval and not on the mere duration of the inter-stimulus interval. In this work, the close relation between timing and causality was verified. It can thus be concluded that the firing activities of IP neurons may be related more to the proper performance of ongoing CRs (i.e., the proper timing, as a consequence of the pertinent causality) than to their generation and/or initiation.

  3. Computer generated timing diagrams to supplement simulation

    CERN Document Server

    Booth, A W

    1981-01-01

    The ISPS computer description language has been used in a simulation study to specify the components of a high speed data acquisition system and its protocols. A facility has been developed for automatically generating timing diagrams from the specification of the data acquisition system written in the ISPS description language. Diagrams can be generated for both normal and abnormal working modes of the system. They are particularly useful for design and debugging in the prototyping stage of a project and can be later used for reference by maintenance engineers. (11 refs).

  4. Motor Resonance as a Function of Narrative Time: Further Tests of the Linguistic Focus Hypothesis

    Science.gov (United States)

    Zwaan, Rolf A.; Taylor, Lawrence J.; de Boer, Mirte

    2010-01-01

    Neuroimaging and behavioral studies have revealed involvement of the brain's motor system in language comprehension. The Linguistic Focus Hypothesis [Taylor, L. J., & Zwaan, R. A. (2008). Motor resonance and linguistic focus. Quarterly Journal of Experimental Psychology, 61, 869-904] postulates that engagement of the motor system during language…

  5. 4-channel time delayed pulse generator

    International Nuclear Information System (INIS)

    Wetzel, L.F.S.; Rossi, J.O.; Del Bosco, E.

    1987-02-01

    The design of a four-channel delayed pulse generator, used to trigger the plasma centrifuge experiment of the Laboratorio Associado de Plasmas, is described. The circuit delivers pulses with an amplitude of 15 V, a full width at half maximum of 50 μs, and a rise time of 0.7 μs. The maximum time delay is 100 ms, and two channels provide a fine adjustment of 0-1 ms. The system can be driven manually or automatically. (author)

  6. Methods and baseline cardiovascular data from the Early versus Late Intervention Trial with Estradiol testing the menopausal hormone timing hypothesis.

    Science.gov (United States)

    Hodis, Howard N; Mack, Wendy J; Shoupe, Donna; Azen, Stanley P; Stanczyk, Frank Z; Hwang-Levine, Juliana; Budoff, Matthew J; Henderson, Victor W

    2015-04-01

    This study aims to present methods and baseline data from the Early versus Late Intervention Trial with Estradiol (ELITE), the only clinical trial designed to specifically test the timing hypothesis of postmenopausal hormone therapy (HT). The timing hypothesis posits that HT effects depend on the temporal initiation of HT relative to time since menopause. ELITE is a randomized, double-blind, placebo-controlled trial with a 2 × 2 factorial design. Six hundred forty-three healthy postmenopausal women without cardiovascular disease were randomized to oral estradiol or placebo for up to 6 to 7 years, stratified by time since menopause. Carotid artery intima-media thickness (CIMT) and cardiac computed tomography were conducted to determine HT effects on subclinical atherosclerosis across menopause strata. Participants in the early and late postmenopausal strata were well-separated by mean age (55.4 vs 65.4 y) and median time since menopause (3.5 vs 14.3 y). Expected risk factors (age, blood pressure, and body mass index) were associated with CIMT at baseline in both strata. In the early postmenopausal group, but not in the late postmenopausal group, we observed significant associations between CIMT and factors that may play a role in the responsiveness of atherosclerosis progression according to timing of HT initiation. These include low-density lipoprotein cholesterol, high-density lipoprotein cholesterol, sex hormone-binding globulin, and serum total estradiol. The ELITE randomized controlled trial is timely and unique. Baseline data indicate that ELITE is well-positioned to test the HT timing hypothesis in relation to atherosclerosis progression and coronary artery disease.

  7. Internal and ancestral controls of cell-generation times

    Science.gov (United States)

    Kubitschek, H. E.

    1969-01-01

    Lateral and longitudinal correlations between related cells reveal associations between the generation times of cells over an intermediate period (three generations in bacterial cultures). The generation times of progeny are influenced by nongenetic factors transmitted from their ancestors.
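The correlation structure described here can be illustrated with a toy simulation (ours, not from the paper): when a nongenetic factor transmitted from mother to daughters contributes to each cell's generation time, both longitudinal (mother-daughter) and lateral (sister-sister) correlations emerge. All numbers below are invented for illustration.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
mothers, daughters_a, daughters_b = [], [], []
for _ in range(5000):
    shared = random.gauss(30, 3)              # transmitted, nongenetic component (min)
    mothers.append(shared + random.gauss(0, 3))      # each cell also has private noise
    daughters_a.append(shared + random.gauss(0, 3))
    daughters_b.append(shared + random.gauss(0, 3))

r_long = pearson(mothers, daughters_a)     # longitudinal: mother vs daughter
r_lat = pearson(daughters_a, daughters_b)  # lateral: sister vs sister
print(f"longitudinal r = {r_long:.2f}, lateral r = {r_lat:.2f}")
```

With equal shared and private variances, both correlations land near 0.5; in real lineages the shared component decays, which is why the associations fade after roughly three generations.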

  8. Disruption of the LTD dialogue between the cerebellum and the cortex in Angelman syndrome model: a timing hypothesis

    Directory of Open Access Journals (Sweden)

    Guy Cheron

    2014-11-01

    Angelman syndrome is a genetic neurodevelopmental disorder in which impaired cerebellar functioning has been documented despite the absence of gross structural abnormalities. Characteristically, a spontaneous 160 Hz oscillation emerges in the Purkinje cell network of the Ube3a(m-/p+) Angelman mouse model. This abnormal oscillation is induced by enhanced Purkinje cell rhythmicity and hypersynchrony along the parallel fiber beam. We present a pathophysiological hypothesis for the neurophysiology underlying major aspects of the clinical phenotype of Angelman syndrome, including the cognitive, language and motor deficits, involving long-range connections between the cerebellar and cortical networks. This hypothesis states that the alteration of cerebellar rhythmic activity impinges on cerebellar long-term depression (LTD) plasticity, which in turn alters LTD plasticity in the cerebral cortex. The hypothesis is based on preliminary experiments using electrical stimulation of the whisker pad in alert mice, showing that after an 8 Hz LTD-inducing protocol, the cerebellar LTD accompanied by a delayed response in wild-type mice is missing in Ube3a(m-/p+) mice, and that the LTD induced in the barrel cortex following the same peripheral stimulation in wild-type mice is reversed into an LTP in Ube3a(m-/p+) mice. The control exerted by the cerebellum on the excitation vs. inhibition balance in the cerebral cortex, and the possible role played by the timing plasticity of Purkinje cell LTD in the spike-timing-dependent plasticity (STDP) of pyramidal neurons, are discussed in the context of the present hypothesis.

  9. Birth timing for mountain lions (Puma concolor); testing the prey availability hypothesis.

    Science.gov (United States)

    Jansen, Brian D; Jenks, Jonathan A

    2012-01-01

    We investigated potential advantages in birth timing for mountain lion (Puma concolor) cubs. We examined cub body mass, survival, and age of natal dispersal in relation to specific timing of birth. We also investigated the role of maternal age relative to timing of births. We captured mountain lion cubs while in the natal den to determine birth date, which allowed for precise estimates of the population birth pulse and age of natal dispersal. A birth pulse occurred during June-August. Body mass of cubs was related to litter size and timing of birth; the heaviest cubs occurred in litters of 2, and those born after 1 July. Cubs born within pulse months exhibited similar survival to those born out of the pulse. We found that cubs born April-June dispersed at younger ages than those born after 1 July. There was less variation in birth timing for first litters of females than for older females. We hypothesize that cubs born after the peak in births of neonate prey are advantaged by the abundance of vulnerable prey, and that those cubs and mothers realize an evolutionary advantage.

  10. Platelet activating factors are associated with depressive symptoms in coronary artery disease patients: a hypothesis-generating study

    Directory of Open Access Journals (Sweden)

    Mazereeuw G

    2015-09-01

    Full Text Available Graham Mazereeuw,1,2,4 Nathan Herrmann,1,5 Hongbin Xu,3,4 Alexandre P Blanchard,3,4 Daniel Figeys,3,4 Paul I Oh,6 Steffany AL Bennett,3,4 Krista L Lanctôt1,2,4–6; 1Hurvitz Brain Sciences Program, Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, ON, 2Department of Pharmacology and Toxicology, University of Toronto, Toronto, ON, Canada; 3Ottawa Institute of Systems Biology and Neural Regeneration Laboratory, Department of Biochemistry, Microbiology, and Immunology, University of Ottawa, Ottawa, ON, 4CIHR Training Program in Neurodegenerative Lipidomics, Department of Biochemistry, Microbiology, and Immunology, University of Ottawa, Ottawa, ON, 5Department of Psychiatry, University of Toronto, Toronto, ON, Canada; 6UHN Toronto Rehabilitation Institute, Toronto, ON, Canada. Introduction: Depression is a frequent complication of coronary artery disease (CAD) with an unknown etiology. Platelet activating factor (PAF) lipids, which are associated with CAD, have recently been linked with novel proposed etiopathological mechanisms for depression such as inflammation, oxidative/nitrosative stress, and vascular endothelial dysfunction. Methods and results: This hypothesis-generating study investigated the relationships between various PAF species and depressive symptoms in 26 CAD patients (age: 60.6±9.2 years, 69% male, mean Hamilton Depression Rating Scale [HAM-D] score: 11.8±5.2, HAM-D range: 3–20). Plasma PAF analyses were performed using high performance liquid chromatography electrospray ionization mass spectrometry in precursor ion scan. Significant associations between depressive symptom severity (HAM-D score) and a greater plasma abundance of the PAF phosphocholines (PC) PC(O-12:0/2:0) (r=0.49, P=0.01), PC(O-14:1/2:0) (r=0.43, P=0.03), PC(O-17:3/2:0) (r=0.44, P=0.04), and PC(O-18:3/2:0) (r=0.50, P=0.01) were observed. Associations between those PAFs and HAM-D score persisted after adjusting for age and sex. Conclusion: These

  11. A hypothesis linking chrysophyte microfossils to lake carbon dynamics on ecological and evolutionary time scales

    Science.gov (United States)

    Wolfe, Alexander P.; Siver, Peter A.

    2013-12-01

    Chrysophyte algae are common in the plankton of oligotrophic lakes and produce a rich microfossil record of siliceous cysts and scales. Paleolimnological investigations and phytoplankton records suggest that chrysophyte populations are increasing in a wide range of boreal and arctic lakes, ultimately representing one component of the limnological response to contemporary global changes. However, the exact mechanisms responsible for widespread increases of chrysophyte populations remain elusive. We hypothesize that recent increases in chrysophytes are related to rising pCO2 in lakes, in part because these algae lack carbon concentrating mechanisms and therefore rely on diffusive entry of CO2 to Rubisco during photosynthesis. We assessed the abundance of modern sediment chrysophyte microfossils in relation to summer CO2 relative saturation in 46 New England (USA) lakes, revealing significant positive relationships for both cysts and scales. These observations imply that correlations between chrysophytes and limnological conditions including low pH, oligotrophy, and elevated dissolved organic matter are ultimately underscored by the high pCO2 associated with these conditions. In lakes where chrysophyte populations have expanded over recent decades, we infer that increasingly heterotrophic conditions with respect to CO2 have stimulated production by these organisms. This linkage is supported by the remarkable abundance and diversity of chrysophytes from middle Eocene lake sediments, deposited under atmospheric CO2 concentrations significantly higher than present. The Eocene assemblages suggest that any chrysophyte-CO2 connection borne out of results from modern and sub-recent sediments also operated on evolutionary time scales, and thus the absence of carbon concentrating mechanisms appears to be an ancient feature within the group. Chrysophyte microfossils may potentially provide important insights concerning the temporal dynamics of carbon cycling in aquatic

  12. Real-Time Contour Surface Display Generation,

    Science.gov (United States)

    1984-09-01

    set of trees generated for the two-dimensional grid such that duplicate edges in separate trees are eliminated. This solution has the added benefit that...of SIGGRAPH-ACM, Vol. 16, No. 3 (July 1982), p. 135. 20. Zyda, Michael J. "Multiprocessor Considerations in the Design of a Real-Time Contour

  13. Stability of Aggression Over Time and Generations.

    Science.gov (United States)

    Huesman, L. Rowell; And Others

    1984-01-01

    Studies the aggressiveness of over 600 subjects, their parents, and their children over a 22-year period. Subjects who were more aggressive 8-year-olds were more aggressive 30-year-olds, exhibiting serious antisocial behavior as adults. The stability of aggression across generations within a family was also high. (Author/CB)

  14. The timing hypothesis: Do coronary risks of menopausal hormone therapy vary by age or time since menopause onset?

    Science.gov (United States)

    Bassuk, Shari S; Manson, JoAnn E

    2016-05-01

    The Women's Health Initiative (WHI), a landmark randomized trial of menopausal hormone therapy (HT) for prevention of chronic disease in postmenopausal women aged 50-79, established that such therapy neither prevents coronary heart disease (CHD) nor yields a favorable balance of benefits and risks in such women as a whole. However, a nuanced look at the data from this trial, considered alongside other evidence, suggests that timing of HT initiation affects the relation between such therapy and coronary risk, as well as its overall benefit-risk balance. Estrogen may have a beneficial effect on the heart if started in early menopause, when a woman's arteries are likely to be relatively healthy, but a harmful effect if started in late menopause, when those arteries are more likely to show signs of atherosclerotic disease. However, even if HT-associated relative risks are constant across age or time since menopause onset, the low absolute risk of CHD in younger or recently menopausal women translates into low attributable risks in this group. Thus, HT initiation for relief of moderate to severe vasomotor symptoms in early menopausal patients who have a favorable coronary profile remains a viable option. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Using the distribution of the CCR5-Δ32 allele in third-generation Maltese citizens to disprove the Black Death hypothesis.

    Science.gov (United States)

    Baron, B; Schembri-Wismayer, P

    2011-04-01

    Malta was under Norman rule for over 400 years and has had three major documented plague outbreaks (and a number of minor ones) since the 14th century with death tolls of 5-15% of the population at the time. This makes the Maltese population ideal for testing the hypothesis that the Black Death (particularly that of 1346-52) was responsible for a genetic shift that spread the CCR5-Δ32 allele. By enrolling 300 blood donors to determine the percentage of the Maltese population resistant to HIV-1 (which uses the CCR5-receptor to infect cells), it was established that the CCR5-Δ32 allele frequency is almost zero in third-generation Maltese citizens and sequencing showed that the deletion observed in the region of interest is the 32-base deletion expected. Thus, despite the extensive Norman occupation and the repeated plague cullings, the CCR5-Δ32 allele frequency is extremely low. This provides a basis for the discussion of conflicting hypotheses regarding the possible origin, function and spread of the CCR5-Δ32 deletion. © 2010 Blackwell Publishing Ltd.

  16. Generator of an exponential function with respect to time

    International Nuclear Information System (INIS)

    Janin, Paul; Puyal, Claude.

    1981-01-01

    This invention deals with an exponential function generator and an application of this generator to simulating the criticality of a nuclear reactor for reactimeter calibration purposes. This generator, which is particularly suitable for simulating the criticality of a nuclear reactor to calibrate a reactimeter, can also be used in any field of application necessitating the generation of an exponential function in real time. In certain fields of thermodynamics, it is necessary to represent temperature gradients as a function of time. The generator might find applications here. Another application is nuclear physics, where it is necessary to represent the attenuation of a neutron flux density with respect to time. [fr]

  17. A Complete Fossil-Calibrated Phylogeny of Seed Plant Families as a Tool for Comparative Analyses: Testing the 'Time for Speciation' Hypothesis.

    Directory of Open Access Journals (Sweden)

    Liam W Harris

    Full Text Available Explaining the uneven distribution of species richness across the branches of the tree of life has been a major challenge for evolutionary biologists. Advances in phylogenetic reconstruction, allowing the generation of large, well-sampled, phylogenetic trees have provided an opportunity to contrast competing hypotheses. Here, we present a new time-calibrated phylogeny of seed plant families using Bayesian methods and 26 fossil calibrations. While there are various published phylogenetic trees for plants which have a greater density of species sampling, we are still a long way from generating a complete phylogeny for all ~300,000+ plants. Our phylogeny samples all seed plant families and is a useful tool for comparative analyses. We use this new phylogenetic hypothesis to contrast two alternative explanations for differences in species richness among higher taxa: time for speciation versus ecological limits. We calculated net diversification rate for each clade in the phylogeny and assessed the relationship between clade age and species richness. We then fit models of speciation and extinction to individual branches in the tree to identify major rate-shifts. Our data suggest that the majority of lineages are diversifying very slowly while a few lineages, distributed throughout the tree, are diversifying rapidly. Diversification is unrelated to clade age, no matter the age range of the clades being examined, contrary to both the assumption of an unbounded lineage increase through time, and the paradigm of fixed ecological limits. These findings are consistent with the idea that ecology plays a role in diversification, but rather than imposing a fixed limit, it may have variable effects on per lineage diversification rates through time.

  18. Slow cortical potentials and "inner time consciousness" - A neuro-phenomenal hypothesis about the "width of present".

    Science.gov (United States)

    Northoff, Georg

    2016-05-01

    William James postulated a "stream of consciousness" that presupposes temporal continuity. The neuronal mechanisms underlying the construction of such temporal continuity remain unclear, however. In my contribution, I propose a neuro-phenomenal hypothesis based on slow cortical potentials and their extension of the present moment, as described by the phenomenal term "width of present". More specifically, I focus on the way the brain's neural activity needs to be encoded in order to make possible the "stream of consciousness." This leads us again to the low-frequency fluctuations of the brain's neural activity and, more specifically, to slow cortical potentials (SCPs). Due to their long phase duration as low-frequency fluctuations, SCPs can integrate different stimuli and their associated neural activity from different regions in one converging region. Such integration may be central for consciousness to occur, as recently postulated by He and Raichle. They leave open, however, the question of the exact neuronal mechanisms, such as the encoding strategy, that make possible the association of the otherwise purely neuronal SCP with consciousness and its phenomenal features. I hypothesize that SCPs allow for linking and connecting different discrete points in physical time by encoding their statistically based temporal differences rather than the single discrete time points by themselves. This presupposes difference-based coding rather than stimulus-based coding. The encoding of such statistically based temporal differences makes it possible to "go beyond" the merely physical features of the stimuli; that is, their single discrete time points and their conduction delays (as related to their neural processing in the brain). This, in turn, makes possible the constitution of "local temporal continuity" of neural activity in one particular region. 
The concept of "local temporal continuity" signifies the linkage and integration of different discrete time points

  19. Minimum time trajectory generation for relative guidance of aircraft

    OpenAIRE

    Mora-Camino , Felix Antonio Claudio; Miquel , Thierry; Ouattara , Baba Ouinténi; Achaibou , Karim; Faye , Roger; Sawadogo , Salam

    2004-01-01

    International audience; In this communication, we consider the problem of online generation of minimum-time trajectories to be followed by an aircraft to achieve relative convergence maneuvers. The trajectory generation problem is first treated as a minimum-time control problem. The analysis of the resulting set of complex optimality conditions shows that the minimum-time trajectories are produced by bang-bang control laws and can be characterized by a few geometric parameters. T...

  20. Time-Optimal Real-Time Test Case Generation using UPPAAL

    DEFF Research Database (Denmark)

    Hessel, Anders; Larsen, Kim Guldstrand; Nielsen, Brian

    2004-01-01

    Testing is the primary software validation technique used by industry today, but remains ad hoc, error prone, and very expensive. A promising improvement is to automatically generate test cases from formal models of the system under test. We demonstrate how to automatically generate real-time conformance test cases from timed automata specifications. Specifically, we demonstrate how to efficiently generate real-time test cases with optimal execution time, i.e., test cases that are the fastest possible to execute. Our technique allows time-optimal test cases to be generated using manually formulated test purposes or generated automatically from various coverage criteria of the model.

  1. Constraint-based model of Shewanella oneidensis MR-1 metabolism: a tool for data analysis and hypothesis generation.

    Directory of Open Access Journals (Sweden)

    Grigoriy E Pinchuk

    2010-06-01

    Full Text Available Shewanellae are gram-negative facultatively anaerobic metal-reducing bacteria commonly found in chemically (i.e., redox) stratified environments. Occupying such niches requires the ability to rapidly acclimate to changes in electron donor/acceptor type and availability; hence, the ability to compete and thrive in such environments must ultimately be reflected in the organization and utilization of electron transfer networks, as well as central and peripheral carbon metabolism. To understand how Shewanella oneidensis MR-1 utilizes its resources, the metabolic network was reconstructed. The resulting network consists of 774 reactions, 783 genes, and 634 unique metabolites and contains biosynthesis pathways for all cell constituents. Using constraint-based modeling, we investigated aerobic growth of S. oneidensis MR-1 on numerous carbon sources. To achieve this, we (i) used experimental data to formulate a biomass equation and estimate cellular ATP requirements, (ii) developed an approach to identify cycles (such as futile cycles and circulations), (iii) classified how reaction usage affects cellular growth, (iv) predicted cellular biomass yields on different carbon sources and compared model predictions to experimental measurements, and (v) used experimental results to refine metabolic fluxes for growth on lactate. The results revealed that aerobic lactate-grown cells of S. oneidensis MR-1 used less efficient enzymes to couple electron transport to proton motive force generation, and possibly operated at least one futile cycle involving malic enzymes. Several examples are provided whereby model predictions were validated by experimental data, in particular the role of serine hydroxymethyltransferase and the glycine cleavage system in the metabolism of one-carbon units, and growth on different sources of carbon and energy. 
This work illustrates how integration of computational and experimental efforts facilitates the understanding of microbial metabolism at a
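    The constraint-based idea behind the model above can be sketched on a toy network. The stoichiometry below is hypothetical (it is not the S. oneidensis reconstruction): steady state requires S·v = 0, flux bounds constrain uptake, and the objective is the biomass flux. Real reconstructions hand S and the bounds to an LP solver; here the single remaining degree of freedom is simply scanned.

    ```python
    import numpy as np

    # Toy flux-balance sketch: 4 reactions, 3 internal metabolites A, B, C.
    #   v1: -> A      v2: A -> B      v3: A -> C      v4: B + C -> biomass
    S = np.array([
        [1, -1, -1,  0],   # A balance
        [0,  1,  0, -1],   # B balance
        [0,  0,  1, -1],   # C balance
    ])

    # Bounds: uptake v1 limited to 10 units; all fluxes irreversible here.
    ub = np.array([10.0, np.inf, np.inf, np.inf])

    # The steady-state constraints leave one degree of freedom (v4); a real
    # model would maximize biomass with an LP solver instead of a grid scan.
    best = None
    for v4 in np.linspace(0, 20, 2001):
        v = np.array([2 * v4, v4, v4, v4])   # satisfies S @ v = 0
        if np.all(v >= 0) and np.all(v <= ub):
            if best is None or v4 > best[3]:
                best = v

    print(best[3])  # maximal biomass flux for this toy network
    ```

    The uptake bound caps v1 = 2·v4 at 10, so the biomass flux saturates at half the uptake limit, illustrating how bounds plus mass balance alone determine predicted yields.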

  2. Real-Time Trajectory Generation for Autonomous Nonlinear Flight Systems

    National Research Council Canada - National Science Library

    Larsen, Michael; Beard, Randal W; McLain, Timothy W

    2006-01-01

    ... to mobile threats such as radar, jammers, and unfriendly aircraft. In Phase 1 of this STTR project, real-time path planning and trajectory generation techniques for two dimensional flight were developed and demonstrated in software simulation...

  3. Generating k-independent variables in constant time

    DEFF Research Database (Denmark)

    Christiani, Tobias Lybecker; Pagh, Rasmus

    2014-01-01

    The generation of pseudorandom elements over finite fields is fundamental to the time, space and randomness complexity of randomized algorithms and data structures. We consider the problem of generating k-independent random values over a finite field F in a word RAM model equipped with constant-time addition and multiplication in F, and present the first nontrivial construction of a generator that outputs each value in constant time, not dependent on k. Our generator has period length |F| poly log k and uses k poly(log k) log |F| bits of space, which is optimal up to a poly log k factor. We are able to bypass Siegel's lower bound on the time-space tradeoff for k-independent functions by a restriction to sequential evaluation.
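    The property being generated can be illustrated with the classic textbook construction: a polynomial of degree k-1 with uniformly random coefficients over a prime field is k-wise independent. Note that this sketch takes O(k) time per evaluation, not the constant time achieved by the paper's generator (whose construction is not reproduced here); the prime and seeds are illustrative choices.

    ```python
    import random

    # k-wise independence via a random degree-(k-1) polynomial over Z_p.
    P = 2**31 - 1  # a Mersenne prime; Python ints keep the arithmetic exact

    def make_k_independent(k, seed=0):
        rnd = random.Random(seed)
        coeffs = [rnd.randrange(P) for _ in range(k)]  # c_0 .. c_{k-1}
        def h(x):
            # Horner evaluation of c_{k-1} x^{k-1} + ... + c_0, mod p.
            acc = 0
            for c in reversed(coeffs):
                acc = (acc * x + c) % P
            return acc
        return h

    h = make_k_independent(k=4, seed=1)
    values = [h(x) for x in range(10)]  # 4-wise independent over inputs
    ```

    Any k distinct inputs receive uniformly and independently distributed outputs over Z_p, which is exactly the guarantee randomized algorithms typically need from such a generator.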

  4. Grammar-based feature generation for time-series prediction

    CERN Document Server

    De Silva, Anthony Mihirana

    2015-01-01

    This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...
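    The mechanism of grammar-based feature generation can be sketched with a toy context-free grammar (illustrative only, not the grammar used in the book): nonterminals expand into nested lag/window transforms of a raw series x, and random derivations yield candidate features.

    ```python
    import random

    # Toy grammar: a feature is either the raw series "x" or a transform
    # applied to a feature with an integer parameter.
    GRAMMAR = {
        "<feat>": [["<op>", "(", "<feat>", ",", "<int>", ")"], ["x"]],
        "<op>": [["lag"], ["mean"], ["diff"]],
        "<int>": [["1"], ["2"], ["3"]],
    }

    def derive(symbol, rnd, depth=0):
        if symbol not in GRAMMAR:
            return symbol  # terminal: emit as-is
        rules = GRAMMAR[symbol]
        # Force the terminal rule for <feat> at depth > 3 so derivation stops.
        rule = rules[-1] if (symbol == "<feat>" and depth > 3) else rnd.choice(rules)
        return "".join(derive(s, rnd, depth + 1) for s in rule)

    rnd = random.Random(7)
    features = [derive("<feat>", rnd) for _ in range(5)]
    # e.g. strings like "mean(lag(x,2),3)" that a later stage would evaluate
    ```

    A feature-selection stage (or, as the book proposes, grammatical evolution) would then score and keep only the derived expressions that improve prediction.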

  5. Inverse relationship of the velocities of perceived time and information processing events in the brain: a potential bioassay for neural functions: a hypothesis.

    Science.gov (United States)

    Rosenberg, R N

    1979-12-01

    The velocity of elapsing time is not a constant but a relativistic component in the space-time continuum as postulated by Albert Einstein in his general and special relativity theories. The hypothesis presented here is that there is a biological corollary to relativity theory. It is postulated that biological time perception is also not a constant but is related by an inverse relationship between the velocities of neural processing events and perceived elapsing time. A careful analysis of this relationship may potentially offer a sensitive bioassay to determine the integrity of regional brain function under normal conditions and in the presence of specific disease processes. The mechanism for the biological basis of this theorem depends on the presence of a neural circuit developed through evolution which monitors overall brain efficiency and is coordinately linked to neural time perceiving circuits. Several test approaches are presented to validate the hypothesis of biologic time relativity compared to the rate of neural processing.

  6. Real-time data flow and product generating for GNSS

    Science.gov (United States)

    Muellerschoen, Ronald J.; Caissy, Mark

    2004-01-01

    The last IGS workshop with the theme 'Towards Real-Time' resulted in the design of a prototype for real-time data sharing within the IGS. A prototype real-time network is being established that will serve as a test bed for real-time activities within the IGS. We review the development of the prototype and discuss some of the existing methods and related products of real-time GNSS systems. Recommendations are made concerning real-time data distribution and product generation.

  7. Stochastic generation of hourly wind speed time series

    International Nuclear Information System (INIS)

    Shamshad, A.; Wan Mohd Ali Wan Hussin; Bawadi, M.A.; Mohd Sanusi, S.A.

    2006-01-01

    In the present study, hourly wind speed data from Kuala Terengganu in Peninsular Malaysia are simulated using the transition matrix approach of a Markovian process. The wind speed time series is divided into various states based on certain criteria. The next wind speed state is selected based on the previous state. The cumulative probability transition matrix has been formed, in which each row ends with 1. Using uniform random numbers between 0 and 1, a series of future states is generated. These states are then converted to the corresponding wind speed values using another uniform random number generator. The accuracy of the model has been determined by comparing statistical characteristics such as the average, standard deviation, root mean square error, probability density function and autocorrelation function of the generated data to those of the original data. The generated wind speed time series is capable of preserving the wind speed characteristics of the observed data.
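    The procedure described above can be sketched as follows. The wind-speed record here is synthetic and the state (bin) edges are an illustrative choice, not those of the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Stand-in hourly wind-speed record (m/s); the study used observed data.
    observed = rng.gamma(shape=2.0, scale=2.0, size=5000)

    # 1. Discretize wind speed into states (bin edges are illustrative).
    edges = np.array([0, 1, 2, 3, 4, 5, 6, 8, np.inf])
    states = np.digitize(observed, edges) - 1
    n = len(edges) - 1

    # 2. Count first-order transitions and normalize each row.
    counts = np.zeros((n, n))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # guard unvisited states
    Pmat = counts / row_sums

    # 3. Cumulative transition matrix: each row ends with 1.
    C = np.cumsum(Pmat, axis=1)

    # 4. Generate a synthetic state sequence from uniform random numbers.
    def simulate(hours, start_state=0):
        seq = [start_state]
        for _ in range(hours - 1):
            u = rng.uniform()
            seq.append(int(np.searchsorted(C[seq[-1]], u)))
        return np.array(seq)

    sim_states = simulate(5000)

    # 5. Convert states back to speeds: uniform draw within each state's bin.
    lo, hi = edges[sim_states], edges[sim_states + 1]
    hi = np.where(np.isinf(hi), lo + 2.0, hi)  # cap the open-ended top bin
    sim_speeds = rng.uniform(lo, hi)
    ```

    Comparing the mean, standard deviation, and autocorrelation of `sim_speeds` against `observed` is then the validation step the abstract describes.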

  8. Multivariate proteomic analysis of the cerebrospinal fluid of patients with peripheral neuropathic pain and healthy controls – a hypothesis-generating pilot study

    Directory of Open Access Journals (Sweden)

    Bäckryd E

    2015-07-01

    Full Text Available Emmanuel Bäckryd,1,2 Bijar Ghafouri,1,2 Anders K Carlsson,1,2 Patrik Olausson,1,2 Björn Gerdle1,2; 1Division of Community Medicine, Department of Medical and Health Sciences, Faculty of Health Sciences, Linköping University, Linköping, Sweden; 2Pain and Rehabilitation Centre, Anaesthetics, Operations and Specialty Surgery Centre, Region Östergötland, Linköping, Sweden. Abstract: Pain medicine lacks objective biomarkers to guide diagnosis and treatment. Combining two-dimensional gel proteomics with multivariate data analysis by projection, we exploratively analyzed the cerebrospinal fluid of eleven patients with severe peripheral neuropathic pain due to trauma and/or surgery refractory to conventional treatment and eleven healthy controls. Using orthogonal partial least squares discriminant analysis, we identified a panel of 36 proteins highly discriminating between the two groups. Due to a possible confounding effect of age, a new model with age as outcome variable was computed for patients (n=11), and four out of 36 protein spots were excluded due to a probable influence of age. Of the 32 remaining proteins, the following seven had the highest discriminatory power between the two groups: an isoform of angiotensinogen (upregulated in patients), two isoforms of alpha-1-antitrypsin (downregulated in patients), three isoforms of haptoglobin (upregulated in patients), and one isoform of pigment epithelium-derived factor (downregulated in patients). It has recently been hypothesized that the renin–angiotensin system may play a role in the pathophysiology of neuropathic pain, and a clinical trial of an angiotensin II receptor antagonist was recently published. It is noteworthy that when searching for neuropathic pain biomarkers with a purely explorative methodology, it was indeed a renin–angiotensin system protein that had the highest discriminatory power between patients and controls in the present study. The results from this hypothesis-generating

  9. Subnanosecond-rise-time, low-impedance pulse generator

    International Nuclear Information System (INIS)

    Druce, R.; Vogtlin, G.

    1983-01-01

    This paper describes a fast rise, low-impedance pulse generator that has been developed at the Lawrence Livermore National Laboratory. The design specifications of this generator are: 50-kV operating voltage, 1-ohm output impedance, subnanosecond rise time, and a 2 to 10 nanosecond pulse length. High repetition rate is not required. The design chosen is a parallel-plate, folded Blumlein generator. A tack switch is utilized for its simple construction and high performance. The primary diagnostic is a capacitive voltage divider with a B probe used to measure the current waveform

  10. Generation time and effective population size in Polar Eskimos

    Science.gov (United States)

    Matsumura, Shuichi; Forster, Peter

    2008-01-01

    North Greenland Polar Eskimos are the only hunter–gatherer population, to our knowledge, who can offer precise genealogical records spanning several generations. This is the first report from Eskimos on two key parameters in population genetics, namely, generation time (T) and effective population size (Ne). The average mother–daughter and father–son intervals were 27 and 32 years, respectively, roughly similar to the previously published generation times obtained from recent agricultural societies across the world. To gain an insight for the generation time in our distant ancestors, we calculated maternal generation time for two wild chimpanzee populations. We also provide the first comparison among three distinct approaches (genealogy, variance and life table methods) for calculating Ne, which resulted in slightly differing values for the Eskimos. The ratio of the effective to the census population size is estimated as 0.6–0.7 for autosomal and X-chromosomal DNA, 0.7–0.9 for mitochondrial DNA and 0.5 for Y-chromosomal DNA. A simulation of alleles along the genealogy suggested that Y-chromosomal DNA may drift a little faster than mitochondrial DNA in this population, in contrast to agricultural Icelanders. Our values will be useful not only in prehistoric population inference but also in understanding the shaping of our genome today. PMID:18364314

  11. Semi-autonomous remote sensing time series generation tool

    Science.gov (United States)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Due to the lack of suitable satellite architecture and frequent cloud cover, availability of daily high spatial resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion is a practical alternative. However, it is not an easy process, since it involves multiple steps and requires multiple tools. In this paper, a framework for a Geo Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool removes these difficulties by automating all the steps, enabling users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created: one performs all the pre-processing steps on various satellite data, and the other performs the data fusion that generates the time series. The two frameworks can be used individually for specific tasks or combined to perform both processes in one go. The tool handles most known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a good interface and provides many functionalities that enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  12. Criteria for the generation of spectra consistent time histories

    International Nuclear Information System (INIS)

    Lin, C.-W.

    1977-01-01

    Several methods are available for the seismic analysis of nuclear power plant systems and components. Among them, the response spectrum technique has been most widely adopted for linear modal analysis. However, for designs that involve structural or material nonlinearities, such as frequency-dependent soil properties, the existence of gaps, single tie rods, and friction between supports, where the response has to be computed as a function of time, the time history approach is the only viable method of analysis. Two examples of time history analysis are: 1) soil-structure interaction studies and 2) coupled reactor coolant system and building analyses, used either to generate floor response spectra or to compute nonlinear system time history responses. The generation of a suitable time history input for the analysis has been discussed in the literature. Some general guidelines are available to ensure that the time history input will be as conservative as the design response spectra. Very little has been reported on the effect of the dynamic characteristics of the time history input upon the system response. In fact, the only available discussion in this respect concerns the statistically independent nature of the time history components. In this paper, numerical results for cases using the time history approach are presented. Criteria are also established which may be advantageously used to arrive at spectra-consistent time histories that are conservative and, more importantly, realistic. (Auth.)

  13. Sensor-Generated Time Series Events: A Definition Language

    Directory of Open Access Journals (Sweden)

    Juan Pazos

    2012-08-01

    Full Text Available There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.

  14. Sensor-Generated Time Series Events: A Definition Language

    Science.gov (United States)

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.

  15. Dead-Time Generation in Six-Phase Frequency Inverter

    Directory of Open Access Journals (Sweden)

    Aurelijus Pitrėnas

    2016-06-01

    Full Text Available In this paper the control of multi-phase induction drives is discussed. The structure of a six-phase frequency inverter is examined. The article deals with dead-time generation circuits for the transistor control signals in a six-phase frequency inverter. Computer models of the dead-time circuits are created using the LTspice software package. Simulation results are compared with experimental results for the tested dead-time circuits; the parameters obtained in simulation are close to those obtained in the experiments.

  16. Real Time Face Quality Assessment for Face Log Generation

    DEFF Research Database (Denmark)

    Kamal, Nasrollahi; Moeslund, Thomas B.

    2009-01-01

    Summarizing a long surveillance video into just a few best-quality face images of each subject, a face log, is of great importance in surveillance systems. Face quality assessment is the backbone of face log generation, and improving the quality assessment makes the face logs more reliable. Developing a real time face quality assessment system using the most important facial features and employing it for face log generation are the concerns of this paper. Extensive tests using four databases are carried out to validate the usability of the system.

  17. Absolute GPS Time Event Generation and Capture for Remote Locations

    Science.gov (United States)

    HIRES Collaboration

    The HiRes experiment operates fixed-location and portable lasers at remote desert locations to generate calibration events. One physics goal of HiRes is to search for unusual showers. These may appear similar to upward or horizontally pointing laser tracks used for atmospheric calibration. It is therefore necessary to remove all of these calibration events from the HiRes detector data stream in a physics-blind manner. A robust and convenient "tagging" method is to generate the calibration events at precisely known times. To facilitate this tagging method we have developed the GPSY (Global Positioning System YAG) module. It uses a GPS receiver, an embedded processor and additional timing logic to generate laser triggers at arbitrary programmed times and frequencies with better than 100 ns accuracy. The GPSY module has two trigger outputs (one microsecond resolution) to trigger the laser flash-lamp and Q-switch and one event capture input (25 ns resolution). The GPSY module can be programmed either by a front panel menu based interface or by a host computer via an RS232 serial interface. The latter also allows for computer logging of generated and captured event times. Details of the design and the implementation of these devices will be presented. 1 Motivation Air showers represent a small fraction, much less than a percent, of the total High Resolution Fly's Eye data sample. The bulk of the sample is calibration data. Most of this calibration data is generated by two types of systems that use lasers. One type sends light directly to the detectors via optical fibers to monitor detector gains (Girard 2001). The other sends a beam of light into the sky and the scattered light that reaches the detectors is used to monitor atmospheric effects (Wiencke 1998). It is important that these calibration events be cleanly separated from the rest of the sample both to provide a complete set of monitoring information, and more

  18. A time-domain method to generate artificial time history from a given reference response spectrum

    International Nuclear Information System (INIS)

    Shin, Gang Sik; Song, Oh Seop

    2016-01-01

    Seismic qualification by test is widely used as a way to show the integrity and functionality of equipment that is related to the overall safety of nuclear power plants. Another means of seismic qualification is by direct integration analysis. Both approaches require a series of time histories as an input. However, in most cases, the possibility of using real earthquake data is limited. Thus, artificial time histories are widely used instead. In many cases, however, response spectra are given. Thus, most of the artificial time histories are generated from the given response spectra. Obtaining the response spectrum from a given time history is straightforward. However, the procedure for generating artificial time histories from a given response spectrum is difficult and complex to understand. Thus, this paper presents a simple time-domain method for generating a time history from a given response spectrum; the method was shown to satisfy conditions derived from nuclear regulatory guidance
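The direction the abstract calls straightforward, computing a response spectrum from a given time history, can be sketched in a few lines. The Newmark average-acceleration integrator, unit mass, and 5% damping below are conventional choices for illustration, not details taken from the paper.

```python
import math

def sdof_peak_disp(accel, dt, period, damping=0.05):
    """Peak relative displacement of a unit-mass damped SDOF oscillator
    under base acceleration `accel`, integrated with the Newmark
    average-acceleration method (gamma = 1/2, beta = 1/4)."""
    w = 2.0 * math.pi / period
    k, c = w * w, 2.0 * damping * w
    u, v, a = 0.0, 0.0, -accel[0]              # m*u'' + c*u' + k*u = -m*ag(t)
    keff = k + 2.0 * c / dt + 4.0 / (dt * dt)  # effective stiffness
    peak = 0.0
    for ag in accel[1:]:
        peff = -ag + (4.0 * u / dt ** 2 + 4.0 * v / dt + a) + c * (2.0 * u / dt + v)
        u_new = peff / keff
        a_new = 4.0 * (u_new - u) / dt ** 2 - 4.0 * v / dt - a
        v += 0.5 * dt * (a + a_new)
        u, a = u_new, a_new
        peak = max(peak, abs(u))
    return peak

def response_spectrum(accel, dt, periods, damping=0.05):
    """Pseudo-acceleration response spectrum Sa(T) = w^2 * max|u(t)|."""
    return [(2.0 * math.pi / T) ** 2 * sdof_peak_disp(accel, dt, T, damping)
            for T in periods]
```

For an oscillator much stiffer than the excitation, Sa approaches the peak ground acceleration, which gives a quick sanity check. The inverse problem the paper addresses would then iteratively adjust the time history until the computed spectrum envelops the given design spectrum.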

  19. Short-time action electric generators to power physical devices

    International Nuclear Information System (INIS)

    Glebov, I.A.; Kasharskij, Eh.G.; Rutberg, F.G.; Khutoretskij, G.M.

    1982-01-01

    Requirements to be met by the power supply sources of domestic electrophysical facilities have been analyzed, and trends in the design of foreign short-time-duty electric machine units have been considered. Specifications are presented for a generator built as a synchronous bipolar turbogenerator with an all-forged rotor and indirect air cooling of the rotor and stator windings. The end parts of the stator winding are additionally fixed using glass-textolite rings, brackets and gaskets. A flywheel, manufactured in the form of an all-forged steel cylinder, is joined directly to the generator rotor by means of a half-coupling. An asynchronous accelerating motor with a phase rotor, of 4 MW nominal capacity, is located on the opposite side of the flywheel. The generator peak power is 242 MV·A; the power factor is 0.9; the energy transferred to the load per pulse is 500 MJ; the flywheel weighs 81 t

  20. Procedural Content Generation for Real-Time Strategy Games

    Directory of Open Access Journals (Sweden)

    Raúl Lara-Cabrera

    2015-03-01

    Full Text Available Videogames are one of the most important and profitable sectors in the entertainment industry. Nowadays, the creation of a videogame is often a large-scale endeavor and bears many similarities with, e.g., movie production. One of the central tasks in the development of a videogame is content generation, namely the definition of maps, terrains, non-player characters (NPCs) and other graphical, musical and AI-related components of the game. Such generation is costly due to its complexity, the great amount of work required and the need for specialized manpower. Hence the relevance of optimizing the process and alleviating costs. In this sense, procedural content generation (PCG) comes in handy as a means of reducing costs by using algorithmic techniques to automatically generate some game contents. PCG also provides advantages in terms of player experience since the contents generated are typically not fixed but can vary in different playing sessions, and can even adapt to the player herself. For this purpose, the underlying algorithmic technique used for PCG must also be flexible and adaptable. This is the case of computational intelligence in general and evolutionary algorithms in particular. In this work we provide an overview of the use of evolutionary intelligence for PCG, with special emphasis on its use within the context of real-time strategy games. We show how these techniques can address both playability and aesthetics, as well as improve the game AI.

  1. The Geohistorical Time Arrow: From Steno's Stratigraphic Principles to Boltzmann's Past Hypothesis

    Science.gov (United States)

    Kravitz, Gadi

    2014-01-01

    Geologists have always embraced the time arrow in order to reconstruct the past geology of Earth, thus turning geology into a historical science. The covert assumption regarding the direction of time from past to present appears in Nicolas Steno's principles of stratigraphy. The intuitive-metaphysical nature of Steno's assumption was based on a…

  2. Criteria for the generation of spectra consistent time histories

    International Nuclear Information System (INIS)

    Lin, C.-W.

    1977-01-01

    There are several approaches currently used by the nuclear industry to generate design time history input. None of them produces unique results; that is, given a design response spectrum, a nearly unlimited number of synthesized time history motions can be constructed. The effects of these time history motions on the system response vary, and they have not been properly evaluated. For instance, some time histories may have high frequency content, higher than indicated by real earthquake records. This may have an adverse influence on the response of systems with high frequency impact or predominant high frequency modes. Other time histories may have an unnecessarily long duration, which makes a large and detailed analytical model uneconomical. The influence of the time history duration is primarily on the number of peak response stress cycles computed, which can be either extrapolated from a limited-duration input or determined using other means. Rarely is it the case that the duration has to be kept long enough for the structure response to reach its peak. Consequently, the input duration should be kept no longer than necessary to produce the peak response, to allow the use of a more sophisticated model which enables the problem to be studied thoroughly. There are also time histories which satisfy the generally accepted definition of statistical independence, but possess statistical characteristics unlike those of real earthquakes. Finally, some time histories may require smaller integration time steps than ordinarily used to ensure that certain systems will have convergent and stable solutions

  3. Behavioral and multimodal neuroimaging evidence for a deficit in brain timing networks in stuttering: A hypothesis and theory

    Directory of Open Access Journals (Sweden)

    Andrew C Etchell

    2014-06-01

    Full Text Available The fluent production of speech requires accurately timed movements. In this article, we propose that a deficit in brain timing networks is the core neurophysiological deficit in stuttering. We first discuss the experimental evidence supporting the involvement of the basal ganglia and supplementary motor area in stuttering and the involvement of the cerebellum as a mechanism for compensating for the neural deficits that underlie stuttering. Next, we outline the involvement of the right inferior frontal gyrus as another putative compensatory locus in stuttering and suggest a role for this structure in an expanded core timing-network. Subsequently, we review behavioral studies of timing in people who stutter and examine their behavioral performance as compared to people who do not stutter. Finally, we highlight challenges to existing research and provide avenues for future research with specific hypotheses.

  4. Time-Grating for the Generation of STUD Pulse Trains

    Science.gov (United States)

    Zheng, Jun; Wang, Shi-Wei; Xu, Jian-Qiu

    2013-04-01

    Spike trains of uneven duration or delay (STUD pulses) hold potential for laser-plasma interaction (LPI) control in laser fusion. A technique based on a time grating is applied to generate an STUD pulse train. The time grating, a temporal analogue of the diffraction grating, can easily control the pulse width, shape, and repetition rate through the use of electro-optical devices. In theory and in numerical calculation, the pulse width and repetition rate are set by the modulation frequency and depth of the phase modulation function. Zero-chirped phase modulation is favorable for the compression effect of the time grating. A proof-of-principle experiment on the interference of two pulses is shown to verify the time grating function.

  5. Analysis and generation of groundwater concentration time series

    Science.gov (United States)

    Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae

    2018-01-01

    Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.
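The additive generation scheme described above, a time-dependent trend plus an amplitude-modulated autoregressive noise of order one with a time-varying parameter, can be sketched as follows. The particular trend, amplitude, and parameter functions are illustrative assumptions, not the ones fitted in the study.

```python
import math
import random

def generate_series(n, dt=1.0, seed=0):
    """Time series as trend + amplitude-modulated AR(1) noise with a
    time-varying autoregressive parameter (all functional forms are
    illustrative stand-ins)."""
    rng = random.Random(seed)
    series, eps = [], 0.0
    for i in range(n):
        t = i * dt
        trend = 1.0 - math.exp(-t / 20.0)                        # nonstationary trend
        phi = 0.9 - 0.4 * t / (n * dt)                           # time-varying AR(1) coefficient
        amp = 0.1 * (1.0 + math.sin(2.0 * math.pi * t / 50.0))   # time-variable amplitude
        # unit-variance AR(1) innovation, then scaled by the local amplitude
        eps = phi * eps + rng.gauss(0.0, 1.0) * math.sqrt(1.0 - phi * phi)
        series.append(trend + amp * eps)
    return series
```

The fluctuations stay correlated through `phi` while their envelope follows `amp`, mimicking the preasymptotic behavior the abstract describes.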

  6. Methodologies for estimating one-time hazardous waste generation for capacity assurance planning

    International Nuclear Information System (INIS)

    Tonn, B.; Hwang, Ho-Ling; Elliot, S.; Peretz, J.; Bohm, R.; Hendrucko, B.

    1994-04-01

    This report contains descriptions of methodologies to be used to estimate the one-time generation of hazardous waste associated with five different types of remediation programs: Superfund sites, RCRA Corrective Actions, Federal Facilities, Underground Storage Tanks, and State and Private Programs. Estimates of the amount of hazardous wastes generated from these sources to be shipped off-site to commercial hazardous waste treatment and disposal facilities will be made on a state by state basis for the years 1993, 1999, and 2013. In most cases, estimates will be made for the intervening years as well.

  7. Methodologies for estimating one-time hazardous waste generation for capacity assurance planning

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.; Hwang, Ho-Ling; Elliot, S. [Oak Ridge National Lab., TN (United States); Peretz, J.; Bohm, R.; Hendrucko, B. [Univ. of Tennessee, Knoxville, TN (United States)

    1994-04-01

    This report contains descriptions of methodologies to be used to estimate the one-time generation of hazardous waste associated with five different types of remediation programs: Superfund sites, RCRA Corrective Actions, Federal Facilities, Underground Storage Tanks, and State and Private Programs. Estimates of the amount of hazardous wastes generated from these sources to be shipped off-site to commercial hazardous waste treatment and disposal facilities will be made on a state by state basis for the years 1993, 1999, and 2013. In most cases, estimates will be made for the intervening years as well.

  8. Timing of breast cancer surgery in relation to the menstrual cycle the rise and fall of a hypothesis

    DEFF Research Database (Denmark)

    Kroman, N.

    2008-01-01

    It has been claimed that the timing of surgery in relation to the menstrual cycle can significantly influence the prognosis among premenopausal women with primary breast cancer. The literature on the subject is reviewed. The results are heterogeneous, and the quality of the studies is in general...

  9. Time and Place of Human Origins, the African Eve Hypothesis Examined through Modelling: Can High Schools Contribute?

    Science.gov (United States)

    Oxnard, Charles

    1994-01-01

    Studies of mitochondrial DNA imply that modern humans arose in Africa 150,000 years ago and spread throughout the world, replacing all prior human groups. But many paleontologists see continuity in human fossils on each continent and over a much longer time. Modeling may help test these alternatives. (Author/MKR)

  10. Is Time a creation of Life in response to Gravity? : This hypothesis suggests new ways for looking at extraterrestrial life

    NARCIS (Netherlands)

    Ockels, W.J.

    2007-01-01

    From his personal experience during a space flight (Challenger 1985) onward, the author has been struck repeatedly by the remarkable influence of Earth's environment on life, in particular by its most inevitable elements: time and gravity. Our life might be peculiar to the local Earth conditions,

  11. Timing the generation of distinct retinal cells by homeobox proteins.

    Directory of Open Access Journals (Sweden)

    Sarah Decembrini

    2006-09-01

    Full Text Available The reason why different types of vertebrate nerve cells are generated in a particular sequence is still poorly understood. In the vertebrate retina, homeobox genes play a crucial role in establishing different cell identities. Here we provide evidence of a cellular clock that sequentially activates distinct homeobox genes in embryonic retinal cells, linking the identity of a retinal cell to its time of generation. By in situ expression analysis, we found that the three Xenopus homeobox genes Xotx5b, Xvsx1, and Xotx2 are initially transcribed but not translated in early retinal progenitors. Their translation requires cell cycle progression and is sequentially activated in photoreceptors (Xotx5b) and bipolar cells (Xvsx1 and Xotx2). Furthermore, by in vivo lipofection of "sensors" in which green fluorescent protein translation is under control of the 3' untranslated region (UTR), we found that the 3' UTRs of Xotx5b, Xvsx1, and Xotx2 are sufficient to drive a spatiotemporal pattern of translation matching that of the corresponding proteins and consistent with the time of generation of photoreceptors (Xotx5b) and bipolar cells (Xvsx1 and Xotx2). The block of cell cycle progression of single early retinal progenitors impairs their differentiation as photoreceptors and bipolar cells, but is rescued by the lipofection of the Xotx5b and Xvsx1 coding sequences, respectively. This is the first evidence to our knowledge that vertebrate homeobox proteins can work as effectors of a cellular clock to establish distinct cell identities.

  12. Real-time Image Generation for Compressive Light Field Displays

    International Nuclear Information System (INIS)

    Wetzstein, G; Lanman, D; Hirsch, M; Raskar, R

    2013-01-01

    With the invention of integral imaging and parallax barriers in the beginning of the 20th century, glasses-free 3D displays have become feasible. Only today—more than a century later—glasses-free 3D displays are finally emerging in the consumer market. The technologies being employed in current-generation devices, however, are fundamentally the same as what was invented 100 years ago. With rapid advances in optical fabrication, digital processing power, and computational perception, a new generation of display technology is emerging: compressive displays exploring the co-design of optical elements and computational processing while taking particular characteristics of the human visual system into account. In this paper, we discuss real-time implementation strategies for emerging compressive light field displays. We consider displays composed of multiple stacked layers of light-attenuating or polarization-rotating layers, such as LCDs. The involved image generation requires iterative tomographic image synthesis. We demonstrate that, for the case of light field display, computed tomographic light field synthesis maps well to operations included in the standard graphics pipeline, facilitating efficient GPU-based implementations with real-time framerates.
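The tomographic image synthesis for layered displays can be illustrated with a toy one-dimensional, two-layer version: each emitted ray is the product of the transmittances it crosses, so taking logarithms turns synthesis into an additive tomography problem solvable by alternating updates. This is a simplified stand-in for intuition, not the paper's GPU algorithm.

```python
import math

def factor_light_field(T, iters=50):
    """Toy two-layer attenuation display: approximate a target light field
    T[i][j] in (0, 1] by per-layer transmittances a[i] * b[j]. In the log
    domain the model is additive, so alternating least-squares updates
    (row/column means) converge to the best rank-style fit."""
    n, m = len(T), len(T[0])
    logT = [[math.log(T[i][j]) for j in range(m)] for i in range(n)]
    la, lb = [0.0] * n, [0.0] * m
    for _ in range(iters):
        for i in range(n):
            la[i] = sum(logT[i][j] - lb[j] for j in range(m)) / m
        for j in range(m):
            lb[j] = sum(logT[i][j] - la[i] for i in range(n)) / n
    shift = max(la)  # fix the scale ambiguity between the two layers
    a = [min(1.0, math.exp(v - shift)) for v in la]
    b = [min(1.0, math.exp(v + shift)) for v in lb]  # clamp to realizable [0, 1]
    return a, b
```

A real compressive display solves this for full 4D light fields, more layers, and nonnegativity constraints, which is where the iterative tomographic synthesis and GPU mapping discussed in the paper come in.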

  13. Real Time Engineering Analysis Based on a Generative Component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses...... without jumping from aesthetics to structural digital design tools and back, but works with both simultaneously and in real time. The engineering level of knowledge is incorporated at a conceptual thinking level, i.e. qualitative information is used instead of quantitative information. An example...... with a statically determinate roof structure modelled by beam components is given. The example outlines the idea of the tool for conceptual design in the early phase of a multidisciplinary design process between architecture and structural engineering....

  14. Design of time interval generator based on hybrid counting method

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Yuan [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Zhaoqi [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Lu, Houbing [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Hefei Electronic Engineering Institute, Hefei 230037 (China); Chen, Lian [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Jin, Ge, E-mail: goldjin@ustc.edu.cn [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China)

    2016-10-01

    Time Interval Generators (TIGs) are frequently used for the characterization or timing operations of instruments in particle physics experiments. Though some "off-the-shelf" TIGs can be employed, the need for a custom test system or control system makes TIGs implemented in a programmable device desirable. Nowadays, the feasibility of using Field Programmable Gate Arrays (FPGAs) to implement particle physics instrumentation has been validated in the design of Time-to-Digital Converters (TDCs) for precise time measurement. The FPGA-TDC technique is based on the Tapped Delay Line (TDL) architecture, whose delay cells are down to a few tens of picoseconds. FPGA-based TIGs are therefore preferable, allowing customized particle physics instrumentation and other utilities to be implemented on the same FPGA device. A hybrid counting method for designing TIGs with both high resolution and wide range is presented in this paper. The combination of two different counting methods to realize an integratable TIG is described in detail. A specially designed multiplexer for tap selection is introduced in particular; its special structure is devised to minimize the differing additional delays caused by the unpredictable routing from different taps to the output. A Kintex-7 FPGA is used for the hybrid counting-based implementation of a TIG, providing a resolution of 11 ps and an interval range of up to 8 s.
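The idea behind hybrid counting, a coarse synchronous counter for range plus a fine delay-line tap for resolution, can be sketched numerically. The 250 MHz clock and 11 ps tap delay below are illustrative assumptions, not the paper's exact figures.

```python
def split_interval(target_s, clk_hz=250e6, tap_s=11e-12):
    """Hybrid counting sketch: realize a requested interval as a coarse
    clock-tick count (wide range) plus a fine delay-line tap (high
    resolution). Parameter values are illustrative."""
    clk_period = 1.0 / clk_hz
    coarse = int(target_s / clk_period)        # whole clock ticks
    residue = target_s - coarse * clk_period   # sub-tick remainder
    tap = round(residue / tap_s)               # nearest delay-line tap
    return coarse, tap

def realized_interval(coarse, tap, clk_hz=250e6, tap_s=11e-12):
    """Interval actually produced by a given (coarse, tap) setting."""
    return coarse / clk_hz + tap * tap_s
```

The residual error is bounded by half a tap delay, which is why the multiplexer's tap-to-output routing skew discussed in the abstract matters: uncompensated routing delays would add directly to this bound.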

  15. Advanced Diffusion-weighted Imaging Modeling for Prostate Cancer Characterization: Correlation with Quantitative Histopathologic Tumor Tissue Composition-A Hypothesis-generating Study.

    Science.gov (United States)

    Hectors, Stefanie J; Semaan, Sahar; Song, Christopher; Lewis, Sara; Haines, George K; Tewari, Ashutosh; Rastinehad, Ardeshir R; Taouli, Bachir

    2018-03-01

    Purpose To correlate quantitative diffusion-weighted imaging (DWI) parameters derived from conventional monoexponential DWI, stretched exponential DWI, diffusion kurtosis imaging (DKI), and diffusion-tensor imaging (DTI) with quantitative histopathologic tumor tissue composition in prostate cancer in a preliminary hypothesis-generating study. Materials and Methods This retrospective institutional review board-approved study included 24 patients with prostate cancer (mean age, 63 years) who underwent magnetic resonance (MR) imaging, including high-b-value DWI and DTI at 3.0 T, before prostatectomy. The following parameters were calculated in index tumors and nontumoral peripheral zone (PZ): apparent diffusion coefficient (ADC) obtained with monoexponential fit (ADC ME ), ADC obtained with stretched exponential modeling (ADC SE ), anomalous exponent (α) obtained at stretched exponential DWI, ADC obtained with DKI modeling (ADC DKI ), kurtosis with DKI, ADC obtained with DTI (ADC DTI ), and fractional anisotropy (FA) at DTI. Parameters in prostate cancer and PZ were compared by using paired Student t tests. Pearson correlations between tumor DWI and quantitative histologic parameters (nuclear, cytoplasmic, cellular, stromal, luminal fractions) were determined. Results All DWI parameters were significantly different between prostate cancer and PZ (P < .012). ADC ME , ADC SE , and ADC DKI all showed significant negative correlation with cytoplasmic and cellular fractions (r = -0.546 to -0.435; P < .034) and positive correlation with stromal fractions (r = 0.619-0.669; P < .001). ADC DTI and FA showed correlation only with stromal fraction (r = 0.512 and -0.413, respectively; P < .045). α did not correlate with histologic parameters, whereas kurtosis showed significant correlations with histopathologic parameters (r = 0.487, 0.485, -0.422 for cytoplasmic, cellular, and stromal fractions, respectively; P < .040). Conclusion Advanced DWI methods showed significant
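The signal models compared in the study have simple closed forms: monoexponential S(b) = S0·exp(-b·ADC), stretched exponential S(b) = S0·exp(-(b·ADC)^α), and kurtosis S(b) = S0·exp(-b·D + (b·D)²·K/6). A minimal sketch of a log-linear monoexponential fit applied to kurtosis-model signals follows; the b-values and tissue parameters are illustrative, not the study's acquisition settings.

```python
import math

def adc_monoexponential(bvals, signals):
    """Log-linear least-squares fit of S(b) = S0 * exp(-b * ADC).
    Returns ADC in the reciprocal units of b (e.g. mm^2/s for b in s/mm^2)."""
    ys = [math.log(s) for s in signals]
    n = len(bvals)
    mx, my = sum(bvals) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(bvals, ys))
             / sum((x - mx) ** 2 for x in bvals))
    return -slope

def kurtosis_signal(b, s0, d, k):
    """DKI signal model: S(b) = S0 * exp(-b*D + (b*D)^2 * K / 6)."""
    return s0 * math.exp(-b * d + (b * d) ** 2 * k / 6.0)
```

Fitting the monoexponential model to signals simulated with positive kurtosis yields an apparent ADC below the true D, one reason the different models in the study correlate differently with cellular and stromal fractions.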

  16. Monte-Carlo Generation of Time Evolving Fission Chains

    Energy Technology Data Exchange (ETDEWEB)

    Verbeke, Jerome M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kim, Kenneth S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Prasad, Manoj K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Snyderman, Neal J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-08-01

    About a decade ago, a computer code was written to model neutrons from their "birth" to their final "death" in thermal neutron detectors (3He tubes): SrcSim had enough physics to track the neutrons in multiplying systems, appropriately increasing and decreasing the neutron population as they interacted by absorption, fission and leakage. The theory behind the algorithms assumed that all neutrons produced in a fission chain are produced simultaneously and then diffuse to the neutron detectors. For cases where the diffusion times are long compared to the fission chains, SrcSim is very successful. Indeed, it works extraordinarily well for thermal neutron detectors and bare objects, because it takes tens of microseconds for fission neutrons to slow down to thermal energies, where they can be detected; microseconds are a very long time compared to the lengths of the fission chains. However, this inherent assumption in the theory prevents its use in cases where either the fission chains are long compared to the neutron diffusion times (water-cooled nuclear reactors or heavily moderated objects, where the theory starts failing) or the fission neutrons can be detected shortly after they are produced (fast neutron detectors). For these cases, a new code needs to be written in which the underlying assumption is not made. The purpose of this report is to develop an algorithm to generate the arrival times of neutrons in fast neutron detectors, starting from a neutron source such as a spontaneous fission source (252Cf) or a multiplying source (Pu). This code will be an extension of SrcSim to cases where correlations between neutrons in the detectors are on the same or shorter time scales than the fission chains themselves.
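A time-evolving fission chain of the kind described can be sketched as a branching process that carries explicit birth and leakage times instead of assuming the chain is instantaneous. The fission probability, mean multiplicity, and mean neutron lifetime below are illustrative values, not the report's.

```python
import math
import random

def poisson(lam, rng):
    """Poisson sample via Knuth's method; adequate for small lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def fission_chain_leak_times(p_fission=0.3, nubar=2.5, tau=1e-8,
                             rng=None, max_neutrons=100000):
    """Leakage (detection) times of one fission chain started by a single
    neutron at t = 0. Each neutron travels for an exponential time with
    mean `tau`, then either induces a fission with probability `p_fission`
    (emitting Poisson(`nubar`) prompt neutrons at that instant) or leaks
    and is counted. Subcritical since p_fission * nubar < 1."""
    rng = rng or random.Random()
    live, leaks, born = [0.0], [], 1
    while live and born < max_neutrons:   # guard against runaway chains
        t = live.pop() + rng.expovariate(1.0 / tau)
        if rng.random() < p_fission:
            n = poisson(nubar, rng)
            born += n
            live.extend([t] * n)
        else:
            leaks.append(t)
    return sorted(leaks)
```

For these parameters the expected number of counted neutrons per chain is (1 - p)/(1 - p·ν̄) = 2.8, and the spread of the returned times captures the competition between chain length and diffusion time that the report identifies as the failure mode of the simultaneous-chain assumption.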

  17. Physicochemical and toxicological characteristics of welding fume derived particles generated from real time welding processes.

    Science.gov (United States)

    Chang, Cali; Demokritou, Philip; Shafer, Martin; Christiani, David

    2013-01-01

    Welding fume particles have been well studied in the past; however, most studies have examined welding fumes generated from machine models rather than actual exposures. Furthermore, the link between the physicochemical and toxicological properties of welding fume particles has not been well understood. This study aims to investigate the physicochemical properties of particles generated during actual, real-time welding processes and to assess their size-specific toxicological properties. A compact cascade impactor (Harvard CCI) was stationed within the welding booth to sample particles by size. Size-fractionated particles were extracted and used both for off-line physicochemical analysis and for in vitro cellular toxicological characterization. Each size fraction was analyzed for ions, elemental composition, and mass concentration. Furthermore, real time optical particle monitors (DustTrak™, TSI Inc., Shoreview, Minn.) were used in the same welding booth to collect real time PM2.5 particle number concentration data. The sampled particles were extracted from the polyurethane foam (PUF) impaction substrates using a previously developed and validated protocol, and used in a cellular assay to assess oxidative stress. By mass, welding aerosols were found to be in the coarse (PM 2.5–10) and fine (PM 0.1–2.5) size ranges. Most of the water soluble (WS) metals presented higher concentrations in the coarse size range, with some exceptions such as sodium, which presented an elevated concentration in the PM 0.1 size range. In vitro data showed a size-specific dependency, with the fine and ultrafine size ranges having the highest reactive oxygen species (ROS) activity. Additionally, this study suggests a possible correlation between welders' experience, the welding procedure and equipment used, and the particles generated from welding fumes. Mass concentrations and total metal and water soluble metal concentrations of welding fume particles may be

  18. Real Time Engineering Analysis Based on a Generative component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses...... the geometry, material properties and fixed point characteristics to calculate the dimensions and subsequent feasibility of any architectural design. The proposed conceptual design tool provides the possibility for the architect to work with both the aesthetic as well as the structural aspects of architecture...... with a static determinate roof structure modelled by beam components is given. The example outlines the idea of the tool for conceptual design in early phase of a multidisciplinary design process between architecture and structural engineering....

  19. Do cortical midline variability and low frequency fluctuations mediate William James' "Stream of Consciousness"? "Neurophenomenal Balance Hypothesis" of "Inner Time Consciousness".

    Science.gov (United States)

    Northoff, Georg

    2014-11-01

    William James famously characterized consciousness by the 'stream of consciousness', which describes the temporal continuity and flow of the contents of consciousness in our 'inner time consciousness'. More specifically, he distinguished between "substantive parts", the contents of consciousness, and "transitive parts", the linkages between different contents. While much research has recently focused on the substantive parts, the neural mechanisms underlying the transitive parts and their characterization by the balance between 'sensible continuity' and 'continuous change' remain unclear. The aim of this paper is to develop a so-called neuro-phenomenal hypothesis about specifically the transitive parts and their two phenomenal hallmark features, sensible continuity and continuous change in 'inner time consciousness'. Based on recent findings, I hypothesize that the cortical midline structures and their high degree of variability and strong low frequency fluctuations play an essential role in mediating the phenomenal balance between sensible continuity and continuous change. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Time series power flow analysis for distribution connected PV generation.

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating
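    The QSTS approach described above solves a sequence of steady-state power flows driven by high time-resolution load and PV data. A minimal sketch follows, assuming a toy two-bus feeder with a linear voltage-drop model; the feeder impedance and the hourly profiles are illustrative assumptions, not data from the report, which uses full distribution power-flow tools.

```python
# Minimal quasi-static time-series (QSTS) sketch: a toy two-bus feeder in
# which net load = demand - PV output and voltage drop is approximated
# linearly. The impedance and hourly profiles are illustrative assumptions.

V_SOURCE = 1.0   # substation voltage, per unit
R_FEEDER = 0.05  # feeder resistance, per unit

def bus_voltage(net_load_pu):
    """Approximate receiving-end voltage for a given net load (per unit)."""
    return V_SOURCE - net_load_pu * R_FEEDER

demand = [0.6, 0.8, 1.0, 0.9, 0.7]  # hourly demand profile (per unit)
pv     = [0.0, 0.3, 0.8, 0.5, 0.1]  # hourly PV output profile (per unit)

# QSTS: solve a steady-state snapshot at every time step, in sequence
voltages = [bus_voltage(d - p) for d, p in zip(demand, pv)]
print([round(v, 4) for v in voltages])
```

    Midday PV output offsets the load, so the receiving-end voltage peaks in the third snapshot; a real QSTS run would also step voltage-regulator and protection-device models between snapshots, which is where the interaction effects discussed in the report appear.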

  1. The reaction-time task-rule congruency effect is not affected by working memory load: further support for the activated long-term memory hypothesis.

    Science.gov (United States)

    Kessler, Yoav; Meiran, Nachshon

    2010-07-01

    Previous studies claimed that task representation is carried out by the activated long-term memory portion of working memory (WM; Meiran and Kessler in J Exp Psychol Human Percept Perform 34:137-157, 2008). The present study provides more direct support for this hypothesis. We used the reaction-time task-rule congruency effect (RT-TRCE) in a task-switching setup, and tested the effects of loading WM with irrelevant task rules on RT-TRCE. Experiment 1 manipulated WM load in a between-subject design: participants performed color/shape task switching while holding 0, 1 or 3 numerical task rules as WM load. Experiment 2 used a similar load manipulation (1 or 3 rules to load WM) in a within-subject design. Experiment 3 extended these results by loading WM with perceptual tasks that were more similar to the shape/color tasks. The results show that RT-TRCE was not affected by WM load, supporting the activated long-term memory hypothesis.

  2. Impact of JAK2(V617F) mutation status on treatment response to anagrelide in essential thrombocythemia: an observational, hypothesis-generating study

    Directory of Open Access Journals (Sweden)

    Cascavilla N

    2015-05-01

    Full Text Available Nicola Cascavilla,1 Valerio De Stefano,2 Fabrizio Pane,3 Alessandro Pancrazzi,4 Alessandra Iurlo,5 Marco Gobbi,6 Francesca Palandri,7 Giorgina Specchia,8 A Marina Liberati,9 Mariella D’Adda,10 Gianluca Gaidano,11 Rajmonda Fjerza,4 Heinrich Achenbach,12 Jonathan Smith,13 Paul Wilde,13 Alessandro M Vannucchi4 1Division of Hematology, Casa Sollievo della Sofferenza Hospital, IRCCS, San Giovanni Rotondo, Italy; 2Institute of Hematology, Catholic University, Rome, Italy; 3Department of Clinical Medicine and Surgery, University of Naples Federico II, Naples, Italy; 4Department of Experimental and Clinical Medicine, University of Florence, Florence, Italy; 5Oncohematology Unit, Fondazione IRCCS Ca’ Granda, Ospedale Maggiore Policlinico, Milan, Italy; 6IRCCS AOU San Martino, Genova, Italy; 7Department of Specialistic, Diagnostic and Experimental Medicine, St Orsola-Malpighi Hospital, University of Bologna, Bologna, Italy; 8Unit of Hematology with Transplantation, Department of Emergency and Organ Transplantation, University of Bari, Bari, Italy; 9Ospedale Santa Maria, Terni, Italy; 10Division of Hematology, Azienda Ospedaliera Spedali Civili di Brescia, Brescia, Italy; 11Division of Hematology, Department of Translational Medicine, Amedeo Avogadro University of Eastern Piedmont, Novara, Italy; 12Research and Development, Shire GmbH, Eysins, Switzerland; 13Shire Pharmaceutical Development Ltd, Basingstoke, United Kingdom. Abstract: A JAK2(V617F) mutation is found in approximately 55% of patients with essential thrombocythemia (ET), and represents a key World Health Organization diagnostic criterion. This hypothesis-generating study (NCT01352585) explored the impact of JAK2(V617F) mutation status on treatment response to anagrelide in patients with ET who were intolerant/refractory to their current cytoreductive therapy. The primary objective was to compare the proportion of JAK2-positive versus JAK2-negative patients who achieved at least a partial platelet

  3. Verbal fluency: Effect of time on item generation

    Directory of Open Access Journals (Sweden)

    Mayra Jacuviske Venegas

    Full Text Available Abstract The distribution of item generation over time in the performance of the elderly on verbal fluency (VF) remains unknown. Objective: To analyze the number of items, their distribution and the impact of the first quartile on the final test result. Methods: 31 individuals performed the tests (average age = 74 years; schooling = 8.16 years). Results: The number of items produced in the first quartile differed from the other quartiles for both semantic and phonologic VF, where 40% of items were produced in the first quartile. No effect of age was found, and schooling influenced performance on the 1st and 2nd quartiles of semantic VF and on the 1st, 2nd and 3rd quartiles of phonemic VF. Discussion: This study contributes the finding that asymptotic levels are attained earlier than the 30 seconds observed in other studies, being reached by the 15-second mark. Furthermore, schooling was found to be associated with the number of items produced in the 1st and 2nd quartiles for semantic VF, and in the 1st, 2nd and 3rd quartiles for phonemic fluency. Conclusion: The schooling effect was noted in both semantic and executive aspects of VF. The brief form of the VF test may represent a promising tool for clinical evaluation.

  4. Sequentially firing neurons confer flexible timing in neural pattern generators

    International Nuclear Information System (INIS)

    Urban, Alexander; Ermentrout, Bard

    2011-01-01

    Neuronal networks exhibit a variety of complex spatiotemporal patterns that include sequential activity, synchrony, and wavelike dynamics. Inhibition is the primary means through which such patterns are implemented. This behavior is dependent on both the intrinsic dynamics of the individual neurons as well as the connectivity patterns. Many neural circuits consist of networks of smaller subcircuits (motifs) that are coupled together to form the larger system. In this paper, we consider a particularly simple motif, comprising purely inhibitory interactions, which generates sequential periodic dynamics. We first describe the dynamics of the single motif both for general balanced coupling (all cells receive the same number and strength of inputs) and then for a specific class of balanced networks: circulant systems. We couple these motifs together to form larger networks. We use the theory of weak coupling to derive phase models which, themselves, have a certain structure and symmetry. We show that this structure endows the coupled system with the ability to produce arbitrary timing relationships between symmetrically coupled motifs and that the phase relationships are robust over a wide range of frequencies. The theory is applicable to many other systems in biology and physics.
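    The phase-reduction idea above can be illustrated with a toy simulation: three identical phase oscillators with repulsive (inhibition-like) coupling settle into a splay state in which the cells fire sequentially, one third of a cycle apart. The sinusoidal interaction function below is an illustrative assumption, not the interaction function H derived in the paper.

```python
import math

# Sketch of a weak-coupling phase model for a three-cell motif. The
# interaction function is a phase-repulsive sinusoid (an inhibition-like,
# illustrative stand-in, not the H derived from a specific neuron model).

N = 3        # cells in the motif
OMEGA = 1.0  # intrinsic frequency, rad per time unit
K = 0.5      # coupling strength
DT = 0.01    # Euler time step

def step(theta):
    """One Euler step of d(theta_i)/dt = OMEGA - K * sum_j sin(theta_j - theta_i)."""
    dtheta = [OMEGA - K * sum(math.sin(theta[j] - theta[i]) for j in range(N) if j != i)
              for i in range(N)]
    return [(t + DT * d) % (2 * math.pi) for t, d in zip(theta, dtheta)]

theta = [0.0, 0.1, 0.2]  # slightly staggered initial phases
for _ in range(20000):
    theta = step(theta)

# Order parameter R -> 0 and gaps -> 2*pi/3: a splay (sequential-firing) state
rx = sum(math.cos(t) for t in theta) / N
ry = sum(math.sin(t) for t in theta) / N
R = math.hypot(rx, ry)
gaps = sorted((theta[(i + 1) % N] - theta[i]) % (2 * math.pi) for i in range(N))
print(round(R, 3), [round(g, 2) for g in gaps])
```

    With repulsive coupling the synchronous state is unstable, and for three oscillators the only zero-order-parameter configuration is the splay state, so the phases end up evenly spaced — a minimal analogue of the sequential firing the abstract describes.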

  5. Time to abandon the hygiene hypothesis: new perspectives on allergic disease, the human microbiome, infectious disease prevention and the role of targeted hygiene.

    Science.gov (United States)

    Bloomfield, Sally F; Rook, Graham Aw; Scott, Elizabeth A; Shanahan, Fergus; Stanwell-Smith, Rosalind; Turner, Paul

    2016-07-01

    To review the burden of allergic and infectious diseases and the evidence for a link to microbial exposure, the human microbiome and immune system, and to assess whether we could develop lifestyles which reconnect us with exposures that could reduce the risk of allergic disease while also protecting against infectious disease. Using methodology based on the Delphi technique, six experts in infectious and allergic disease were surveyed to allow for elicitation of group judgement and a consensus view on issues pertinent to the aim. Key themes emerged where evidence shows that interaction with microbes that inhabit the natural environment and the human microbiome plays an essential role in immune regulation. Changes in lifestyle and environmental exposure, rapid urbanisation, altered diet and antibiotic use have had profound effects on the human microbiome, leading to failure of immunotolerance and increased risk of allergic disease. Although evidence supports the concept of immune regulation driven by microbe-host interactions, the term 'hygiene hypothesis' is a misleading misnomer. There is no good evidence that hygiene, as the public understands it, is responsible for the clinically relevant changes to microbial exposures. Evidence suggests a combination of strategies, including natural childbirth, breast feeding, increased social exposure through sport and other outdoor activities, less time spent indoors, diet and appropriate antibiotic use, may help restore the microbiome and perhaps reduce risks of allergic disease. Preventive efforts must focus on early life. The term 'hygiene hypothesis' must be abandoned. Promotion of a risk assessment approach (targeted hygiene) provides a framework for maximising protection against pathogen exposure while allowing spread of essential microbes between family members. To build on these findings, we must change public, public health and professional perceptions about the microbiome and about hygiene. We need to restore public

  6. Indirect Basal Metabolism Estimation in Tailoring Recombinant Human TSH Administration in Patients Affected by Differentiated Thyroid Cancer: A Hypothesis-Generating Study

    Directory of Open Access Journals (Sweden)

    Agnese Barnabei

    2018-02-01

    Full Text Available Purpose: Recombinant human TSH (rhTSH) is currently used in follow-up of patients affected by differentiated thyroid cancer (DTC). Age, sex, weight, body mass index, body surface area (BSA) and renal function are known factors affecting serum TSH peak levels, but the proper rhTSH dose to deliver to a single patient remains elusive. In this study, the correlations of basal metabolic rates with serum TSH peak following rhTSH administration were investigated. Methods: We evaluated 221 patients affected by thyroid cancer who received a standard dose of rhTSH. Blood samples were collected at pre-established time points. Data on body weight, height, and BSA were collected. The Mifflin-St Jeor and Fleisch equations were used to assess basal metabolism. Results: The median value (range) of serum TSH peaks was 142 ± 53 μU/ml. Serum TSH peaks were significantly lower in males than in females (p = 0.04). TSH values also increased with age. Data showed a significant decrease of TSH peak levels at day 3 from the administration of rhTSH when basal metabolic rates increased (p = 0.002 and p = 0.009, respectively). Similar findings were observed at day 5 (p = 0.004 and p = 0.04, respectively). A multivariate analysis of several factors revealed that patients’ basal metabolism (obtained using the Mifflin-St Jeor but not the Fleisch equation) predicts serum TSH level peak at day 3 (p < 0.001). These results were used to generate a new formula based on the Mifflin-St Jeor equation, which emerges as a promising tool in tailoring the rhTSH dose. Conclusion: Basal metabolism appears to be a useful factor in tailoring the diagnostic rhTSH dose in patients affected by DTC.

  7. Physiopathological Hypothesis of Cellulite

    OpenAIRE

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions are asked concerning this condition, including as regards its name, the consensus about the histopathological findings, the physiological hypothesis and treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct ...

  8. Variability: A Pernicious Hypothesis.

    Science.gov (United States)

    Noddings, Nel

    1992-01-01

    The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)

  9. A battery-operated pilot balloon time-signal generator

    Science.gov (United States)

    Ralph H. Moltzau

    1966-01-01

    Describes the design and construction of a 1-pound, battery-operated, time-signal transmitter, which is usable with portable radio or field telephone circuits for synchronizing multi-theodolite observation of pilot balloons.

  10. Generate stepper motor linear speed profile in real time

    Science.gov (United States)

    Stoychitch, M. Y.

    2018-01-01

    In this paper we consider the problem of realizing a linear speed profile for stepper motors in real time. We consider the general case in which the rates of speed change during the acceleration and deceleration phases differ. A new and practical trajectory-planning algorithm is given. Algorithms for real-time speed control suitable for implementation on microcontrollers and FPGA circuits are proposed. A practical realization of one of these algorithms, using the Arduino platform, is also given.
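    The core of such an algorithm is computing the time interval before each step pulse so that speed rises linearly, holds, then falls at a different rate. A minimal sketch under assumed (illustrative) motor parameters, using the standard kinematic relation v(n) = sqrt(v0² + 2·a·n) for the speed after n steps of constant acceleration:

```python
import math

# Sketch of linear (trapezoidal) speed-profile generation for a stepper
# motor: per-step time intervals for an acceleration ramp, a constant-speed
# plateau, and a deceleration ramp with a *different* rate, matching the
# general case considered in the paper. All numbers are illustrative.

V_MIN = 100.0    # start/end speed, steps/s
V_MAX = 2000.0   # plateau speed, steps/s
A_UP = 4000.0    # acceleration, steps/s^2
A_DOWN = 2000.0  # deceleration, steps/s^2

def ramp_delays(v0, v1, a):
    """Step delays (seconds) while speed ramps linearly from v0 to v1 at rate a.
    Kinematics: after n steps of travel, v(n) = sqrt(v0^2 + 2*a*n)."""
    n_steps = int((v1 * v1 - v0 * v0) / (2 * a))
    return [1.0 / math.sqrt(v0 * v0 + 2 * a * n) for n in range(n_steps)]

accel = ramp_delays(V_MIN, V_MAX, A_UP)
decel = ramp_delays(V_MIN, V_MAX, A_DOWN)[::-1]  # a ramp-up, mirrored
plateau = [1.0 / V_MAX] * 50                     # 50 constant-speed steps

profile = accel + plateau + decel  # delay before each successive step pulse
print(len(accel), len(decel), round(sum(profile), 3))
```

    On a microcontroller these delays would be converted to timer reload values; because A_DOWN is half of A_UP, the deceleration ramp here needs roughly twice as many steps as the acceleration ramp.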

  11. Space-time trajectories of wind power generation: Parameterized precision matrices under a Gaussian copula approach

    DEFF Research Database (Denmark)

    Tastu, Julija; Pinson, Pierre; Madsen, Henrik

    2015-01-01

    Emphasis is placed on generating space-time trajectories of wind power generation, consisting of paths sampled from high-dimensional joint predictive densities, describing wind power generation at a number of contiguous locations and successive lead times. A modelling approach taking advantage...
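    The sampling step behind such trajectories can be sketched in miniature: draw correlated Gaussians, map them to uniforms with the normal CDF, and (in a real application) push each uniform through that location/lead-time's inverse predictive CDF. The two-lead-time case with a single assumed correlation parameter below is purely illustrative, not the parameterized precision-matrix model of the paper.

```python
import math
import random

# Sketch of trajectory sampling via a Gaussian copula (illustrative,
# 2-dimensional case with one correlation parameter).

RHO = 0.8  # assumed correlation between successive lead times
random.seed(42)

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sample_trajectory():
    """One path of correlated uniforms for two successive lead times."""
    z1 = random.gauss(0.0, 1.0)
    # Cholesky factor of [[1, RHO], [RHO, 1]] applied to independent normals
    z2 = RHO * z1 + math.sqrt(1.0 - RHO * RHO) * random.gauss(0.0, 1.0)
    return normal_cdf(z1), normal_cdf(z2)

paths = [sample_trajectory() for _ in range(5000)]
mean_u1 = sum(u1 for u1, _ in paths) / len(paths)
# Fraction of paths where both lead times fall on the same side of the median
concordant = sum((u1 - 0.5) * (u2 - 0.5) > 0 for u1, u2 in paths) / len(paths)
print(round(mean_u1, 2), round(concordant, 2))
```

    Each margin stays uniform (so marginal forecast calibration is preserved) while successive lead times tend to move together, which is exactly what makes sampled paths look like plausible wind-power trajectories rather than independent draws.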

  12. Automatic run-time provenance capture for scientific dataset generation

    Science.gov (United States)

    Frew, J.; Slaughter, P.

    2008-12-01

    Provenance---the directed graph of a dataset's processing history---is difficult to capture effectively. Human-generated provenance, as narrative metadata, is labor-intensive and thus often incorrect, incomplete, or simply not recorded. Workflow systems capture some provenance implicitly in workflow specifications, but these systems are not ubiquitous or standardized, and a workflow specification may not capture all of the factors involved in a dataset's production. System audit trails capture potentially all processing activities, but not the relationships between them. We describe a system that transparently (i.e., without any modification to science codes) and automatically (i.e., without any human intervention) captures the low-level interactions (files read/written, parameters accessed, etc.) between scientific processes, and then synthesizes these relationships into a provenance graph. This system---the Earth System Science Server (ES3)---is sufficiently general that it can accommodate any combination of stand-alone programs, interpreted codes (e.g. IDL), and command-language scripts. Provenance in ES3 can be published in well-defined XML formats (including formats suitable for graphical visualization), and queried to determine the ancestors or descendants of any specific data file or process invocation. We demonstrate how ES3 can be used to capture the provenance of a large operational ocean color dataset.
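    The synthesis step described above — turning low-level read/write events into a queryable provenance graph — can be sketched as follows. The event records and the tiny two-process workflow are invented for illustration; they are not ES3's actual event format.

```python
# Sketch: synthesize a provenance graph from (process, action, file) events,
# where reads link files into a process and writes link the process into
# files. The workflow below is a made-up two-stage example.

events = [
    ("p1", "read",  "raw.dat"),
    ("p1", "write", "calibrated.dat"),
    ("p2", "read",  "calibrated.dat"),
    ("p2", "write", "mapped.png"),
]

# parents[node] = set of nodes this node was directly derived from
parents = {}
for proc, action, path in events:
    if action == "read":
        parents.setdefault(proc, set()).add(path)
    else:  # write
        parents.setdefault(path, set()).add(proc)

def ancestors(node):
    """All transitive ancestors (files and processes) of a node."""
    seen, stack = set(), list(parents.get(node, ()))
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(parents.get(n, ()))
    return seen

print(sorted(ancestors("mapped.png")))
```

    A descendant query is the same traversal over the inverted edge map; publishing the `parents` structure as XML would correspond to the export formats the abstract mentions.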

  13. Correlations of artificially generated three component time histories

    International Nuclear Information System (INIS)

    Chen, C.; Lee, J.P.

    1975-01-01

    It is common practice in the industry to consider three-component inputs in the seismic resistant design of nuclear power plant facilities. One can either input the three components one by one and combine the peak responses by a root-sum-square approach, or input the three components simultaneously and combine the responses algebraically at each time increment. When the latter approach is applied, a question arises as to how the three components of the ground motion should correlate with one another. In this paper the authors calculate variances and covariances of strong motion accelerograms recorded at 104 sites using the assumption of an ergodic process. Examination of the results reveals that the majority of the covariances between the vertical and horizontal directions are higher than those of the two randomly oriented horizontal directions. This is an indication that the vertical axis is not always one of the principal axes. The contradiction is probably caused by the assumption that all three components have the same intensity function, as assumed by Penzien and Watabe. In reality, they are not the same; hence the direction of the principal axes of ground motion is, in general, a function of time. Thus, statistically uncorrelated time histories cannot be used as a criterion. Since the industry needs a criterion to define the correlations among the three components, this paper calculates the statistical values of the correlation coefficients from the recorded accelerograms at 104 sites of random orientation and recommends those values as the criterion of correlation among the three components
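    The statistic at the heart of the analysis is the ordinary correlation coefficient between two sampled time histories. A minimal sketch, using synthetic records rather than real accelerograms (the second "component" is deliberately built to be partially correlated with the first):

```python
import math

# Sketch: Pearson correlation coefficient between two ground-motion
# components, computed from sampled time histories. The synthetic records
# below are illustrative, not recorded accelerograms.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
    return cov / (sx * sy)

# Two synthetic "components": the second partially correlated with the first
horiz = [math.sin(0.1 * t) for t in range(200)]
vert  = [0.6 * h + 0.4 * math.cos(0.37 * t) for t, h in enumerate(horiz)]

r = pearson(horiz, vert)
print(round(r, 3))
```

    Applied to the vertical/horizontal component pairs of each recorded accelerogram, coefficients computed this way are what the paper tabulates across the 104 sites.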

  14. Structural brain imaging correlates of ASD and ADHD across the lifespan : A hypothesis-generating review on developmental ASD-ADHD subtypes

    NARCIS (Netherlands)

    Rommelse, Nanda; Buitelaar, Jan K.; Hartman, Catharina A.

    We hypothesize that it is plausible that biologically distinct developmental ASD-ADHD subtypes are present, each characterized by a distinct time of onset of symptoms, progression and combination of symptoms. The aim of the present narrative review was to explore if structural brain imaging studies

  15. Correlations of artificially generated three component time histories

    International Nuclear Information System (INIS)

    Chen, C.; Lee, J.P.

    1975-01-01

    The authors calculate variances and covariances of strong motion accelerograms recorded at 104 sites using the assumption of an ergodic process. Examination of the results reveals that the majority of the covariances between the vertical and horizontal directions are higher than those of the two randomly oriented horizontal directions. This is an indication that the vertical axis is not always one of the principal axes. The contradiction is probably caused by the assumption that all three components have the same intensity function, as assumed by Penzien and Watabe. In reality, they are not the same; hence the direction of the principal axes of ground motion is, in general, a function of time. Thus, statistically uncorrelated time histories cannot be used as a criterion. Since the industry needs a criterion to define the correlations among the three components, this paper calculates the statistical values of the correlation coefficients from the recorded accelerograms at 104 sites of random orientation and recommends those values as the criterion of correlation among the three components. (Auth.)

  16. Physiopathological Hypothesis of Cellulite

    Science.gov (United States)

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions are asked concerning this condition, including as regards its name, the consensus about the histopathological findings, the physiological hypothesis and treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite and the technique employed are fundamental to success. PMID:19756187

  17. Life Origination Hydrate Hypothesis (LOH-Hypothesis).

    Science.gov (United States)

    Ostrovskii, Victor; Kadyshevich, Elena

    2012-01-04

    The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs, which are N-bases, riboses, nucleosides, nucleotides), DNA- and RNA-like molecules, amino-acids, and proto-cells repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction that is determined by a gradual decrease in the Gibbs free energy of reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix filled with LMSEs almost completely in its final state accounts for size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their "thermodynamic front" guide the gross process of living matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  18. Life Origination Hydrate Hypothesis (LOH-Hypothesis)

    Directory of Open Access Journals (Sweden)

    Victor Ostrovskii

    2012-01-01

    Full Text Available The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs, which are N-bases, riboses, nucleosides, nucleotides), DNA- and RNA-like molecules, amino-acids, and proto-cells repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction that is determined by a gradual decrease in the Gibbs free energy of reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix filled with LMSEs almost completely in its final state accounts for size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their “thermodynamic front” guide the gross process of living matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  19. Exploring new potentials and generating hypothesis for management of locally advanced head neck cancer: Analysis of pooled data from two phase II trials

    Directory of Open Access Journals (Sweden)

    Chufal Kundan

    2010-01-01

    Full Text Available Background: To study the long-term results of two phase II concurrent chemoradiotherapy protocols and conduct a pooled data analysis with special emphasis on nodal density. Materials and Methods: In the period from April 2001 to May 2003, phase II Mitomycin C (MMC) and late chemo-intensification (LCI) protocols were started in the same institute, enrolling 69 and 74 patients respectively. Long-term results for these individual trials are reported along with the pooled data analysis. Results: Median follow-up time for the whole group, the MMC protocol and the LCI protocol was 43.8 months (SD 19.8), 55 months (SD 18.5) and 47.5 months (SD 20.9) respectively. LRFS, DFS and OS at five years for the whole group were 59.4%, 43.5% and 47.1% respectively; for the MMC protocol, 59.9%, 45.5% and 49.5%; and for the LCI protocol, 53.6%, 41.5% and 44.4%. Subgroup analysis revealed that the MMC protocol was more effective than the LCI protocol in terms of DFS and OS in patients with hypodense nodes, while the opposite was true for isodense nodes. Multivariate analysis revealed nodal density as an independent variable with an impact on treatment outcome. The risk of death in patients with hypodense nodes was 2.91 times that of patients with isodense nodes. Conclusions: An innovative and pragmatic approach is required to address locally advanced head and neck cancer. Long-term results for the MMC and LCI protocols are encouraging. Integrating the basic concepts of these protocols may help develop new protocols, which will facilitate the search for the optimal solution.

  20. Structural brain imaging correlates of ASD and ADHD across the lifespan: a hypothesis-generating review on developmental ASD-ADHD subtypes.

    Science.gov (United States)

    Rommelse, Nanda; Buitelaar, Jan K; Hartman, Catharina A

    2017-02-01

    We hypothesize that it is plausible that biologically distinct developmental ASD-ADHD subtypes are present, each characterized by a distinct time of onset of symptoms, progression and combination of symptoms. The aim of the present narrative review was to explore whether structural brain imaging studies may shed light on key brain areas that are linked to both ASD and ADHD symptoms and undergo significant changes during development. These findings may possibly point to brain mechanisms underlying differential developmental ASD-ADHD subtypes. To this end we brought together the literature on structural brain imaging in ASD and ADHD, particularly highlighting the adolescent years and beyond. Findings indicate that the vast majority of existing MRI studies have been cross-sectional and conducted in children; some included adolescents as well, but without explicit documentation of this age group. MRI studies documenting age effects in adults with ASD and/or ADHD are rare, and if age is taken into account, only linear effects are examined. Data from various studies suggest that a crucial distinctive feature underlying different developmental ASD-ADHD subtypes may be the differential developmental thinning patterns of the anterior cingulate cortex and related connections towards other prefrontal regions. These regions are crucial for the development of cognitive/effortful control and socio-emotional functioning, with impairments in these features as key to both ASD and ADHD.

  1. Using Clinical Data, Hypothesis Generation Tools and PubMed Trends to Discover the Association between Diabetic Retinopathy and Antihypertensive Drugs

    Energy Technology Data Exchange (ETDEWEB)

    Senter, Katherine G [ORNL; Sukumar, Sreenivas R [ORNL; Patton, Robert M [ORNL; Chaum, Ed [University of Tennessee, Knoxville (UTK)

    2015-01-01

    Diabetic retinopathy (DR) is a leading cause of blindness and common complication of diabetes. Many diabetic patients take antihypertensive drugs to prevent cardiovascular problems, but these drugs may have unintended consequences on eyesight. Six common classes of antihypertensive drug are angiotensin converting enzyme (ACE) inhibitors, alpha blockers, angiotensin receptor blockers (ARBs), β-blockers, calcium channel blockers, and diuretics. Analysis of medical history data might indicate which of these drugs provide safe blood pressure control, and a literature review is often used to guide such analyses. Beyond manual reading of relevant publications, we sought to identify quantitative trends in literature from the biomedical database PubMed to compare with quantitative trends in the clinical data. By recording and analyzing PubMed search results, we found wide variation in the prevalence of each antihypertensive drug in DR literature. Drug classes developed more recently such as ACE inhibitors and ARBs were most prevalent. We also identified instances of change-over-time in publication patterns. We then compared these literature trends to a dataset of 500 diabetic patients from the UT Hamilton Eye Institute. Data for each patient included class of antihypertensive drug, presence and severity of DR. Graphical comparison revealed that older drug classes such as diuretics, calcium channel blockers, and β-blockers were much more prevalent in the clinical data than in the DR and antihypertensive literature. Finally, quantitative analysis of the dataset revealed that patients taking β-blockers were statistically more likely to have DR than patients taking other medications, controlling for presence of hypertension and year of diabetes onset. This finding was concerning given the prevalence of β-blockers in the clinical data. We determined that clinical use of β-blockers should be minimized in diabetic patients to prevent retinal damage.

  2. Indirect Basal Metabolism Estimation in Tailoring Recombinant Human TSH Administration in Patients Affected by Differentiated Thyroid Cancer: A Hypothesis-Generating Study.

    Science.gov (United States)

    Barnabei, Agnese; Strigari, Lidia; Persichetti, Agnese; Baldelli, Roberto; Rizza, Laura; Annoscia, Claudia; Lauretta, Rosa; Cigliana, Giovanni; Barba, Maddalena; De Leo, Aurora; Appetecchia, Marialuisa; Torino, Francesco

    2018-01-01

    Recombinant human TSH (rhTSH) is currently used in the follow-up of patients affected by differentiated thyroid cancer (DTC). Age, sex, weight, body mass index, body surface area (BSA) and renal function are known factors affecting serum TSH peak levels, but the proper rhTSH dose to deliver to a single patient remains elusive. In this study, the correlations of basal metabolic rates with the serum TSH peak following rhTSH administration were investigated. We evaluated 221 patients affected by thyroid cancer who received a standard dose of rhTSH. Blood samples were collected at pre-established time points. Data on body weight, height, and BSA were collected. The Mifflin-St Jeor and Fleisch equations were used to assess basal metabolism. Serum TSH peaks were 142 ± 53 μU/ml. Serum TSH peaks were significantly lower in males than in females (p = 0.04). TSH values also increased with age. Data showed a significant decrease of TSH peak levels at day 3 from the administration of rhTSH when basal metabolic rates increased (p = 0.002 and p = 0.009, respectively). Similar findings were observed at day 5 (p = 0.004 and p = 0.04, respectively). A multivariate analysis of several factors revealed that patients' basal metabolism (obtained using the Mifflin-St Jeor but not the Fleisch equation) predicts the serum TSH peak at day 3. Basal metabolism thus appears to be a useful factor in tailoring the diagnostic rhTSH dose in patients affected by DTC.

  3. Real-time Java API Specifications for High Coverage Test Generation

    NARCIS (Netherlands)

    Ahrendt, W.; Mostowski, Wojciech; Paganelli, G.

    2012-01-01

    We present the test case generation method and tool KeY-TestGen in the context of real-time Java applications and libraries. The generated tests feature strong coverage criteria, like the Modified Condition/Decision Criterion, by construction. This is achieved by basing the test generation on formal

  4. A generation-time effect on the rate of molecular evolution in bacteria.

    Science.gov (United States)

    Weller, Cory; Wu, Martin

    2015-03-01

    Molecular evolutionary rate varies significantly among species and a strict global molecular clock has been rejected across the tree of life. Generation time is one primary life-history trait that influences the molecular evolutionary rate. Theory predicts that organisms with shorter generation times evolve faster because of the accumulation of more DNA replication errors per unit time. Although the generation-time effect has been demonstrated consistently in plants and animals, evidence of its existence in bacteria is lacking. The bacterial phylum Firmicutes offers an excellent system for testing the generation-time effect because some of its members can enter a dormant, nonreproductive endospore state in response to harsh environmental conditions. It follows that spore-forming bacteria, with their longer generation times, would evolve more slowly than their nonspore-forming relatives. It is therefore surprising that a previous study found no generation-time effect in Firmicutes. Using a phylogenetic comparative approach and leveraging a large number of Firmicutes genomes, we found that sporulation significantly reduces the genome-wide spontaneous DNA mutation rate and protein evolutionary rate. Contrary to the previous study, our results provide strong evidence that the evolutionary rates of bacteria, like those of plants and animals, are influenced by generation time. © 2015 The Author(s).

  5. Use of erroneous wolf generation time in assessments of domestic dog and human evolution

    Science.gov (United States)

    Mech, L. David; Barber-meyer, Shannon

    2017-01-01

    Scientific interest in dog domestication and the parallel evolution of dogs and humans (Wang et al. 2013) has increased recently (Freedman et al. 2014, Larson and Bradley 2014, Franz et al. 2016), and various important conclusions have been drawn based on how long ago the calculations show dogs were domesticated from ancestral wolves (Canis lupus). Calculation of this duration is based on “the most commonly assumed mutation rate of 1 x 10^-8 per generation and a 3-year gray wolf generation time . . .” (Skoglund et al. 2015:3). It is unclear on what information the assumed generation time is based, but Ersmark et al. (2016) seem to have based their assumption on a single wolf (Mech and Seal 1987). The importance of assuring that such assumptions are valid is obvious. Recently, two independent studies employing three large data sets and three methods from two widely separated areas have found that wolf generation time is 4.2-4.7 years. The first study, based on 200 wolves in Yellowstone National Park, used age-specific birth and death rates to calculate a generation time of 4.16 years (vonHoldt et al. 2008). The second, using estimated first-breeding times of 86 female wolves in northeastern Minnesota, found a generation time of 4.3 years, and using uterine examination of 159 female wolves from throughout Minnesota yielded a generation time of 4.7 years (Mech et al. 2016). We suggest that previous studies using a 3-year generation time recalculate their figures, adjust their conclusions based on these generation times, and publish revised results.
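
    The suggested recalculation is a simple rescaling: divergence estimates derived from a per-generation mutation rate fix the number of generations, so the calendar date scales linearly with the assumed generation time. A sketch, with a hypothetical published date rather than any specific study's figure:

```python
def rescale_date(years_before_present, old_gen_time, new_gen_time):
    """Divergence dates from per-generation mutation rates fix the number of
    generations; the calendar estimate therefore scales linearly with the
    assumed generation time."""
    generations = years_before_present / old_gen_time
    return generations * new_gen_time

# Hypothetical example: a 15,000-year estimate made with a 3-year wolf
# generation time becomes 21,500 years with the 4.3-year Minnesota estimate.
print(rescale_date(15_000, 3.0, 4.3))  # 21500.0
```

    Moving from 3 to 4.2-4.7 years thus pushes any mutation-rate-based domestication date back by roughly 40-57%, which is why the authors call for published figures to be revised.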

  6. Transient Stability Analysis of Induction Generator Using Time Domain Torque Characteristic

    Science.gov (United States)

    Senjyu, Tomonobu; Sueyoshi, Norihide; Uezato, Katsumi; Fujita, Hideki; Funabashi, Toshihisa

    Transient stability assessment of wind power generators is one of the main issues in power system security and operation. The transient stability of a wind power generator is determined by its corresponding Critical Clearing Time (CCT). In this paper, we present formulae for transient behavior analysis and a transient stability analysis technique for induction generators used in wind power generating systems under three-phase fault conditions. In the proposed method, the transient stability of the induction generator is analyzed using the well-known torque-slip and generator speed-time characteristics. The validity of the developed technique is confirmed against results obtained by a trial-and-error method using MATLAB/SIMULINK.

  7. Development of a Marx-coupled trigger generator with high voltages and low time delay

    Science.gov (United States)

    Hu, Yixiang; Zeng, Jiangtao; Sun, Fengju; Cong, Peitian; Su, Zhaofeng; Yang, Shi; Zhang, Xinjun; Qiu, Ai'ci

    2016-10-01

    Coupled to the Marx generator of the "JianGuang-I" facility, a high-voltage, low-time-delay trigger generator was developed. The working principles of this trigger generator and its key issues are described in detail. The structure of the generator was also carefully designed and optimized. A test stand was established based on the "JianGuang-I" Marx generator, and a series of experiments was carried out to study the performance of the trigger generator. Experimental results show that the output voltage of the trigger generator can be continuously adjusted from 58 kV to 384 kV. The time delay (from the beginning of the Marx-discharging pulse to the time that the output pulse of the trigger generator rises) is about 200 ns, and the rise time (0%~100%) of the pulse is less than 50 ns. Experimental results also indicate that the time-delay jitter of the trigger voltage decreases rapidly with increasing peak voltage of the trigger pulses. When the trigger voltage is higher than 250 kV, the time-delay jitter (the standard deviation) is less than 7.7 ns.

  8. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    2017-01-01

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  9. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  10. Event rate and reaction time performance in ADHD: Testing predictions from the state regulation deficit hypothesis using an ex-Gaussian model.

    Science.gov (United States)

    Metin, Baris; Wiersema, Jan R; Verguts, Tom; Gasthuys, Roos; van Der Meere, Jacob J; Roeyers, Herbert; Sonuga-Barke, Edmund

    2014-12-06

    According to the state regulation deficit (SRD) account, ADHD is associated with a problem using effort to maintain an optimal activation state under demanding task settings such as very fast or very slow event rates. This leads to a prediction of disrupted performance at event rate extremes reflected in higher Gaussian response variability that is a putative marker of activation during motor preparation. In the current study, we tested this hypothesis using ex-Gaussian modeling, which distinguishes Gaussian from non-Gaussian variability. Twenty-five children with ADHD and 29 typically developing controls performed a simple Go/No-Go task under four different event-rate conditions. There was an accentuated quadratic relationship between event rate and Gaussian variability in the ADHD group compared to the controls. The children with ADHD had greater Gaussian variability at very fast and very slow event rates but not at moderate event rates. The results provide evidence for the SRD account of ADHD. However, given that this effect did not explain all group differences (some of which were independent of event rate) other cognitive and/or motivational processes are also likely implicated in ADHD performance deficits.
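
    The ex-Gaussian model referred to above describes a response time as the sum of a Gaussian component (mu, sigma) and an exponential component (tau), with sigma serving as the marker of Gaussian variability and tau capturing the slow, skewed tail. A minimal simulation sketch; the parameter values are illustrative only, not the study's estimates:

```python
import random
import statistics

def sample_ex_gaussian(mu, sigma, tau, n, seed=1):
    """Ex-Gaussian response times: a Gaussian (mu, sigma) draw plus an
    exponential tail with mean tau; sigma is the 'Gaussian variability'
    component, tau the non-Gaussian tail."""
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) + rng.expovariate(1.0 / tau) for _ in range(n)]

# Illustrative parameters in ms; analytically, E[RT] = mu + tau and
# SD[RT] = sqrt(sigma**2 + tau**2).
rts = sample_ex_gaussian(mu=400.0, sigma=40.0, tau=100.0, n=50_000)
print(round(statistics.mean(rts)), round(statistics.pstdev(rts)))
```

    Fitting mu, sigma and tau separately per event-rate condition is what allows the study to attribute group differences specifically to the Gaussian component.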

  11. Just-in-time characterization and certification of DOE-generated wastes

    Energy Technology Data Exchange (ETDEWEB)

    Arrenholz, D.A.; Primozic, F.J. [Benchmark Environmental Corp., Albuquerque, NM (United States); Robinson, M.A. [Los Alamos National Lab., NM (United States)

    1995-12-31

    Transportation and disposal of wastes generated by Department of Energy (DOE) activities, including weapons production and decontamination and decommissioning (D&D) of facilities, require that wastes be certified as complying with various regulations and requirements. These certification requirements are typically summarized by disposal sites in their specific waste acceptance criteria. Although a large volume of DOE wastes has been generated by past activities and is presently in storage awaiting disposal, a significant volume of DOE wastes, particularly from D&D projects, has not yet been generated. To prepare DOE-generated wastes for disposal in an efficient manner, it is suggested that a program of just-in-time characterization and certification be adopted. The goal of just-in-time characterization and certification, which is based on the just-in-time manufacturing process, is to streamline the certification process by eliminating redundant layers of oversight and establishing pro-active waste management controls. Just-in-time characterization and certification would rely on a waste management system in which wastes are characterized at the point of generation, precertified as they are generated (i.e., without iterative inspections and tests subsequent to generation and storage), and certified at the point of shipment, ideally the loading dock of the building from which the wastes are generated. Waste storage would be limited to accumulating containers for cost-efficient transportation.

  12. Just-in-time characterization and certification of DOE-generated wastes

    International Nuclear Information System (INIS)

    Arrenholz, D.A.; Primozic, F.J.; Robinson, M.A.

    1995-01-01

    Transportation and disposal of wastes generated by Department of Energy (DOE) activities, including weapons production and decontamination and decommissioning (D&D) of facilities, require that wastes be certified as complying with various regulations and requirements. These certification requirements are typically summarized by disposal sites in their specific waste acceptance criteria. Although a large volume of DOE wastes has been generated by past activities and is presently in storage awaiting disposal, a significant volume of DOE wastes, particularly from D&D projects, has not yet been generated. To prepare DOE-generated wastes for disposal in an efficient manner, it is suggested that a program of just-in-time characterization and certification be adopted. The goal of just-in-time characterization and certification, which is based on the just-in-time manufacturing process, is to streamline the certification process by eliminating redundant layers of oversight and establishing pro-active waste management controls. Just-in-time characterization and certification would rely on a waste management system in which wastes are characterized at the point of generation, precertified as they are generated (i.e., without iterative inspections and tests subsequent to generation and storage), and certified at the point of shipment, ideally the loading dock of the building from which the wastes are generated. Waste storage would be limited to accumulating containers for cost-efficient transportation.

  13. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  14. Thunderstorm Hypothesis Reasoner

    Science.gov (United States)

    Mulvehill, Alice M.

    1994-01-01

    THOR is a knowledge-based system which incorporates techniques from signal processing, pattern recognition, and artificial intelligence (AI) in order to determine the boundary of small thunderstorms which develop and dissipate over the area encompassed by KSC and the Cape Canaveral Air Force Station. THOR interprets electric field mill data (derived from a network of electric field mills) by using heuristics and algorithms about thunderstorms that have been obtained from several domain specialists. THOR generates two forms of output: contour plots which visually describe the electric field activity over the network and a verbal interpretation of the activity. THOR uses signal processing and pattern recognition to detect signatures associated with noise or thunderstorm behavior in near real time from over 31 electric field mills. THOR's AI component generates hypotheses identifying areas which are under threat from storm activity, such as lightning. THOR runs on a VAX/VMS at the Kennedy Space Center. Its software is a coupling of C and FORTRAN programs, several signal processing packages, and an expert system development shell.

  15. Strong Normalization by Type-Directed Partial Evaluation and Run-Time Code Generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1997-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....

  16. Strong normalization by type-directed partial evaluation and run-time code generation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    1998-01-01

    We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....

  17. The Lehman Sisters Hypothesis

    NARCIS (Netherlands)

    I.P. van Staveren (Irene)

    2014-01-01

    Abstract: This article explores the Lehman Sisters Hypothesis. It reviews empirical literature about gender differences in behavioral, experimental, and neuro-economics as well as in other fields of behavioral research. It discusses gender differences along three dimensions of

  18. Revisiting the Dutch hypothesis

    NARCIS (Netherlands)

    Postma, Dirkje S.; Weiss, Scott T.; van den Berge, Maarten; Kerstjens, Huib A. M.; Koppelman, Gerard H.

    The Dutch hypothesis was first articulated in 1961, when many novel and advanced scientific techniques were not available, such as genomics techniques for pinpointing genes, gene expression, lipid and protein profiles, and the microbiome. In addition, computed tomographic scans and advanced analysis

  19. Biostatistics series module 2: Overview of hypothesis testing

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2016-01-01

    Full Text Available Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference, which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test.
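
    The decision procedure described above (compute a P value; reject the null hypothesis if P < 0.05) can be illustrated with a large-sample two-sided z-test for a difference in means. The two groups of measurements below are hypothetical:

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def two_sample_z_test(x, y):
    """Two-sided large-sample z-test for a difference in means. For small
    samples a t-test with appropriate degrees of freedom is the right tool;
    this sketch only illustrates the null-hypothesis / P-value logic."""
    se = sqrt(stdev(x) ** 2 / len(x) + stdev(y) ** 2 / len(y))
    z = (mean(x) - mean(y)) / se
    p = 2.0 * (1.0 - NormalDist().cdf(abs(z)))  # P(result at least this extreme | H0)
    return z, p

# Hypothetical measurements for two groups:
control = [5.1, 4.9, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9, 5.0, 5.1]
treated = [5.6, 5.4, 5.5, 5.7, 5.3, 5.5, 5.6, 5.4, 5.5, 5.6]
z, p = two_sample_z_test(treated, control)
print(p < 0.05)  # True: reject the null hypothesis at the 5% level
```

    The null hypothesis here is "no difference in means"; the P value is the probability of a z statistic at least this extreme if that null were true.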

  20. Real-time monitoring during transportation of a radioisotope thermoelectric generator (RTG) using the radioisotope thermoelectric generator transportation system (RTGTS)

    Science.gov (United States)

    Pugh, Barry K.

    1997-01-01

    The Radioisotope Thermoelectric Generators (RTGs) that will be used to support the Cassini mission will be transported in the Radioisotope Thermoelectric Generator Transportation System (RTGTS). To ensure that the RTGs will not be affected during transportation, all parameters that could adversely affect RTG performance must be monitored. The Instrumentation and Data Acquisition System (IDAS) for the RTGTS displays, monitors, and records all critical packaging and trailer system parameters. The IDAS also monitors the package temperature control system, RTG package shock and vibration data, and diesel fuel levels for the diesel fuel tanks. The IDAS alarms if any of these parameters reaches an out-of-limit condition. This paper discusses the real-time monitoring during transportation of the Cassini RTGs using the RTGTS IDAS.

  1. Mobile charge generation dynamics in P3HT: PCBM observed by time-resolved terahertz spectroscopy

    DEFF Research Database (Denmark)

    Cooke, D. G.; Krebs, Frederik C; Jepsen, Peter Uhd

    2012-01-01

    Ultra-broadband time-resolved terahertz spectroscopy is used to examine the sub-ps conductivity dynamics of a conjugated polymer bulk heterojunction film P3HT:PCBM. We directly observe mobile charge generation dynamics on a sub-100 fs time scale.

  2. Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems

    Science.gov (United States)

    Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.

    2008-08-01

    This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners active in the real-time embedded systems domain. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.

  3. An X-ray CCD signal generator with true random arrival time

    International Nuclear Information System (INIS)

    Huo Jia; Xu Yuming; Chen Yong; Cui Weiwei; Li Wei; Zhang Ziliang; Han Dawei; Wang Yusan; Wang Juan

    2011-01-01

    An FPGA-based true random signal generator with adjustable amplitude and exponentially distributed time intervals is presented. Since traditional true random number generators (TRNGs) are resource-costly and difficult to port, we employed a method of random number generation based on jitter and phase noise in ring oscillators formed by gates in an FPGA. In order to improve the randomness characteristics, a combination of two different pseudo-random processing circuits is used for post-processing. The effects of the design parameters, such as the sampling frequency, are discussed. Statistical tests indicate that the generator can well simulate the timing behavior of random signals with Poisson distribution. The X-ray CCD signal generator will be used in debugging the CCD readout system of the Low Energy X-ray Instrument onboard the Hard X-ray Modulation Telescope (HXMT). (authors)
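
    The exponential distribution of time intervals mentioned above is the defining property of a homogeneous Poisson process, which is what makes it the right model for X-ray photon arrivals. A minimal software sketch of such event timing (a generic simulation, not the record's FPGA implementation):

```python
import random

def poisson_event_times(rate_hz, duration_s, seed=42):
    """Event times of a homogeneous Poisson process: inter-arrival intervals
    are independent and exponentially distributed with mean 1/rate, matching
    the timing behavior the generator is designed to reproduce."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)  # exponential inter-arrival interval
        if t > duration_s:
            return times
        times.append(t)

# Expect about rate * duration = 2000 events on average over a 2 s window.
events = poisson_event_times(rate_hz=1000.0, duration_s=2.0)
print(len(events))
```

    The hardware version replaces `expovariate` with true entropy from ring-oscillator jitter, but the target timing statistics are the same.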

  4. Generation of floor spectra compatible time histories for equipment seismic qualification in nuclear power plants

    International Nuclear Information System (INIS)

    Shyu, Y.-S.; Luh, Gary G.; Blum, Arie

    2004-01-01

    This paper proposes a procedure for generating floor response spectra compatible time histories used for equipment seismic qualification in nuclear power plants. From the 84th percentile power spectrum density function of an earthquake ensemble of four randomly generated time history motions, a statistically equivalent time history can be obtained by converting the power spectrum density function from the frequency domain into the time domain. With minor modification, if needed, the converted time history will satisfy both the spectral and the power spectrum density enveloping criteria, as required by the USNRC per Revision 2 of the Standard Review Plan, Section 3.7.1. Step-by-step generating procedures and two numerical examples are presented to illustrate the applications of the methodology. (author)
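
    The frequency-to-time conversion at the heart of the procedure can be sketched with the spectral-representation method, which synthesizes a motion from a target one-sided PSD as a sum of cosines with random phases. This is a generic illustration under simplifying assumptions (flat PSD, no iteration against spectral or PSD enveloping criteria), not the paper's specific procedure:

```python
import math
import random

def psd_to_time_history(psd, df, seed=0):
    """Spectral-representation method: synthesize a signal whose one-sided
    PSD approximates `psd` by summing cosines with random phases:
    x(t) = sum_k sqrt(2 * S(f_k) * df) * cos(2*pi*f_k*t + phi_k)."""
    rng = random.Random(seed)
    freqs = [(k + 1) * df for k in range(len(psd))]
    amps = [math.sqrt(2.0 * s * df) for s in psd]  # amplitude per frequency bin
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in psd]

    def x(t):
        return sum(a * math.cos(2.0 * math.pi * f * t + p)
                   for a, f, p in zip(amps, freqs, phases))

    return x

# Flat target PSD of 0.01 (units^2/Hz) over 1..20 Hz, sampled at 100 Hz for 4 s;
# the signal variance should approximate the PSD area (20 * 0.01 = 0.2).
x = psd_to_time_history(psd=[0.01] * 20, df=1.0)
samples = [x(n / 100.0) for n in range(400)]
```

    A production procedure would additionally modulate the amplitude envelope in time and iterate until the response-spectrum and PSD enveloping criteria of SRP Section 3.7.1 are met.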

  5. Application of Hilbert-Huang Transform in Generating Spectrum-Compatible Earthquake Time Histories

    OpenAIRE

    Ni, Shun-Hao; Xie, Wei-Chau; Pandey, Mahesh

    2011-01-01

    Spectrum-compatible earthquake time histories have been widely used for seismic analysis and design. In this paper, a data processing method, Hilbert-Huang transform, is applied to generate earthquake time histories compatible with the target seismic design spectra based on multiple actual earthquake records. Each actual earthquake record is decomposed into several components of time-dependent amplitude and frequency by Hilbert-Huang transform. The spectrum-compatible earthquake time history ...

  6. Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction

    International Nuclear Information System (INIS)

    Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao; Zhang, Jun; Pan, Jian-Wei; Zhou, Hongyi; Ma, Xiongfeng

    2016-01-01

    We present a real-time and fully integrated quantum random number generator (QRNG) by measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is featured for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies on the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
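
    Toeplitz hashing, the extraction step named above, is multiplication of the raw bit vector by a binary Toeplitz matrix over GF(2), where the matrix is fully determined by a short uniform seed. A minimal pure-Python sketch; the real system pipelines this in an FPGA and processes much larger blocks:

```python
import random

def toeplitz_extract(raw_bits, diag_bits, m):
    """Randomness extraction by Toeplitz hashing: y = T x over GF(2), where
    the m x n binary Toeplitz matrix T is defined by diag_bits of length
    m + n - 1 via T[i][j] = diag_bits[i - j + n - 1]."""
    n = len(raw_bits)
    assert len(diag_bits) == m + n - 1, "seed length must be m + n - 1"
    out = []
    for i in range(m):
        bit = 0
        for j in range(n):
            bit ^= diag_bits[i - j + n - 1] & raw_bits[j]  # dot product mod 2
        out.append(bit)
    return out

rng = random.Random(7)
raw = [rng.randint(0, 1) for _ in range(256)]            # raw (imperfect) bits
diag = [rng.randint(0, 1) for _ in range(64 + 256 - 1)]  # uniform seed bits
key = toeplitz_extract(raw, diag, m=64)                  # 64 extracted bits
```

    Compressing n raw bits to m < n output bits is what distills near-uniform randomness from the imperfect raw source; the output length is chosen from the estimated min-entropy of the raw bits.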

  7. Retention time generates short-term phytoplankton blooms in a shallow microtidal subtropical estuary

    Science.gov (United States)

    Odebrecht, Clarisse; Abreu, Paulo C.; Carstensen, Jacob

    2015-09-01

    In this study it was hypothesised that increasing water retention time promotes phytoplankton blooms in the shallow microtidal Patos Lagoon estuary (PLE). This hypothesis was tested using salinity variation as a proxy of water retention time and chlorophyll a as a proxy of phytoplankton biomass. Submersible sensors fixed at 5 m depth near the mouth of PLE continuously measured water temperature, salinity and pigment fluorescence (calibrated to chlorophyll a) between March 2010 and 12th of December 2011, with some gaps. Salinity variations were used to separate alternating patterns of outflow of lagoon water (salinity 24; 35% of the time). The two transition phases represented a rapid change from lagoon water outflow to marine water inflow and a more gradually declining salinity between the dominating inflow and outflow conditions. During the latter of these, a significant chlorophyll a increase relative to that expected from a linear mixing relationship was observed at intermediate salinities (10-20). The increase in chlorophyll a was positively related to the duration of the prior coastal water inflow into the PLE. Moreover, the chlorophyll a increase was significantly higher during austral spring-summer than autumn-winter, probably due to higher light and nutrient availability in the former, and the retention-time process operating on time scales of days influences the long-term phytoplankton variability in this ecosystem. Comparing these results with monthly data from a nearby long-term water quality monitoring station (1993-2011) supports the hypothesis that chlorophyll a accumulations occur after marine inflow events, whereas phytoplankton does not accumulate during high water outflow, when the water residence time is short. These results suggest that the changing hydrological pattern is the most important mechanism underlying phytoplankton blooms in the PLE.
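
    The "linear mixing relationship" used above as the null expectation can be made concrete: under conservative mixing, a constituent's concentration varies linearly with salinity between a lagoon endmember and a marine endmember, so a positive deviation at intermediate salinity signals in-situ growth rather than mixing. A sketch with hypothetical endmember values, not the study's measurements:

```python
def mixing_anomaly(salinity, chl_obs, end_lagoon, end_marine):
    """Deviation of observed chlorophyll a from the conservative (linear)
    mixing line between a lagoon endmember and a marine endmember; a positive
    anomaly at intermediate salinity indicates in-situ growth, not mixing."""
    s_l, chl_l = end_lagoon
    s_m, chl_m = end_marine
    frac = (salinity - s_l) / (s_m - s_l)      # fraction of marine water
    expected = chl_l + frac * (chl_m - chl_l)  # conservative mixing value
    return chl_obs - expected

# Hypothetical endmembers: lagoon (S=0, 8 ug/L) and marine (S=30, 2 ug/L);
# an observation of 12 ug/L at salinity 15 exceeds the mixing line by 7 ug/L.
print(mixing_anomaly(15.0, 12.0, end_lagoon=(0.0, 8.0), end_marine=(30.0, 2.0)))  # 7.0
```

    Applying this at the intermediate salinities (10-20) discussed above is what isolates bloom growth from simple dilution of the two water masses.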

  8. From Enumerating to Generating: A Linear Time Algorithm for Generating 2D Lattice Paths with a Given Number of Turns

    Directory of Open Access Journals (Sweden)

    Ting Kuo

    2015-05-01

    Full Text Available We propose a linear time algorithm, called G2DLP, for generating 2D lattice L(n1, n2) paths, equivalent to two-item multiset permutations, with a given number of turns. The term turn has three meanings: in the context of multiset permutations, it means that two consecutive elements of a permutation belong to two different items; in lattice path enumeration, it means that the path changes its direction, either from eastward to northward or from northward to eastward; in open shop scheduling, it means that we transfer a job from one type of machine to another. The strategy of G2DLP is divide-and-combine; the division is based on the enumeration results of a previous study and is achieved with the aid of an integer partition algorithm and a multiset permutation algorithm; the combination is accomplished by a concatenation algorithm that constructs the paths we require. The advantage of G2DLP is twofold. First, it is optimal in the sense that it directly generates all feasible paths without visiting an infeasible one. Second, it can generate all paths in any specified order of turns, for example, a decreasing order or an increasing order. In practice, two applications, scheduling and cryptography, are discussed.
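
    The notion of a turn can be made concrete with a brute-force reference implementation that enumerates all L(n1, n2) paths as E/N multiset permutations and filters by turn count. This is for illustration only; it visits infeasible paths and is emphatically not the linear-time G2DLP algorithm:

```python
from itertools import permutations

def paths_with_turns(n1, n2, k):
    """Brute-force reference: enumerate L(n1, n2) lattice paths as multiset
    permutations of E (east) and N (north) steps, keeping those with exactly
    k turns, i.e. k changes of direction along the path."""
    seen = set()
    for p in permutations('E' * n1 + 'N' * n2):
        if p in seen:          # permutations() repeats identical letters
            continue
        seen.add(p)
        turns = sum(1 for a, b in zip(p, p[1:]) if a != b)
        if turns == k:
            yield ''.join(p)

# All L(3, 2) paths with exactly two turns:
print(sorted(paths_with_turns(3, 2, 2)))  # ['EENNE', 'ENNEE', 'NEEEN']
```

    G2DLP's contribution is to generate exactly these paths directly, in time linear in the output, without the wasted enumeration this sketch performs.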

  9. A hybrid procedure for MSW generation forecasting at multiple time scales in Xiamen City, China

    International Nuclear Information System (INIS)

    Xu, Lilai; Gao, Peiqing; Cui, Shenghui; Liu, Chun

    2013-01-01

    Highlights: ► We propose a hybrid model that combines seasonal SARIMA model and grey system theory. ► The model is robust at multiple time scales with the anticipated accuracy. ► At month-scale, the SARIMA model shows good representation for monthly MSW generation. ► At medium-term time scale, grey relational analysis could yield the MSW generation. ► At long-term time scale, GM (1, 1) provides a basic scenario of MSW generation. - Abstract: Accurate forecasting of municipal solid waste (MSW) generation is crucial and fundamental for the planning, operation and optimization of any MSW management system. Comprehensive information on waste generation for month-scale, medium-term and long-term time scales is especially needed, considering the necessity of MSW management upgrade facing many developing countries. Several existing models are available but of little use in forecasting MSW generation at multiple time scales. The goal of this study is to propose a hybrid model that combines the seasonal autoregressive integrated moving average (SARIMA) model and grey system theory to forecast MSW generation at multiple time scales without needing to consider other variables such as demographics and socioeconomic factors. To demonstrate its applicability, a case study of Xiamen City, China was performed. Results show that the model is robust enough to fit and forecast seasonal and annual dynamics of MSW generation at month-scale, medium- and long-term time scales with the desired accuracy. In the month-scale, MSW generation in Xiamen City will peak at 132.2 thousand tonnes in July 2015 – 1.5 times the volume in July 2010. In the medium term, annual MSW generation will increase to 1518.1 thousand tonnes by 2015 at an average growth rate of 10%. In the long term, a large volume of MSW will be output annually and will increase to 2486.3 thousand tonnes by 2020 – 2.5 times the value for 2010. The hybrid model proposed in this paper can enable decision makers to
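
    The grey-system component of such a hybrid can be sketched with the classic GM(1,1) model, which fits an exponential trend to the cumulated series. The history values below are hypothetical and this is a textbook formulation, not the paper's calibrated model:

```python
import math

def gm11_forecast(x0, steps):
    """GM(1,1) grey-model sketch: cumulate the series (1-AGO), fit the grey
    equation x0(k) + a*z(k) = b by least squares on the background values z,
    then forecast and difference back to the original scale."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # cumulated series
    z = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]   # background values
    m, sz, szz = n - 1, sum(z), sum(v * v for v in z)
    sy = sum(x0[1:])
    szy = sum(v * y for v, y in zip(z, x0[1:]))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det                          # development coefficient
    b = (szz * sy - sz * szy) / det                        # grey input
    x1_hat = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

# Hypothetical annual MSW tonnages (thousand tonnes), forecast two years ahead:
history = [980.0, 1075.0, 1190.0, 1306.0, 1437.0]
print(gm11_forecast(history, steps=2))
```

    In the paper's hybrid, a model of this family supplies the medium- and long-term trend while SARIMA captures the monthly seasonality.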

  10. Automatic Generation of Cycle-Approximate TLMs with Timed RTOS Model Support

    Science.gov (United States)

    Hwang, Yonghyun; Schirner, Gunar; Abdi, Samar

    This paper presents a technique for automatically generating cycle-approximate transaction level models (TLMs) for multi-process applications mapped to embedded platforms. It incorporates three key features: (a) basic block level timing annotation, (b) RTOS model integration, and (c) RTOS overhead delay modeling. The inputs to TLM generation are application C processes and their mapping to processors in the platform. A processor data model, including pipelined datapath, memory hierarchy and branch delay model, is used to estimate basic block execution delays. The delays are annotated to the C code, which is then integrated with a generated SystemC RTOS model. Our abstract RTOS provides dynamic scheduling and inter-process communication (IPC) with processor- and RTOS-specific pre-characterized timing. Our experiments using an MP3 decoder and a JPEG encoder show that timed TLMs, with integrated RTOS models, can be automatically generated in less than a minute. Our generated TLMs simulated three times faster than real-time and showed less than 10% timing error compared to board measurements.

  11. The Drift Burst Hypothesis

    OpenAIRE

    Christensen, Kim; Oomen, Roel; Renò, Roberto

    2016-01-01

    The Drift Burst Hypothesis postulates the existence of short-lived locally explosive trends in the price paths of financial assets. The recent US equity and Treasury flash crashes can be viewed as two high profile manifestations of such dynamics, but we argue that drift bursts of varying magnitude are an expected and regular occurrence in financial markets that can arise through established mechanisms such as feedback trading. At a theoretical level, we show how to build drift bursts into the...

  12. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed that is the least favorable to H0.
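
    The "least favorable to H0" idea can be illustrated numerically in a simple normal-mean setup (chosen here for concreteness; not necessarily the slides' exact model): minimizing the Bayes factor BF01 over the prior variance τ² gives the strongest case against H0.

```python
import numpy as np

def bf01(x, sigma, tau2):
    """Bayes factor BF01 for H0: theta = 0 vs H1: theta ~ N(0, tau2),
    given a single observation x ~ N(theta, sigma^2)."""
    s2 = sigma**2 + tau2                       # marginal variance under H1
    log_bf = 0.5 * np.log(s2 / sigma**2) - x**2 / (2 * sigma**2) + x**2 / (2 * s2)
    return np.exp(log_bf)

# The tau2 minimizing BF01 yields the bound least favorable to H0.
x, sigma = 2.0, 1.0
tau2_grid = np.linspace(1e-6, 50.0, 200001)
bfs = bf01(x, sigma, tau2_grid)
i = np.argmin(bfs)
print(tau2_grid[i], bfs[i])   # minimum near tau2 = x^2 - sigma^2 = 3, BF ≈ 0.446
```

    Analytically, the minimum sits where σ² + τ² = x², giving min BF01 = (|x|/σ)·exp((1 − x²/σ²)/2); the grid search simply confirms this.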

  13. The interactive brain hypothesis.

    Science.gov (United States)

    Di Paolo, Ezequiel; De Jaegher, Hanne

    2012-01-01

    Enactive approaches foreground the role of interpersonal interaction in explanations of social understanding. This motivates, in combination with a recent interest in neuroscientific studies involving actual interactions, the question of how interactive processes relate to neural mechanisms involved in social understanding. We introduce the Interactive Brain Hypothesis (IBH) in order to help map the spectrum of possible relations between social interaction and neural processes. The hypothesis states that interactive experience and skills play enabling roles in both the development and current function of social brain mechanisms, even in cases where social understanding happens in the absence of immediate interaction. We examine the plausibility of this hypothesis against developmental and neurobiological evidence and contrast it with the widespread assumption that mindreading is crucial to all social cognition. We describe the elements of social interaction that bear most directly on this hypothesis and discuss the empirical possibilities open to social neuroscience. We propose that the link between coordination dynamics and social understanding can be best grasped by studying transitions between states of coordination. These transitions form part of the self-organization of interaction processes that characterize the dynamics of social engagement. The patterns and synergies of this self-organization help explain how individuals understand each other. Various possibilities for role-taking emerge during interaction, determining a spectrum of participation. This view contrasts sharply with the observational stance that has guided research in social neuroscience until recently. We also introduce the concept of readiness to interact to describe the practices and dispositions that are summoned in situations of social significance (even if not interactive). This latter idea links interactive factors to more classical observational scenarios.

  14. The interactive brain hypothesis

    Directory of Open Access Journals (Sweden)

    Ezequiel Alejandro Di Paolo

    2012-06-01

    Full Text Available Enactive approaches foreground the role of interpersonal interaction in explanations of social understanding. This motivates, in combination with a recent interest in neuroscientific studies involving actual interactions, the question of how interactive processes relate to neural mechanisms involved in social understanding. We introduce the Interactive Brain Hypothesis in order to help map the possible relations between interaction and neural processes. The hypothesis states that interactive experience and skills play enabling roles in both the development and current function of social brain mechanisms, even in cases where social understanding happens in the absence of immediate interaction. We examine the plausibility of this hypothesis against developmental and neurobiological evidence and contrast it with the widespread assumption that mindreading is crucial to all social cognition. We describe the elements of social interaction that bear most directly on this hypothesis and discuss the empirical possibilities open to social neuroscience. We propose that the link between coordination dynamics and social understanding can be best grasped by studying transitions between states of coordination. These transitions form part of the self-organisation of interaction processes that characterise the dynamics of social engagement. The patterns and synergies of this self-organisation help explain how individuals understand each other. Various possibilities for role-taking emerge during interaction, determining a spectrum of participation. This view contrasts sharply with the observational stance that has guided research in social neuroscience until recently. We also introduce the concept of readiness to interact to describe the developed practices and dispositions that are summoned in situations of social significance (even if not interactive). This latter idea could link interactive factors to more classical observational scenarios.

  15. Mastery Learning and the Decreasing Variability Hypothesis.

    Science.gov (United States)

    Livingston, Jennifer A.; Gentile, J. Ronald

    1996-01-01

    This report presents results from studies that tested two variations of Bloom's decreasing variability hypothesis, using performance on successive units of achievement in four graduate classrooms that used mastery learning procedures. The data do not support the decreasing variability hypothesis; rather, they show no change over time. (SM)

  16. Short irradiation time characteristics of the inverter type X-ray generator

    International Nuclear Information System (INIS)

    Miyazaki, Shigeru; Hara, Takamitu; Matutani, Kazuo; Saito, Kazuhiko.

    1994-01-01

    The linearity of the X-ray output is an important factor in radiography. It is a composite of the linearities of the X-ray tube voltage, the X-ray tube current, and the exposure time. This paper focuses on the linearity of exposure time. Non-linearity of the X-ray output for short-time exposure became a problem when the three-phase X-ray generator was introduced. This paper describes the inverter-type X-ray generator, which is expected to become predominant in the future. Previously, we investigated X-ray output linearity during short-time exposure using the technique of dynamic study. In this paper, we describe the application of a digital memory and a personal computer to further investigation. The non-linearity of the X-ray output was caused by irregular waveforms of the X-ray tube voltage found at the rise time and the fall time. When the rise time was about 0.6 ms, the non-linearity was about 2%, which is negligibly small. The non-linearity due to the fall time of the X-ray tube varied greatly according to the X-ray tube current. For the minimum irradiation time of 1 ms, 4% to 27% of the non-linearity was attributable to the fall time. The main cause was the stray capacitance of the X-ray high-voltage cables. When the X-ray tube current exceeded 400 mA, the rise time was almost equal to the fall time, and the problem did not occur. Consequently, the ideal generator should have a fall time which is equal to the rise time of the X-ray tube voltage. Strictly speaking, such a generator should have rectangular waveforms. (author)

  17. Hypothesis in research

    Directory of Open Access Journals (Sweden)

    Eudaldo Enrique Espinoza Freire

    2018-01-01

    Full Text Available This work aims to provide material covering the fundamental content that enables university professors to formulate hypotheses for the development of an investigation, taking into account the problem to be solved. It was elaborated through a search of primary documents, such as degree theses and research reports, selected on the basis of their relevance to the analyzed subject, currency and reliability, and of secondary documents, such as scientific articles published in journals of recognized prestige, selected with the same criteria as the primary documents. It presents an updated conceptualization of the hypothesis, its characterization and an analysis of the structure of the hypothesis, in which the determination of the variables is examined in depth. The involvement of the university professor in the teaching-research process currently faces some difficulties, manifested, among other aspects, in an unstable balance between teaching and research, which leads to a separation between them.

  18. Calculation model for 16N transit time in the secondary side of steam generators

    International Nuclear Information System (INIS)

    Liu Songyu; Xu Jijun; Xu Ming

    1998-01-01

    The 16N transit time is essential for determining the leak rate of steam generator tube leaks with a 16N monitoring system, which is a new technique. A model was developed for calculating the 16N transit time in the secondary side of steam generators. According to the flow characteristics of the secondary-side fluid, the transit time is divided into four sectors, from the tube sheet to the sensor on the steam line. The model assumes that 16N moves with the vapor phase in the secondary side, so the model for the vapor velocity distribution in the tube bundle is presented in detail. The 16N transit times calculated with this model are compared with those of EDF for the steam generators of the Qinshan NPP

  19. A coupled weather generator - rainfall-runoff approach on hourly time steps for flood risk analysis

    Science.gov (United States)

    Winter, Benjamin; Schneeberger, Klaus; Dung Nguyen, Viet; Vorogushyn, Sergiy; Huttenlau, Matthias; Merz, Bruno; Stötter, Johann

    2017-04-01

    The evaluation of potential monetary damage of flooding is an essential part of flood risk management. One possibility to estimate the monetary risk is to analyze long time series of observed flood events and their corresponding damages. In reality, however, only few flood events are documented. This limitation can be overcome by the generation of a set of synthetic, physically and spatially plausible flood events and, subsequently, the estimation of the resulting monetary damages. In the present work, a set of synthetic flood events is generated by a continuous rainfall-runoff simulation in combination with a coupled weather generator and a temporal disaggregation procedure for the study area of Vorarlberg (Austria). Most flood risk studies focus on daily time steps; however, the mesoscale alpine study area is characterized by short concentration times, leading to large differences between daily mean and daily maximum discharge. Accordingly, an hourly time step is needed for the simulations. The hourly meteorological input for the rainfall-runoff model is generated in a two-step approach: a synthetic daily dataset is generated by a multivariate and multisite weather generator and subsequently disaggregated to hourly time steps with a k-Nearest-Neighbor model. Following the event generation procedure, the negative consequences of flooding are analyzed. The corresponding flood damage for each synthetic event is estimated by combining the synthetic discharge at representative points of the river network with a loss-probability relation for each community in the study area. The loss-probability relation is based on exposure and susceptibility analyses on a single-object basis (residential buildings) for certain return periods. For these impact analyses, official inundation maps of the study area are used. Finally, by analyzing the total event time series of damages, the expected annual damage or losses associated with a certain probability of occurrence can be estimated for
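
    The k-Nearest-Neighbor disaggregation step described above can be sketched as follows. This is a deliberately simplified single-site version on random placeholder data (real applications match on several daily predictors and season, not on totals alone):

```python
import numpy as np

rng = np.random.default_rng(0)

# Historical record: hourly rainfall for 200 observed days (placeholder data).
hist_hourly = rng.gamma(0.3, 1.0, size=(200, 24))
hist_daily = hist_hourly.sum(axis=1)

def knn_disaggregate(daily_value, k=5):
    """Disaggregate a synthetic daily total to hours: pick one of the k
    observed days with the closest daily totals and rescale its profile
    so the hourly values sum back to the daily total."""
    idx = np.argsort(np.abs(hist_daily - daily_value))[:k]
    chosen = rng.choice(idx)
    profile = hist_hourly[chosen]
    return daily_value * profile / profile.sum()

hourly = knn_disaggregate(12.5)
print(hourly.sum())   # mass is conserved: the 24 hourly values sum to 12.5
```

    Because the hourly shape is borrowed from an observed day, the disaggregated series retains realistic sub-daily structure, which is what the short concentration times of the alpine catchments require.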

  20. Calculation of the importance-weighted neutron generation time using MCNIC method

    International Nuclear Information System (INIS)

    Feghhi, S.A.H.; Shahriari, M.; Afarideh, H.

    2008-01-01

    In advanced nuclear power systems, such as ADS, the need for reliable kinetics parameters is of considerable importance because of the lower value of βeff due to the large amount of transuranic elements loaded in the cores of those systems. All reactor kinetic parameters are weighted quantities; in other words, each neutron with a given position and energy is weighted with its importance. Neutron generation time, as an important kinetic parameter, has a significant role in the analysis of fast transients in all nuclear power systems. The difference between the non-weighted neutron generation time Λ, standard in most Monte Carlo codes, and the weighted one, Λ+, can be quite significant depending on the type of the system. In previous work, based on the physical concept of neutron importance, a new method, MCNIC, using the MCNP code was introduced for the calculation of neutron importance in fissionable assemblies for all criticality states. In the present work, the applicability of the MCNIC method has been extended to the calculation of the importance-weighted neutron generation time. The influence of reflector thickness on the importance-weighted neutron generation time has been investigated through the development of an auxiliary code, IWLA, for a hypothetical assembly. The results of these calculations were compared with the non-weighted neutron generation times calculated using the Monte Carlo code MCNP. The difference between the importance-weighted and non-weighted quantity is more significant in a reflected system and increases with reflector thickness

  1. One-Time Password Generation and Two-Factor Authentication Using Molecules and Light.

    Science.gov (United States)

    Naren, Gaowa; Li, Shiming; Andréasson, Joakim

    2017-07-05

    Herein, we report the first example of one-time password (OTP) generation and two-factor authentication (2FA) using a molecular approach. OTPs are passwords that are valid for one entry only. For the next login session, a new, different password is generated. This brings the advantage that any undesired recording of a password will not risk the security of the authentication process. Our molecular realization of the OTP generator is based on a photochromic molecular triad where the optical input required to set the triad to the fluorescent form differs depending on the initial isomeric state. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
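
    For orientation, the conventional electronic counterpart of such an OTP generator is the counter-based HOTP scheme of RFC 4226 (given here as a standard point of reference, not the authors' molecular mechanism): each counter value yields a fresh password, so an intercepted code is useless for the next login.

```python
import hmac, hashlib, struct

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"                 # RFC 4226 test secret
print(hotp(secret, 0), hotp(secret, 1))          # 755224 287082 (RFC test vectors)
```

    The molecular triad plays the role of the counter state here: its isomeric history determines which optical input is the valid "password" for the next authentication.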

  2. A new model for describing remission times: the generalized beta-generated Lindley distribution

    Directory of Open Access Journals (Sweden)

    MARIA DO CARMO S. LIMA

    Full Text Available New generators are required to define wider distributions for modeling real data in survival analysis. To that end we introduce the four-parameter generalized beta-generated Lindley distribution. It has explicit expressions for the ordinary and incomplete moments, mean deviations, generating and quantile functions. We propose a maximum likelihood procedure to estimate the model parameters, which is assessed through a Monte Carlo simulation study. We also derive an additional estimation scheme by means of least squares between percentiles. The usefulness of the proposed distribution for describing remission times of cancer patients is illustrated by means of an application to real data.

  3. Does the timing of estrogen administration after castration affect its ability to preserve sexual interest in male rats?--exploring the critical period hypothesis.

    Science.gov (United States)

    Wibowo, Erik; Wassersug, Richard J

    2013-02-17

    Loss of libido is a major side effect that reduces the quality of life of prostate cancer patients on androgen-deprivation therapy. Estrogen restores sexual interest to some extent in castrated male mammals; however, the beneficial effects of estrogen vary greatly among different studies. We investigated whether the timing of estrogen treatment after castration affected its ability to restore sexual interest in male rats. For each rat, sexual behavior was tested with receptive female rats before castration, and after 2 weeks of either oil alone (as a control) or oil plus estradiol (E2) treatment administered via Silastic tubes implanted immediately, at 1 month (Short-Term), or at 3 months (Long-Term) after castration. Intromission frequency decreased and genital sniffing frequency increased significantly after castration compared to pre-castration levels, regardless of the testing time post-castration. E2 treatment at any time point did not reverse these changes. However, more E2-treated than control rats exhibited mounting behavior, with a significant difference between the Long-Term groups. Mounting frequency did not differ from pre-castration levels for either E2 or control rats under the Immediate condition, but declined significantly in rats treated with oil only under both the Short- and Long-Term conditions. In contrast, E2 treatment elevated mounting frequency above the castrate levels to a similar extent in both the Short and Long-term groups. In conclusion, E2 administration partially restores sexual interest in castrated male rats, and the length of post-castration delay in E2 administration does not affect the ability of E2 to restore mounting behavior. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. Antibody producing B lineage cells invade the central nervous system predominantly at the time of and triggered by acute Epstein-Barr virus infection: A hypothesis on the origin of intrathecal immunoglobulin synthesis in multiple sclerosis.

    Science.gov (United States)

    Otto, Carolin; Hofmann, Jörg; Ruprecht, Klemens

    2016-06-01

    Patients with multiple sclerosis (MS), a chronic inflammatory disease of the central nervous system (CNS), typically have an intrathecal synthesis of immunoglobulin (Ig)G. Intrathecal IgG is produced by B lineage cells that entered the CNS, but why and when these cells invade the CNS of patients with MS is unknown. The intrathecal IgG response in patients with MS is polyspecific and part of it is directed against different common viruses (e.g. measles virus, rubella virus, varicella zoster virus). Strong and consistent evidence suggests an association of MS and Epstein-Barr virus (EBV) infection and EBV seroprevalence in patients with MS is practically 100%. However, intriguingly, despite of the universal EBV seroprevalence, the frequency of intrathecally produced IgG to EBV in patients with MS is much lower than that of intrathecally produced IgG to other common viruses. The acute phase of primary EBV infection is characterized by a strong polyclonal B cell activation. As typical for humoral immune responses against viruses, EBV specific IgG is produced only with a temporal delay after acute EBV infection. Aiming to put the above facts into a logical structure, we here propose the hypothesis that in individuals going on to develop MS antibody producing B lineage cells invade the CNS predominantly at the time of and triggered by acute primary EBV infection. Because at the time of acute EBV infection EBV IgG producing B lineage cells have not yet occurred, the hypothesis could explain the universal EBV seroprevalence and the low frequency of intrathecally produced IgG to EBV in patients with MS. Evidence supporting the hypothesis could be provided by large prospective follow-up studies of individuals with symptomatic primary EBV infection (infectious mononucleosis). Furthermore, the clarification of the molecular mechanism underlying an EBV induced invasion of B lineage cells into the CNS of individuals going on to develop MS could corroborate it, too. 
If true, our

  5. The timing hypothesis and hormone replacement therapy: a paradigm shift in the primary prevention of coronary heart disease in women. Part 2: comparative risks.

    Science.gov (United States)

    Hodis, Howard N; Mack, Wendy J

    2013-06-01

    A major misperception concerning postmenopausal hormone replacement therapy (HRT) is that the associated risks are large in magnitude and unique to HRT, but over the past 10 years sufficient data have accumulated that the magnitude and perspective of risks associated with the primary coronary heart disease prevention therapies of statins, aspirin, and postmenopausal HRT have become more fully defined. Review of randomized controlled trials indicates that the risks of primary prevention therapies and other medications commonly used in women's health are of similar type and magnitude, with the majority of these risks categorized as rare to infrequent. The risks of postmenopausal HRT are predominantly rare; risks including breast cancer, stroke, and venous thromboembolism are common across medications and are rare, and even rarer when HRT is initiated in women younger than 60 or who are less than 10 years since menopause. In Part 1 of this series, the sex-specificity of statins and aspirin and the timing of initiation of HRT as modifiers of efficacy in women were reviewed. Herein, the comparative risks of primary prevention therapies in women are discussed. © 2013, Copyright the Authors Journal compilation © 2013, The American Geriatrics Society.

  6. A Molecular–Structure Hypothesis

    Directory of Open Access Journals (Sweden)

    Jan C. A. Boeyens

    2010-11-01

    Full Text Available The self-similar symmetry that occurs between atomic nuclei, biological growth structures, the solar system, globular clusters and spiral galaxies suggests that a similar pattern should characterize atomic and molecular structures. This possibility is explored in terms of the current molecular-structure hypothesis and its extension into four-dimensional space-time. It is concluded that a quantum molecule only has structure in four dimensions and that classical (Newtonian) structure, which occurs in three dimensions, cannot be simulated by quantum-chemical computation.

  7. Refinements of the column generation process for the Vehicle Routing Problem with Time Windows

    DEFF Research Database (Denmark)

    Larsen, Jesper

    2004-01-01

    interval denoted the time window. The objective is to determine routes for the vehicles that minimize the accumulated cost (or distance) with respect to the above-mentioned constraints. Currently the best approaches for determining optimal solutions are based on column generation and Branch-and-Bound, also known as Branch-and-Price. This paper presents two ideas for run-time improvements of the Branch-and-Price framework for the Vehicle Routing Problem with Time Windows. Both ideas reveal a significant potential for using run-time refinements when speeding up an exact approach without compromising…

  8. Cognitive Vulnerabilities Amplify the Effect of Early Pubertal Timing on Interpersonal Stress Generation During Adolescence

    Science.gov (United States)

    Stange, Jonathan P.; Kleiman, Evan M.; Hamlat, Elissa J.; Abramson, Lyn Y.; Alloy, Lauren B.

    2013-01-01

    Early pubertal timing has been found to confer risk for the occurrence of interpersonal stressful events during adolescence. However, pre-existing vulnerabilities may exacerbate the effects of early pubertal timing on the occurrence of stressors. Thus, the current study prospectively examined whether cognitive vulnerabilities amplified the effects of early pubertal timing on interpersonal stress generation. In a diverse sample of 310 adolescents (M age = 12.83 years, 55 % female; 53 % African American), early pubertal timing predicted higher levels of interpersonal dependent events among adolescents with more negative cognitive style and rumination, but not among adolescents with lower levels of these cognitive vulnerabilities. These findings suggest that cognitive vulnerabilities may heighten the risk of generating interpersonal stress for adolescents who undergo early pubertal maturation, which may subsequently place adolescents at greater risk for the development of psychopathology. PMID:24061858

  9. Generation and evaluation of space-Time trajectories of photovoltaic power

    DEFF Research Database (Denmark)

    Golestaneh, Faranak; Gooi, Hoay Beng; Pinson, Pierre

    2016-01-01

    In the probabilistic energy forecasting literature, emphasis is mainly placed on deriving marginal predictive densities for which each random variable is dealt with individually. Such a marginal description is sufficient for power systems related operational problems if and only if optimal decisions are to be made for each lead-time and each location independently of each other. However, many of these operational processes are temporally and spatially coupled, while uncertainty in photovoltaic (PV) generation is strongly dependent in time and in space. This issue is addressed here by analysing and capturing … on performance assessment of space-time trajectories of PV generation is also studied. Finally, the advantage of taking into account space-time correlations over probabilistic and point forecasts is investigated. The empirical investigation is based on the solar PV dataset of the Global Energy Forecasting …
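
    The core idea of sampling space-time coupled trajectories rather than independent marginals can be sketched with a separable Gaussian correlation model (all parameter values below are illustrative assumptions, not the paper's fitted model):

```python
import numpy as np

rng = np.random.default_rng(3)

# Generate trajectories (scenarios) of PV forecast errors with a separable
# correlation structure: exponential decay over lead time and over site index.
H, S = 24, 3                                       # lead times, sites
lt = np.arange(H); sites = np.arange(S)
C_t = np.exp(-np.abs(lt[:, None] - lt[None, :]) / 6.0)        # temporal corr
C_s = np.exp(-np.abs(sites[:, None] - sites[None, :]) / 2.0)  # spatial corr
C = np.kron(C_s, C_t)                              # separable space-time corr
L = np.linalg.cholesky(C)

n_traj = 5000
z = rng.standard_normal((n_traj, S * H)) @ L.T     # correlated trajectories
corr = np.corrcoef(z[:, 0], z[:, 1])[0, 1]         # site 0, lead times 0 and 1
print(round(corr, 2))                              # close to exp(-1/6) ≈ 0.85
```

    In a full Gaussian-copula setup, each correlated normal margin would then be mapped through the inverse predictive CDF of the corresponding marginal density, so the trajectories honour both the marginal forecasts and the space-time dependence.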

  10. Power in the loop real time simulation platform for renewable energy generation

    Science.gov (United States)

    Li, Yang; Shi, Wenhui; Zhang, Xing; He, Guoqing

    2018-02-01

    Nowadays, a large amount of renewable energy generation is being connected to the power system, and real-time simulation platforms are widely used to carry out research on integration control algorithms, power system stability, etc. Compared to traditional pure digital simulation and hardware-in-the-loop simulation, power-in-the-loop simulation has higher accuracy and reliability. In this paper, a power-in-the-loop analog-digital hybrid simulation platform has been built; it can be used not only for a single generation unit connecting to the grid, but also for multiple new energy generation units connecting to the grid. A wind generator inertia control experiment was carried out on the platform. The structure of the inertia control platform was studied, and the results verify that the platform meets the needs of renewable power-in-the-loop real-time simulation.

  11. Determination of Permissible Short-Time Emergency Overloading of Turbo-Generators and Synchronous Compensators

    Directory of Open Access Journals (Sweden)

    V. A. Anischenko

    2011-01-01

    Full Text Available The paper shows that failure to take into account the variable ratio of short-time emergency overloading of turbo-generators (synchronous compensators) can lead to underestimation of overloading capacity or impermissible insulation overheating. A method has been developed for determining the permissible duration of short-time emergency overloading that takes into account changes of the overloading ratio in case of a failure.
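
    For orientation, a common first-order thermal model behind such permissible-duration calculations can be sketched as follows. This is a textbook constant-overload relation, not the paper's method (which additionally accounts for a varying overload ratio); all numerical values are illustrative:

```python
import math

def permissible_overload_time(K, K0, K_lim, tau_min):
    """Time for the winding temperature to reach its limit under a first-order
    thermal model: overload ratio K = I/I_rated, preload K0, continuously
    permissible ratio K_lim, thermal time constant tau_min (minutes).
    t = tau * ln((K^2 - K0^2) / (K^2 - K_lim^2))."""
    if K <= K_lim:
        return math.inf            # the limit is never reached: sustainable
    return tau_min * math.log((K**2 - K0**2) / (K**2 - K_lim**2))

# e.g. a 1.5x overload from a 0.8x preload, 1.05x continuous capability,
# 20-minute thermal time constant (all values assumed for illustration):
print(round(permissible_overload_time(1.5, 0.8, 1.05, 20.0), 1))  # minutes
```

    The paper's point is precisely that treating K as constant, as this simple relation does, can misestimate the true overload capability when the ratio changes during the emergency.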

  12. Computer-generated versus nurse-determined strategy for incubator humidity and time to regain birthweight

    NARCIS (Netherlands)

    Helder, Onno K.; Mulder, Paul G. H.; van Goudoever, Johannes B.

    2008-01-01

    To compare effects on premature infants' weight gain of a computer-generated and a nurse-determined incubator humidity strategy. An optimal humidity protocol is thought to reduce time to regain birthweight. Prospective randomized controlled design. Level IIIC neonatal intensive care unit in the

  13. Towards provably correct code generation for a hard real-time programming language

    DEFF Research Database (Denmark)

    Fränzle, Martin; Müller-Olm, Markus

    1994-01-01

    This paper sketches a hard real-time programming language featuring operators for expressing timeliness requirements in an abstract, implementation-independent way and presents parts of the design and verification of a provably correct code generator for that language. The notion of implementation...

  14. Local inertial oscillations in the surface ocean generated by time-varying winds

    Science.gov (United States)

    Chen, Shengli; Polton, Jeff A.; Hu, Jianyu; Xing, Jiuxing

    2015-12-01

    A new relationship is presented in a review study of the evolution of inertial oscillations in the surface ocean locally generated by time-varying wind stress. The inertial oscillation is expressed as the superposition of a previous oscillation and a newly generated oscillation, which depends upon the time-varying wind stress. This relationship is employed to investigate some idealized wind change events. For a wind series varying temporally at different rates, the induced inertial oscillation is dominated by the wind with the greatest variation. A resonant wind, which rotates anti-cyclonically at the local inertial frequency with time, produces the maximal amplitude of inertial oscillations, which grows monotonically. For wind rotating at non-inertial frequencies, the responses vary periodically, with wind injecting inertial energy when it is in phase with the currents, but removing inertial energy when it is out of phase. Wind rotating anti-cyclonically with time is much more favorable for generating inertial oscillations than cyclonically rotating wind. Wind with a frequency closer to the inertial frequency generates stronger inertial oscillations. For a diurnal wind, the induced inertial oscillation depends on latitude and is most significant at 30°. This relationship is also applied to examine idealized moving cyclones. The inertial oscillation is much stronger on the right-hand side of the cyclone path than on the left-hand side (in the northern hemisphere). This is because the wind rotates anti-cyclonically with time on the right-hand side, but cyclonically on the other side. The inertial oscillation varies with the cyclone translation speed. The optimal translation speed generating the greatest inertial oscillations is 2 m/s at a latitude of 10° and gradually increases to 6 m/s at a latitude of 30°.
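
    The resonance asymmetry described above follows from the standard slab mixed-layer balance du/dt + if·u = τ/(ρH), with complex velocity u = u_x + i·u_y. A direct numerical integration reproduces it (the layer depth, stress magnitude and Coriolis parameter below are illustrative, not the paper's values):

```python
import numpy as np

# Slab mixed-layer model: du/dt + i*f*u = tau(t) / (rho * H).
# A wind stress rotating anti-cyclonically at the inertial frequency
# (tau ~ exp(-i f t)) is resonant and grows the oscillation monotonically;
# the cyclonically rotating wind at the same frequency stays bounded.
f = 1.0e-4             # Coriolis parameter, s^-1 (mid-latitude, assumed)
rho, H = 1025.0, 30.0  # water density (kg/m^3), mixed-layer depth (m)
T0 = 0.1               # wind stress magnitude, N/m^2
dt = 60.0              # time step, s
n = int(5 * 2 * np.pi / f / dt)   # integrate over 5 inertial periods

t = np.arange(n) * dt
u_res, u_cyc = 0j, 0j
amp_res, amp_cyc = [], []
for k in range(n):
    tau_res = T0 * np.exp(-1j * f * t[k])   # resonant (anti-cyclonic) wind
    tau_cyc = T0 * np.exp(+1j * f * t[k])   # cyclonic wind, same frequency
    u_res += dt * (tau_res / (rho * H) - 1j * f * u_res)
    u_cyc += dt * (tau_cyc / (rho * H) - 1j * f * u_cyc)
    amp_res.append(abs(u_res)); amp_cyc.append(abs(u_cyc))

print(amp_res[-1], max(amp_cyc))  # resonant forcing: far larger amplitude
```

    The resonant amplitude grows linearly as (T0/ρH)·t, while the cyclonic case merely beats at twice the inertial frequency, which is the asymmetry behind the stronger response on the right-hand side of a cyclone path.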

  15. Isothermal calorimeter for measurements of time-dependent heat generation rate in individual supercapacitor electrodes

    Science.gov (United States)

    Munteshari, Obaidallah; Lau, Jonathan; Krishnan, Atindra; Dunn, Bruce; Pilon, Laurent

    2018-01-01

    Heat generation in electric double layer capacitors (EDLCs) may lead to temperature rise and reduce their lifetime and performance. This study aims to measure the time-dependent heat generation rate in individual carbon electrodes of EDLCs under various charging conditions. First, the design, fabrication, and validation of an isothermal calorimeter are presented. The calorimeter consisted of two thermoelectric heat flux sensors connected to a data acquisition system, two identical cold plates fed with a circulating coolant, and an electrochemical test section connected to a potentiostat/galvanostat system. The EDLC cells consisted of two identical activated carbon electrodes and a separator immersed in an electrolyte. Measurements were performed on three cells with different electrolytes under galvanostatic cycling for different current densities and polarities. The measured time-averaged irreversible heat generation rate was in excellent agreement with predictions for Joule heating. The reversible heat generation rate in the positive electrode was exothermic during charging and endothermic during discharging. By contrast, the negative electrode featured both exothermic and endothermic heat generation during both charging and discharging. The results of this study can be used to validate existing thermal models, to develop thermal management strategies, and to gain insight into physicochemical phenomena taking place during operation.

  16. Time stamp generation with inverse FIR filters for Positron Emission Tomography

    International Nuclear Information System (INIS)

    Namias, Mauro

    2009-01-01

    Photon coincidence detection is the process by which Positron Emission Tomography (PET) works. This requires determining the time of impact of each coincident photon at the detector system, also known as the time stamp. In this work, the time stamp was generated by means of digital time-domain deconvolution with FIR filters for a NaI(Tl)-based system. The detector dead time was reduced from 350 ns to 175 ns while preserving the system's energy resolution, and a direct relation between the amount of light collected and the temporal resolution was found. (author)
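
    The deconvolution idea can be sketched in a few lines. If the detector response to a photon is modeled as a single-pole exponential tail with transfer function 1/(1 - a z^-1) (an assumed toy pulse shape, not the actual NaI(Tl) response), its exact inverse is the 2-tap FIR filter [1, -a], and filtering the output restores sharp arrival impulses even when pulses overlap:

```python
import numpy as np

# Two photon arrivals whose exponential tails overlap in the detector output;
# the inverse FIR [1, -a] deconvolves them back into sharp time stamps.

a = 0.98                       # decay per sample (illustrative)
n = 400
x = np.zeros(n)
x[50], x[220] = 1.0, 0.7       # true arrival times and (arbitrary) amplitudes

# Detector output: y[k] = x[k] + a*y[k-1], i.e. an exponential tail per arrival
y = np.zeros(n)
for k in range(n):
    y[k] = x[k] + (a * y[k - 1] if k > 0 else 0.0)

deconv = np.convolve(y, [1.0, -a])[:n]   # inverse FIR restores the impulses
stamps = np.flatnonzero(deconv > 0.5)    # thresholding yields the time stamps
print(stamps)
```

Because the inverse filter removes the long tail, a second pulse can be resolved much sooner, which is the mechanism behind the dead-time reduction.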

  17. Partial path column generation for the vehicle routing problem with time windows

    DEFF Research Database (Denmark)

    Petersen, Bjørn; Jepsen, Mads Kehlet

    2009-01-01

    This paper presents a column generation algorithm for the Vehicle Routing Problem with Time Windows (VRPTW). Traditionally, column generation models of the VRPTW have consisted of a Set Partitioning master problem with each column representing a route, i.e., a resource feasible path starting...... number of customers. We suggest to relax that ‘each column is a route’ into ‘each column is a part of the giant tour’; a so-called partial path, i.e., not necessarily starting and ending in the depot. This way, the length of the partial path can be bounded and a better control of the size of the solution...

  18. Time-of-arrival analysis applied to ELF/VLF wave generation experiments at HAARP

    Science.gov (United States)

    Moore, R. C.; Fujimaru, S.

    2012-12-01

    Time-of-arrival (TOA) analysis is applied to observations performed during ELF/VLF wave generation experiments at the High-frequency Active Auroral Research Program (HAARP) HF transmitter in Gakona, Alaska. In 2012, a variety of ELF/VLF wave generation techniques were employed to identify the dominant source altitude for each case. Observations were performed for beat-wave modulation, AM modulation, STF modulation, ICD modulation, and cubic frequency modulation, among others. For each of these cases, we identify the dominant ELF/VLF source altitude and compare the experimental results with theoretical HF heating predictions.

  19. Space-time scenarios of wind power generation produced using a Gaussian copula with parametrized precision matrix

    DEFF Research Database (Denmark)

    Tastu, Julija; Pinson, Pierre; Madsen, Henrik

    The emphasis in this work is placed on generating space-time trajectories (also referred to as scenarios) of wind power generation. This calls for prediction of multivariate densities describing wind power generation at a number of distributed locations and for a number of successive lead times. ...... and direction-dependent cross-correlations. Accounting for the space-time effects is shown to be crucial for generating high quality scenarios....
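
    Scenario generation with a Gaussian copula and a parametrized precision matrix can be sketched as follows; the tridiagonal precision, the toy marginal transform, and all numbers are illustrative assumptions rather than the fitted model of the paper:

```python
import numpy as np
from math import erf

# Gaussian-copula scenario sketch: a sparse (tridiagonal) precision matrix Q
# encodes conditional correlation between neighboring lead times; samples are
# transformed to uniforms and then through marginal predictive quantiles.

rng = np.random.default_rng(1)
d = 24                                   # lead times for one site
rho = 0.6                                # neighbor coupling strength

# Tridiagonal precision matrix: a parsimonious space-time parametrization
Q = np.eye(d) * (1 + rho**2) - rho * (np.eye(d, k=1) + np.eye(d, k=-1))
Sigma = np.linalg.inv(Q)                  # implied covariance
L = np.linalg.cholesky(Sigma)

z = L @ rng.standard_normal((d, 100))     # 100 correlated Gaussian scenarios
u = 0.5 * (1 + np.vectorize(erf)(z / np.sqrt(2.0)))   # copula uniforms

# Map uniforms through (toy) marginal predictive quantiles of wind power
power = u ** 2                            # placeholder inverse CDF per margin
print(power.shape)                        # a (24, 100) scenario fan
```

Neighboring lead times in `z` inherit the correlation implied by `Q`, which is what makes the trajectories temporally coherent rather than independent draws per lead time.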

  20. Development of automatic intercomparison system for generation of time scale ensembling several atomic clocks

    Directory of Open Access Journals (Sweden)

    Thorat P.P.

    2015-01-01

    Full Text Available The National Physical Laboratory India (NPLI) has five commercial cesium atomic clocks. Until recently, one of these clocks was used to maintain the coordinated universal time (UTC) of NPLI. To utilize all of these clocks in an ensemble to generate a smoother time scale, it is essential to inter-compare them very precisely. This has been achieved with an automatic measurement system with well-conceived software. Although a few laboratories have developed such automatic measurement systems themselves based on their respective requirements, these systems have not been reported. Keeping in mind the specific requirements of time scale generation, a new system has therefore been developed by NPLI. The design takes into account the associated infrastructure that exists and would be used. The performance of the new system has also been studied and found to be quite satisfactory for the purpose. The system is being utilized for the generation of the time scale of NPLI.
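
    The benefit of ensembling several inter-compared clocks can be illustrated with a toy example: reading each clock against a common reference and forming an inverse-variance weighted mean yields a time scale quieter than any single member. The noise model (independent white noise) and the weights below are illustrative, not the actual algorithm used by timing laboratories.

```python
import numpy as np

# Toy ensemble time scale: five clocks with different white-noise levels,
# combined with normalized inverse-variance weights.

rng = np.random.default_rng(0)
n, clocks = 1000, 5
sigma = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # per-clock noise level (ns)

# Simulated clock readings vs a common reference (true offset taken as zero)
x = sigma[:, None] * rng.standard_normal((clocks, n))

w = 1.0 / sigma**2
w /= w.sum()                                   # inverse-variance weights
ensemble = w @ x                               # ensemble time offset (ns)

# The ensemble is quieter than even the best single clock
print(ensemble.std(), x[0].std())
```

Analytically the ensemble standard deviation is 1/sqrt(sum(1/sigma_i^2)) ≈ 0.83 ns here, below the best clock's 1 ns, which is the motivation for the precise automated inter-comparison.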

  1. Quality of Service for Real-Time Applications Over Next Generation Data Networks

    Science.gov (United States)

    Atiquzzaman, Mohammed; Jain, Raj

    2001-01-01

    This project, which started on January 1, 2000, was funded by the NASA Glenn Research Center for a duration of one year. The deliverables of the project included the following tasks: (1) Study of QoS mapping between the edge and core networks envisioned in Next Generation networks, which will provide us with the QoS guarantees that can be obtained from next generation networks; (2) Buffer management techniques to provide strict guarantees to real-time end-to-end applications through preferential treatment of packets belonging to real-time applications. In particular, the use of ECN to help reduce loss on high bandwidth-delay product satellite networks needs to be studied; (3) Effect of Prioritized Packet Discard to increase the goodput of the network and reduce the buffering requirements in the ATM switches; (4) Provision of new IP circuit emulation services over satellite IP backbones using MPLS will be studied; and (5) Determination of the architecture and requirements for internetworking ATN and the Next Generation Internet for real-time applications. The project has been completed on time. All the objectives and deliverables of the project have been completed. Research results obtained from this project have been published in a number of papers in journals, conferences, and technical reports, included in this document.

  2. Determination of neutron generation time in miniature neutron source reactor by measurement of neutronics transfer function

    International Nuclear Information System (INIS)

    Hainoun, A.; Khamis, I.

    2000-01-01

    The prompt neutron generation time Λ and the total effective fraction of delayed neutrons (including the effect of photoneutrons) β have been experimentally determined for the miniature neutron source reactor (MNSR) of Syria. The neutron generation time was found by measuring the reactor open-loop transfer function using a newly devised reactivity-step-ejection method based on the reactor's pneumatic rabbit system. Small reactivity perturbations, i.e., step changes of reactivity starting from steady state, were introduced into the reactor during operation at low power, i.e., zero power. Relative neutron flux and reactivity versus time were obtained. Using transfer function analysis as well as least-squares fitting techniques and measuring the delayed neutron fraction, the neutron generation time was determined to be 74.6±1.57 μs. Using the prompt jump approximation of the neutron flux, the total effective fraction of delayed neutrons was measured and found to be 0.00783±0.00017. Measured values of Λ and β were found to be very consistent with the calculated ones reported in the safety analysis report. (orig.)
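
    The transfer-function route to Λ can be sketched as follows. For a zero-power reactor, G(s) = 1 / ( s (Λ + Σ_i β_i/(s + λ_i)) ); at frequencies well above the delayed-neutron decay constants, |G(jω)| ≈ 1/(ωΛ), so the high-frequency roll-off of the measured open-loop transfer function pins down Λ. The six-group constants below are generic textbook values, not the MNSR data:

```python
import numpy as np

# Zero-power reactor transfer function and recovery of the generation time
# Lambda from its high-frequency asymptote |G| ~ 1/(omega*Lambda).

beta_i = np.array([2.47e-4, 1.385e-3, 1.222e-3, 2.645e-3, 8.32e-4, 1.69e-4])
lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # 1/s
Lam = 74.6e-6                                                   # generation time (s)

def G(omega):
    s = 1j * omega
    return 1.0 / (s * (Lam + np.sum(beta_i / (s + lam_i))))

# At 10 kHz, far above all lambda_i, the asymptote recovers Lambda
omega_hi = 2 * np.pi * 1e4
Lam_est = 1.0 / (omega_hi * abs(G(omega_hi)))
print(Lam_est)   # close to the assumed 74.6 microseconds
```

In the experiment the same fit is run the other way: Λ is the free parameter adjusted until the model transfer function matches the measured one.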

  3. Next-Generation Library Catalogs and the Problem of Slow Response Time

    Directory of Open Access Journals (Sweden)

    Margaret Brown-Sica

    2010-12-01

    Full Text Available Response time as defined for this study is the time that it takes for all files that constitute a single webpage to travel across the Internet from a Web server to the end user’s browser. In this study, the authors tested response times on queries for identical items in five different library catalogs, one of them a next-generation (NextGen) catalog. The authors also discuss acceptable response time and how it may affect the discovery process. They suggest that librarians and vendors should develop standards for acceptable response time and use them in the product selection and development processes.

  4. Generation of artificial earthquake time histories for seismic design at Hanford, Washington

    International Nuclear Information System (INIS)

    Salmon, M.W.; Kuilanoff, G.

    1991-01-01

    The purpose of the development of artificial time-histories is to provide the designer with ground motion estimates which will meet the requirements of the design guidelines at the Hanford site. In particular, the artificial time histories presented in this paper were prepared to assist designers of the Hanford Waste Vitrification Plant (HWVP) with time histories that envelop the requirements for both a large magnitude earthquake (MI > 6.0) and a small magnitude, near-field earthquake (MI < 5.0). A background of the requirements for both the large magnitude and small magnitude events is presented in this paper. The work done in generating time histories which produce response spectra matching those of the design seismic events is also presented. Finally, some preliminary results from studies performed using the small-magnitude near-field earthquake time-history are presented.

  5. Generation of future high-resolution rainfall time series with a disaggregation model

    Science.gov (United States)

    Müller, Hannes; Haberlandt, Uwe

    2017-04-01

    High-resolution rainfall data are needed in many fields of hydrology and water resources management. For analyses of future rainfall conditions, climate scenarios with hourly rainfall values exist. However, the direct usage of these data is associated with uncertainties, which can be indicated by comparisons of observations and C20 control runs. An alternative is the derivation of changes of rainfall behavior over time from climate simulations. Conclusions about future rainfall conditions can then be drawn by adding these changes to observed time series. A multiplicative cascade model is used in this investigation for the disaggregation of daily rainfall amounts to hourly values. Model parameters can be estimated from REMO rainfall time series (UBA-, BfG- and ENS-realization), based on ECHAM5. Parameter estimation is carried out for the C20 period as well as the near-term and long-term future (2021-2050 and 2071-2100). Change factors for both future periods are derived by parameter comparisons and added to the parameters estimated from observed time series. This enables the generation of hourly rainfall time series from observed daily values with respect to future changes. The investigation is carried out for rain gauges in Lower Saxony. The generated time series are analyzed regarding statistical characteristics, e.g., extreme values, event-based characteristics (wet spell duration and amounts, dry spell duration, …) and continuum characteristics (average intensity, fraction of dry intervals, …). The generation of the time series is validated by comparing the changes in the statistical characteristics from the REMO data and from the disaggregated data.
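
    The disaggregation principle can be sketched with a branching-2 multiplicative cascade: a daily total is split repeatedly into two sub-intervals with random weights (W, 1-W), conserving mass at every level. Three levels take 24 h down to 3 h; the actual model uses further branching to reach hourly values, and its weight probabilities depend on position and rainfall amount (taken uniform here as an assumption).

```python
import numpy as np

# Multiplicative cascade sketch: each interval's amount is split into two
# children with weights (w, 1-w), so the daily total is conserved exactly.

rng = np.random.default_rng(42)

def cascade(amount, levels):
    values = np.array([amount])
    for _ in range(levels):
        w = rng.uniform(0.2, 0.8, size=values.size)   # illustrative weights
        values = np.ravel(np.column_stack((w * values, (1 - w) * values)))
    return values

daily = 20.0                      # observed daily rainfall (mm)
three_hourly = cascade(daily, 3)  # 8 intervals of 3 h each
print(three_hourly.sum())         # total still equals the daily amount
```

Change factors from the climate runs would enter by shifting the weight distribution, which is how future rainfall structure is imposed on observed daily totals.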

  6. Generation of a Tph2 Conditional Knockout Mouse Line for Time- and Tissue-Specific Depletion of Brain Serotonin

    Science.gov (United States)

    Migliarini, Sara; Pacini, Giulia; Pasqualetti, Massimo

    2015-01-01

    Serotonin has been gaining increasing attention during the last two decades due to the dual function of this monoamine as a key regulator during critical developmental events and as a neurotransmitter. Importantly, unbalanced serotonergic levels during critical temporal phases might contribute to the onset of neuropsychiatric disorders, such as schizophrenia and autism. Although increasing evidence from both animal models and human genetic studies has underpinned the importance of serotonin homeostasis maintenance during central nervous system development and adulthood, the precise role of this molecule in time-specific activities is only beginning to be elucidated. Serotonin synthesis is a 2-step process, the first step of which is mediated by the rate-limiting activity of Tph enzymes, belonging to the family of aromatic amino acid hydroxylases and existing in two isoforms, Tph1 and Tph2, responsible for the production of peripheral and brain serotonin, respectively. In the present study, we generated and validated a conditional knockout mouse line, Tph2flox/flox, in which brain serotonin can be effectively ablated with time specificity. We demonstrated that the Cre-mediated excision of the third exon of the Tph2 gene results in the production of a Tph2null allele in which we observed the near-complete loss of brain serotonin, as well as the growth defects and perinatal lethality observed in serotonin conventional knockouts. We also revealed that in mice harbouring the Tph2null allele, but not in wild-types, two distinct Tph2 mRNA isoforms are present, namely Tph2Δ3 and Tph2Δ3Δ4, with the latter showing an in-frame deletion of amino acids 84–178 and coding a protein that could potentially retain non-negligible enzymatic activity. As we could not detect Tph1 expression in the raphe, we made the hypothesis that the Tph2Δ3Δ4 isoform can be at the origin of the residual, sub-threshold amount of serotonin detected in the brain of Tph2null/null mice. Finally, we set up

  7. Generation of a Tph2 Conditional Knockout Mouse Line for Time- and Tissue-Specific Depletion of Brain Serotonin.

    Directory of Open Access Journals (Sweden)

    Barbara Pelosi

    Full Text Available Serotonin has been gaining increasing attention during the last two decades due to the dual function of this monoamine as a key regulator during critical developmental events and as a neurotransmitter. Importantly, unbalanced serotonergic levels during critical temporal phases might contribute to the onset of neuropsychiatric disorders, such as schizophrenia and autism. Although increasing evidence from both animal models and human genetic studies has underpinned the importance of serotonin homeostasis maintenance during central nervous system development and adulthood, the precise role of this molecule in time-specific activities is only beginning to be elucidated. Serotonin synthesis is a 2-step process, the first step of which is mediated by the rate-limiting activity of Tph enzymes, belonging to the family of aromatic amino acid hydroxylases and existing in two isoforms, Tph1 and Tph2, responsible for the production of peripheral and brain serotonin, respectively. In the present study, we generated and validated a conditional knockout mouse line, Tph2flox/flox, in which brain serotonin can be effectively ablated with time specificity. We demonstrated that the Cre-mediated excision of the third exon of the Tph2 gene results in the production of a Tph2null allele in which we observed the near-complete loss of brain serotonin, as well as the growth defects and perinatal lethality observed in serotonin conventional knockouts. We also revealed that in mice harbouring the Tph2null allele, but not in wild-types, two distinct Tph2 mRNA isoforms are present, namely Tph2Δ3 and Tph2Δ3Δ4, with the latter showing an in-frame deletion of amino acids 84-178 and coding a protein that could potentially retain non-negligible enzymatic activity. As we could not detect Tph1 expression in the raphe, we made the hypothesis that the Tph2Δ3Δ4 isoform can be at the origin of the residual, sub-threshold amount of serotonin detected in the brain of Tph2null/null mice

  8. Efficient generation of random multipartite entangled states using time-optimal unitary operations

    Science.gov (United States)

    Borras, A.; Majtey, A. P.; Casas, M.

    2008-08-01

    We review the generation of random pure states using a protocol of repeated two-qubit gates. We study the convergence towards states with the Haar multipartite entanglement distribution. We investigate the optimal generation of such states in terms of the physical (real) time needed to apply the protocol, instead of the gate-complexity point of view used in other works. This physical time can be obtained, for a given Hamiltonian, within the theoretical framework offered by the quantum brachistochrone formalism, the quantum analogue of the brachistochrone problem in classical mechanics [Carlini et al., Phys. Rev. Lett. 96, 060503 (2006)]. Using an anisotropic Heisenberg Hamiltonian as an example, we find that different optimal quantum gates arise according to the optimality point of view used in each case. We also study how the convergence to random entangled states depends on different entanglement measures.
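
    The repeated two-qubit gate protocol can be sketched directly: starting from a product state, Haar-random 4×4 unitaries (via the Ginibre/QR construction) applied to randomly chosen qubit pairs drive the single-qubit purity toward the Haar (Page) value (d_A + d_B)/(d_A d_B + 1). The system size and gate count below are illustrative, and this sketch uses generic random gates rather than the time-optimal gates discussed in the paper:

```python
import numpy as np

# Random-state protocol sketch on n = 4 qubits: repeated Haar-random
# two-qubit gates entangle the state; Tr(rho_0^2) relaxes from 1 toward
# the Page value (2 + 8)/(16 + 1) = 10/17 for a single qubit.

rng = np.random.default_rng(7)
n = 4
psi = np.zeros(2**n, complex); psi[0] = 1.0     # product state |0000>

def haar_unitary(d):
    z = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # phase-corrected QR

def apply_two_qubit(psi, u, i, j):
    t = psi.reshape([2] * n)
    t = np.moveaxis(t, (i, j), (0, 1)).reshape(4, -1)
    t = (u @ t).reshape([2, 2] + [2] * (n - 2))
    return np.moveaxis(t, (0, 1), (i, j)).ravel()

for _ in range(60):                              # repeated random gates
    i, j = rng.choice(n, size=2, replace=False)
    psi = apply_two_qubit(psi, haar_unitary(4), i, j)

rho = np.outer(psi, psi.conj()).reshape(2, 2**(n-1), 2, 2**(n-1))
purity = np.einsum('abcb,cdad->', rho, rho).real  # Tr(rho_0^2) for qubit 0
print(purity)   # typically near the Page value 10/17 ~ 0.59
```

Tracking this purity over the gate count is one way to quantify the convergence to the Haar entanglement distribution discussed in the abstract.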

  9. Rapid growth, early maturation and short generation time in African annual fishes

    Czech Academy of Sciences Publication Activity Database

    Blažek, Radim; Polačik, Matej; Reichard, Martin

    2013-01-01

    Roč. 4, č. 24 (2013), s. 24 ISSN 2041-9139 R&D Projects: GA ČR(CZ) GAP506/11/0112 Institutional support: RVO:68081766 Keywords : extreme life history * annual fish * explosive growth * rapid maturation * generation time * killifish * diapause * vertebrate * reaction norm * Savanna Subject RIV: EG - Zoology Impact factor: 3.104, year: 2013 http://www.evodevojournal.com/content/4/1/24

  10. Time delay generation at high frequency using SOA based slow and fast light.

    Science.gov (United States)

    Berger, Perrine; Bourderionnet, Jérôme; Bretenaker, Fabien; Dolfi, Daniel; Alouini, Mehdi

    2011-10-24

    We show how up-converted coherent population oscillations (UpCPO) make it possible to overcome the intrinsic limitation of the carrier lifetime, enabling the generation of time delays at arbitrarily high frequencies in a single SOA device. The linear dependence of the RF phase shift with respect to the RF frequency is theoretically predicted and experimentally evidenced at 16 and 35 GHz. © 2011 Optical Society of America

  11. Real-time convolution method for generating light diffusion profiles of layered turbid media.

    Science.gov (United States)

    Kim, Hoe-Min; Ko, Kwang Hee; Lee, Kwan H

    2011-06-01

    In this paper we present a technique to obtain a diffusion profile of layered turbid media in real time by using the quasi fast Hankel transform (QFHT) and the latest graphics processing unit techniques. We apply the QFHT to convolve the diffusion profiles of each layer so as to dramatically reduce the time for the convolution step while maintaining accuracy. In addition, we also introduce an accelerated technique to generate individual discrete diffusion profiles for each layer through parallel processing. The proposed method is 2 orders of magnitude faster than the existing method, and we validate its efficiency by comparing it with Monte Carlo simulation and other relevant methods.

  12. Advances in high-order harmonic generation sources for time-resolved investigations

    Energy Technology Data Exchange (ETDEWEB)

    Reduzzi, Maurizio [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Institute of Photonics and Nanotechnologies, CNR-IFN, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Carpeggiani, Paolo [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Kühn, Sergei [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Calegari, Francesca [Institute of Photonics and Nanotechnologies, CNR-IFN, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Nisoli, Mauro; Stagira, Salvatore [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Institute of Photonics and Nanotechnologies, CNR-IFN, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Vozzi, Caterina [Institute of Photonics and Nanotechnologies, CNR-IFN, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Dombi, Peter [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Wigner Research Center for Physics, 1121 Budapest (Hungary); Kahaly, Subhendu [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Tzallas, Paris; Charalambidis, Dimitris [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Foundation for Research and Technology – Hellas, Institute of Electronic Structure and Lasers, P.O. Box 1527, GR-711 10 Heraklion, Crete (Greece); Varju, Katalin [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); Department of Optics and Quantum Electronics, University of Szeged, Dóm tér 9, 6720 Szeged (Hungary); Osvay, Karoly [ELI-ALPS, ELI-Hu Kft., Dugonics ter 13, H-6720 Szeged (Hungary); and others

    2015-10-15

    We review the main research directions ongoing in the development of extreme ultraviolet sources based on high-harmonic generation for the synthesis and application of attosecond pulse trains and isolated attosecond pulses to time-resolved spectroscopy. A few experimental and theoretical works will be discussed in connection with well-established attosecond techniques. In this context, we present the unique possibilities offered for time-resolved investigations on the attosecond timescale by the new Extreme Light Infrastructure Attosecond Light Pulse Source, which is currently under construction.

  13. Advances in high-order harmonic generation sources for time-resolved investigations

    International Nuclear Information System (INIS)

    Reduzzi, Maurizio; Carpeggiani, Paolo; Kühn, Sergei; Calegari, Francesca; Nisoli, Mauro; Stagira, Salvatore; Vozzi, Caterina; Dombi, Peter; Kahaly, Subhendu; Tzallas, Paris; Charalambidis, Dimitris; Varju, Katalin; Osvay, Karoly

    2015-01-01

    We review the main research directions ongoing in the development of extreme ultraviolet sources based on high-harmonic generation for the synthesis and application of attosecond pulse trains and isolated attosecond pulses to time-resolved spectroscopy. A few experimental and theoretical works will be discussed in connection with well-established attosecond techniques. In this context, we present the unique possibilities offered for time-resolved investigations on the attosecond timescale by the new Extreme Light Infrastructure Attosecond Light Pulse Source, which is currently under construction.

  14. Generation of artificial time-histories, rich in all frequencies, from given response spectra

    International Nuclear Information System (INIS)

    Levy, S.; Wilkinson, J.P.D.

    1975-01-01

    In order to apply the time-history method of seismic analysis, it is often desirable to generate a suitable artificial time-history from a given response spectrum. The method described allows the generation of such a time-history that is also rich in all frequencies in the spectrum. This richness is achieved by choosing a large number of closely-spaced frequency points such that adjacent frequencies have their half-power points overlap. The adjacent frequencies satisfy the condition that the frequency interval Δf near a given frequency f is such that Δf/f < 2c/c_c, where c is the damping of the system and c_c is the critical damping. In developing an artificial time-history, it is desirable to specify the envelope and duration of the record, very often in such a manner as to reproduce the envelope property of a specific earthquake record, and such an option is available in the method described. Examples are given of the development of typical artificial time-histories from earthquake design response spectra and from floor response spectra
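
    The frequency-spacing rule and the synthesis step can be made concrete. A sketch with an assumed band, damping, and trapezoidal envelope; the iterative rescaling of amplitudes against the target response spectrum is omitted:

```python
import numpy as np

# Frequency grid satisfying Delta_f / f < 2*zeta (zeta = c/c_c), so adjacent
# oscillators' half-power bands overlap, plus synthesis of the time history
# as a sum of random-phase sinusoids under an envelope.

rng = np.random.default_rng(3)
zeta = 0.05                       # 5% of critical damping
f_lo, f_hi = 0.2, 25.0            # spectrum band (Hz), illustrative

# Geometric grid f[k+1] = f[k]*(1 + zeta) gives Delta_f/f = zeta < 2*zeta
freqs = [f_lo]
while freqs[-1] < f_hi:
    freqs.append(freqs[-1] * (1 + zeta))
freqs = np.array(freqs)

dt, dur = 0.01, 20.0
t = np.arange(0.0, dur, dt)
env = np.clip(np.minimum(t / 2.0, (dur - t) / 5.0), 0.0, 1.0)  # rise/hold/decay

amps = np.ones_like(freqs)        # would be tuned iteratively to the spectrum
phases = rng.uniform(0, 2 * np.pi, freqs.size)
accel = env * (amps @ np.sin(2 * np.pi * np.outer(freqs, t) + phases[:, None]))

print(freqs.size, accel.shape)    # ~100 frequencies, rich across the band
```

In practice each amplitude is then scaled by the ratio of the target to the computed response spectrum at its frequency, and the synthesis is repeated until the spectra match.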

  15. Copyright and Computer Generated Materials – Is it Time to Reboot the Discussion About Authorship?

    Directory of Open Access Journals (Sweden)

    Anne Fitzgerald

    2013-12-01

    Full Text Available Computer generated materials are ubiquitous and we encounter them on a daily basis, even though most people are unaware that this is the case. Blockbuster movies, television weather reports and telephone directories all include material that is produced by utilising computer technologies. Copyright protection for materials generated by a programmed computer was considered by the Federal Court and Full Court of the Federal Court in Telstra Corporation Limited v Phone Directories Company Pty Ltd. The court held that the White and Yellow pages telephone directories produced by Telstra and its subsidiary, Sensis, were not protected by copyright because they were computer-generated works which lacked the requisite human authorship. The Copyright Act 1968 (Cth) does not contain specific provisions on the subsistence of copyright in computer-generated materials. Although the issue of copyright protection for computer-generated materials has been examined in Australia on two separate occasions by independently-constituted Copyright Law Review Committees over a period of 10 years (1988 to 1998), the Committees’ recommendations for legislative clarification by the enactment of specific amendments to the Copyright Act have not yet been implemented and the legal position remains unclear. In the light of the decision of the Full Federal Court in Telstra v Phone Directories it is timely to consider whether specific provisions should be enacted to clarify the position of computer-generated works under copyright law and, in particular, whether the requirement of human authorship for original works protected under Part III of the Copyright Act should now be reconceptualised to align with the realities of how copyright materials are created in the digital era.

  16. Substitute CT generation from a single ultra short time echo MRI sequence: preliminary study

    Science.gov (United States)

    Ghose, Soumya; Dowling, Jason A.; Rai, Robba; Liney, Gary P.

    2017-04-01

    In MR guided radiation therapy planning, both MR and CT images of a patient are acquired and co-registered to obtain a tissue-specific HU map. Generation of the HU map directly from the MRI would eliminate the CT acquisition and may improve radiation therapy planning. In this preliminary study of substitute CT (sCT) generation, two porcine leg phantoms were scanned using a 3D ultrashort echo time (PETRA) sequence and co-registered to corresponding CT images to build tissue-specific regression models. The model was created from one co-registered CT-PETRA pair to generate the sCT for the other PETRA image. An expectation-maximization-based clustering was performed on the co-registered PETRA image to identify the soft tissue, dense bone and air class membership probabilities. A tissue-specific non-linear regression model was built from one registered CT-PETRA pair dataset to predict the sCT of the second PETRA image in a two-fold cross-validation scheme. A complete substitute CT is generated in 3 min. The mean absolute HU error for air was 0.3 HU, for bone 95 HU, for fat 30 HU and for muscle 10 HU. The mean surface reconstruction error for the bone was 1.3 mm. The PETRA sequence enabled a low mean absolute surface distance for the bone and a low HU error for the other classes. The sCT generated from a single PETRA sequence shows promise for fast sCT generation for MRI-based radiation therapy planning.

  17. Real-time dynamic analysis for complete loop of direct steam generation solar trough collector

    International Nuclear Information System (INIS)

    Guo, Su; Liu, Deyou; Chu, Yinghao; Chen, Xingying; Shen, Bingbing; Xu, Chang; Zhou, Ling; Wang, Pei

    2016-01-01

    Highlights: • A nonlinear distributed parameter dynamic model has been developed. • Real-time local heat transfer and friction coefficients are adopted. • The dynamic behavior of the solar trough collector loop is simulated. • High-frequency chattering of the outlet fluid flow is analyzed and modeled. • Irradiance disturbances in the subcooled water region have the larger influence. - Abstract: Direct steam generation is a potential approach to further reduce the levelized electricity cost of solar trough power. Dynamic modeling of the collector loop is essential for the operation and control of direct steam generation solar troughs. However, the dynamic behavior of the fluid in direct steam generation is complex because of the two-phase flow in the pipeline. In this work, a nonlinear distributed parameter model has been developed to model the dynamic behaviors of direct steam generation parabolic trough collector loops under either full or partial solar irradiance disturbance. Compared with available dynamic models, the proposed model possesses two advantages: (1) real-time local values of the heat transfer coefficient and friction resistance coefficient, and (2) consideration of the complete loop of collectors, including the subcooled water region, two-phase flow region and superheated steam region. The proposed model has shown superior performance, particularly in sensitivity studies of fluid parameters when the pipe is partially shaded. The proposed model has been validated using experimental data from the Solar Thermal Energy Laboratory of the University of New South Wales, with an outlet fluid temperature relative error of only 1.91%. The validation results show that: (1) The proposed model successfully outperforms two reference models in predicting the behavior of direct steam generation solar troughs. (2) The model theoretically predicts that, during solar irradiance disturbance, the discontinuities of fluid physical property parameters and the moving back and

  18. Model Selection and Quality Estimation of Time Series Models for Artificial Technical Surface Generation

    Directory of Open Access Journals (Sweden)

    Matthias Eifler

    2017-12-01

    Full Text Available Standard compliant parameter calculation in surface topography analysis takes the manufacturing process into account. Thus, the measurement technician can be supported with automated suggestions for preprocessing, filtering and evaluation of the measurement data based on the character of the surface topography. Artificial neural networks (ANN) are one approach for the recognition or classification of technical surfaces. However, the required set of training data for an ANN is often not available, especially when data acquisition is time consuming or expensive, as is the case when measuring surface topography. Thus, the generation of artificial (simulated) data becomes of interest. An approach from time series analysis is chosen and examined regarding its suitability for the description of technical surfaces: the ARMAsel model, an approach for time series modelling which is capable of choosing the statistical model with the smallest prediction error and the best number of coefficients for a certain surface. With a reliable model which features the relevant stochastic properties of a surface, the generation of training data for classifiers of artificial neural networks is possible. Based on the ARMA coefficients determined with the ARMAsel approach, many different artificial surfaces can be generated from only a few measured datasets and used for training classifiers of an artificial neural network. In doing so, an improved calculation of the model input data for the generation of artificial surfaces is possible, as the training data generation is based on actual measurement data. The trained artificial neural network is tested with actual measurement data of surfaces that were manufactured with varying manufacturing methods, and a recognition rate for the corresponding manufacturing principle of between 60% and 78% can be determined. This means that, based on only a few measured datasets, stochastic surface information for various manufacturing principles can be extracted.
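
    The modelling step can be illustrated with an AR-only slice of the ARMAsel idea: estimate the coefficients of a measured profile via the Yule-Walker equations, then synthesize arbitrarily many artificial profiles with the same second-order statistics for classifier training. The order p = 2 and the stand-in "measured" profile below are assumptions for the sketch:

```python
import numpy as np

# Fit an AR(2) model to a profile via Yule-Walker, then generate an
# artificial profile from the fitted coefficients.

rng = np.random.default_rng(11)

# Stand-in "measured" roughness profile: an AR(2) process with known coefficients
true_a = np.array([1.2, -0.5])
x = np.zeros(5000)
for k in range(2, x.size):
    x[k] = true_a[0] * x[k-1] + true_a[1] * x[k-2] + rng.standard_normal()

def yule_walker(x, p):
    x = x - x.mean()
    r = np.array([x[:x.size - k] @ x[k:] for k in range(p + 1)]) / x.size
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:p + 1])     # AR coefficients
    sigma2 = r[0] - a @ r[1:p + 1]         # innovation variance
    return a, sigma2

a_hat, s2 = yule_walker(x, 2)

# Synthesize an artificial profile with the same stochastic properties
y = np.zeros(5000)
for k in range(2, y.size):
    y[k] = a_hat[0] * y[k-1] + a_hat[1] * y[k-2] + np.sqrt(s2) * rng.standard_normal()

print(a_hat)   # close to the generating coefficients [1.2, -0.5]
```

ARMAsel additionally compares AR, MA, and ARMA candidates of varying order by prediction error; the fit-then-resample loop shown here is the part that turns a few measurements into many training surfaces.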

  19. Irregular oscillatory patterns in the early-time region of coherent phonon generation in silicon

    Science.gov (United States)

    Watanabe, Yohei; Hino, Ken-ichi; Hase, Muneaki; Maeshima, Nobuya

    2017-09-01

    Coherent phonon (CP) generation in an undoped Si crystal is theoretically investigated to shed light on unexplored quantum-mechanical effects in the early-time region immediately after the irradiation of ultrashort laser pulses. We examine time signals attributed to an induced charge density of an ionic core, placing the focus on the effects of the Rabi frequency Ω0cv on the signals; this frequency corresponds to the peak electric field of the pulse. It is found that at specific Ω0cv's, where the energy of the plasmon caused by photoexcited carriers coincides with the longitudinal-optical phonon energy, the energetically resonant interaction between these two modes leads to striking anticrossings, revealing irregular oscillations with anomalously enhanced amplitudes in the observed time signals. Also, the oscillatory pattern is subject to the Rabi flopping of the excited carrier density that is controlled by Ω0cv. These findings show that the early-time region is enriched with quantum-mechanical effects inherent in CP generation, though experimental signals are more or less masked by the so-called coherent artifact due to nonlinear optical effects.

  20. Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives

    Science.gov (United States)

    Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.

    2017-12-01

    During the last decades, a varied set of Heliophysics missions has allowed the scientific community to gain better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the way for Helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses) and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, as needs evolve, scientists involved in new missions require plotting of multi-variable data, heat-map stacks, interactive synchronization, and axis-variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches for visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.

  1. Generation and Validation of Spatial Distribution of Hourly Wind Speed Time-Series using Machine Learning

    International Nuclear Information System (INIS)

    Veronesi, F; Grassi, S

    2016-01-01

    Wind resource assessment is a key aspect of wind farm planning since it allows estimation of the long-term electricity production. Moreover, wind speed time-series at high resolution are helpful to estimate the temporal changes of the electricity generation and indispensable to design stand-alone systems, which are affected by the mismatch of supply and demand. In this work, we present a new generalized statistical methodology to generate the spatial distribution of wind speed time-series, using Switzerland as a case study. This research is based upon a machine learning model and demonstrates that statistical wind resource assessment can successfully be used for estimating wind speed time-series. In fact, this method is able to obtain reliable wind speed estimates and propagate all the sources of uncertainty (from the measurements to the mapping process) in an efficient way, i.e. minimizing computational time and load. This allows not only accurate estimation, but also the creation of precise confidence intervals to map the stochasticity of the wind resource for a particular site. The validation shows that machine learning can minimize the bias of the hourly wind speed estimates. Moreover, for each mapped location this method delivers not only the mean wind speed but also its confidence interval, both crucial data for planners. (paper)
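Delivering a mean estimate plus a confidence interval for each mapped location can be illustrated with a residual bootstrap around a simple linear model (a toy stand-in for the paper's machine-learning model; the features, coefficients and site below are all synthetic):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical training data: terrain features -> mean wind speed (m/s)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -0.8, 0.3])
y = 5.0 + X @ true_w + rng.normal(0, 0.5, 200)

# Fit a linear model (stand-in for the statistical learning step)
A = np.column_stack([np.ones(len(X)), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ w

# Residual bootstrap: refit on resampled residuals to get a CI for one site
x_new = np.array([1.0, 0.2, -0.1, 0.4])   # feature vector of the site to map
preds = []
for _ in range(500):
    y_b = A @ w + rng.choice(resid, size=len(y), replace=True)
    w_b, *_ = np.linalg.lstsq(A, y_b, rcond=None)
    preds.append(x_new @ w_b)
point = x_new @ w
lo, hi = np.percentile(preds, [2.5, 97.5])   # 95% confidence interval
```

Repeating this per grid cell yields the mean map and the interval map the abstract describes.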

  2. Image encryption scheme based on computer generated holography and time-averaged moiré

    Science.gov (United States)

    Palevicius, Paulius; Ragulskis, Minvydas; Janusas, Giedrius; Palevicius, Arvydas

    2017-08-01

    A technique of computational image encryption and optical decryption based on computer generated holography and time-averaged moiré is investigated in this paper. Dynamic visual cryptography (a visual cryptography scheme based on time-averaging geometric moiré), the Gerchberg-Saxton algorithm and 3D microstructure manufacturing techniques are used to construct the optical scheme. The secret is embedded into a cover image by using a stochastic moiré grating and can be visually decoded by the naked eye. The secret is revealed if the amplitude of harmonic oscillations in the Fourier plane corresponds to an accurately preselected value. The process of the production of the 3D microstructure is described in detail. Computer generated holography is used in the design step and electron beam lithography is exploited for physical 3D patterning. The phase data of a complex 3D microstructure are obtained by the Gerchberg-Saxton algorithm and used to produce a computer generated hologram. Physical implementation of the microstructure is performed using a single layer of polymethyl methacrylate as the basis for the 3D microstructure. Numerical simulations demonstrate the efficient applicability of this technique.
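The Gerchberg-Saxton phase-retrieval step used in the hologram design can be sketched in a few lines of NumPy (a minimal far-field version with an assumed uniform source amplitude and an illustrative Gaussian target; the paper's actual 3D microstructure design is more involved):

```python
import numpy as np

def gerchberg_saxton(source_amp, target_amp, iterations=100):
    """Find a phase mask so that light with amplitude source_amp in one
    plane produces (approximately) target_amp in the Fourier plane."""
    rng = np.random.default_rng(0)
    phase = rng.uniform(0.0, 2.0 * np.pi, source_amp.shape)
    for _ in range(iterations):
        # Impose the source amplitude, propagate to the Fourier plane
        far = np.fft.fft2(source_amp * np.exp(1j * phase))
        # Impose the target amplitude, propagate back
        near = np.fft.ifft2(target_amp * np.exp(1j * np.angle(far)))
        phase = np.angle(near)
    return phase

yy, xx = np.mgrid[0:32, 0:32]
source = np.ones((32, 32))                                   # uniform illumination
target = np.exp(-((xx - 16.0)**2 + (yy - 16.0)**2) / 50.0)   # desired far-field amplitude
phase = gerchberg_saxton(source, target, iterations=50)
```

The Fourier-plane amplitude error is non-increasing over iterations, which is the error-reduction property that makes the algorithm suitable for hologram design.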

  3. Real-Time Terrain Storage Generation from Multiple Sensors towards Mobile Robot Operation Interface

    Science.gov (United States)

    Cho, Seoungjae; Xi, Yulong; Cho, Kyungeun

    2014-01-01

    A mobile robot mounted with multiple sensors is used to rapidly collect 3D point clouds and video images so as to allow accurate terrain modeling. In this study, we develop a real-time terrain storage generation and representation system including a nonground point database (PDB), ground mesh database (MDB), and texture database (TDB). A voxel-based flag map is proposed for incrementally registering large-scale point clouds in a terrain model in real time. We quantize the 3D point clouds into 3D grids of the flag map as a comparative table in order to remove the redundant points. We integrate the large-scale 3D point clouds into a nonground PDB and a node-based terrain mesh using the CPU. Subsequently, we program a graphics processing unit (GPU) to generate the TDB by mapping the triangles in the terrain mesh onto the captured video images. Finally, we produce a nonground voxel map and a ground textured mesh as a terrain reconstruction result. Our proposed methods were tested in an outdoor environment. Our results show that the proposed system was able to rapidly generate terrain storage and provide high resolution terrain representation for mobile mapping services and a graphical user interface between remote operators and mobile robots. PMID:25101321
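The voxel-based flag map for dropping redundant points during incremental registration can be illustrated as follows (a simplified sketch; the function and parameter names are illustrative, not taken from the paper):

```python
import numpy as np

def register_points(points, voxel_size=0.05, flag_map=None):
    """Incrementally register a point-cloud batch, skipping points whose
    voxel cell is already flagged (simplified voxel-based flag map)."""
    if flag_map is None:
        flag_map = set()
    kept = []
    for p in points:
        # Quantize the 3D point into a grid cell of the flag map
        key = tuple(np.floor(p / voxel_size).astype(int))
        if key not in flag_map:        # first point in this voxel wins
            flag_map.add(key)
            kept.append(p)
    return np.array(kept), flag_map

batch1 = np.array([[0.01, 0.02, 0.00], [0.02, 0.01, 0.01], [1.00, 1.00, 1.00]])
kept1, flags = register_points(batch1)                 # second point is redundant
batch2 = np.array([[0.03, 0.03, 0.02], [2.00, 0.00, 0.00]])
kept2, flags = register_points(batch2, flag_map=flags)  # first point already flagged
```

Because the flag map persists across batches, redundant points from later scans of the same area are rejected in constant time per point.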

  4. Real-Time Terrain Storage Generation from Multiple Sensors towards Mobile Robot Operation Interface

    Directory of Open Access Journals (Sweden)

    Wei Song

    2014-01-01

    Full Text Available A mobile robot mounted with multiple sensors is used to rapidly collect 3D point clouds and video images so as to allow accurate terrain modeling. In this study, we develop a real-time terrain storage generation and representation system including a nonground point database (PDB), ground mesh database (MDB), and texture database (TDB). A voxel-based flag map is proposed for incrementally registering large-scale point clouds in a terrain model in real time. We quantize the 3D point clouds into 3D grids of the flag map as a comparative table in order to remove the redundant points. We integrate the large-scale 3D point clouds into a nonground PDB and a node-based terrain mesh using the CPU. Subsequently, we program a graphics processing unit (GPU) to generate the TDB by mapping the triangles in the terrain mesh onto the captured video images. Finally, we produce a nonground voxel map and a ground textured mesh as a terrain reconstruction result. Our proposed methods were tested in an outdoor environment. Our results show that the proposed system was able to rapidly generate terrain storage and provide high resolution terrain representation for mobile mapping services and a graphical user interface between remote operators and mobile robots.

  5. Reference manual for generation and analysis of Habitat Time Series: version II

    Science.gov (United States)

    Milhous, Robert T.; Bartholow, John M.; Updike, Marlys A.; Moos, Alan R.

    1990-01-01

    The selection of an instream flow requirement for water resource management often requires the review of how the physical habitat changes through time. This review is referred to as 'Time Series Analysis.' The Time Series Library (TSLIB) is a group of programs to enter, transform, analyze, and display time series data for use in stream habitat assessment. A time series may be defined as a sequence of data recorded or calculated over time. Examples might be historical monthly flow, predicted monthly weighted usable area, daily electrical power generation, annual irrigation diversion, and so forth. The time series can be analyzed, both descriptively and analytically, to understand the importance of the variation in the events over time. This is especially useful in the development of instream flow needs based on habitat availability. The TSLIB group of programs assumes that you have an adequate study plan to guide you in your analysis. You need to already have knowledge about such things as time period and time step, species and life stages to consider, and appropriate comparisons or statistics to be produced and displayed or tabulated. Knowing your destination, you must first evaluate whether TSLIB can get you there. Remember, data are not answers. This publication is a reference manual for TSLIB and is intended to be a guide to the process of using the various programs in TSLIB. This manual is essentially limited to the hands-on use of the various programs. A TSLIB user interface program (called RTSM) has been developed to provide an integrated working environment in which the user has a brief on-line description of each TSLIB program along with the capability to run the TSLIB program from within the interface. For information on the RTSM program, refer to Appendix F. Before applying the computer models described herein, it is recommended that the user enroll in the short course "Problem Solving with the Instream Flow Incremental Methodology (IFIM)."
This course is offered

  6. Is PMI the Hypothesis or the Null Hypothesis?

    Science.gov (United States)

    Tarone, Aaron M; Sanford, Michelle R

    2017-09-01

    Over the past several decades, there have been several strident exchanges regarding whether forensic entomologists estimate the postmortem interval (PMI), minimum PMI, or something else. During that time, there has been a proliferation of terminology reflecting this concern regarding "what we do." This has been a frustrating conversation for some in the community because much of this debate appears to be centered on what assumptions are acknowledged directly and which are embedded within a list of assumptions (or ignored altogether) in the literature and in case reports. An additional component of the conversation centers on a concern that moving away from the use of certain terminology like PMI acknowledges limitations and problems that would make the application of entomology appear less useful in court, a problem for lawyers, but one that should not be problematic for scientists in the forensic entomology community, as uncertainty is part of science that should and can be presented effectively in the courtroom (e.g., population genetic concepts in forensics). Unfortunately, a consequence of the way this conversation is conducted is that even as all involved in the debate acknowledge the concerns of their colleagues, parties continue to talk past one another advocating their preferred terminology. Progress will not be made until the community recognizes that all of the terms under consideration take the form of null hypothesis statements and that thinking about "what we do" as a null hypothesis has useful legal and scientific ramifications that transcend arguments over the usage of preferred terminology. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. A Preliminary Examination of the Second Generation CMORPH Real-time Production

    Science.gov (United States)

    Joyce, R.; Xie, P.; Wu, S.

    2017-12-01

    The second generation CMORPH (CMORPH2) has started test real-time production of 30-minute precipitation estimates on a 0.05° lat/lon grid over the entire globe, from pole-to-pole. The CMORPH2 is built upon the Kalman Filter based CMORPH algorithm of Joyce and Xie (2011). Inputs to the system include rainfall and snowfall rate retrievals from passive microwave (PMW) measurements aboard all available low earth orbit (LEO) satellites, precipitation estimates derived from infrared (IR) observations of geostationary (GEO) and LEO platforms, and precipitation simulations from the NCEP operational global forecast system (GFS). Inputs from the various sources are first inter-calibrated to ensure quantitative consistencies in representing precipitation events of different intensities through PDF calibration against a common reference standard. The inter-calibrated PMW retrievals and IR-based precipitation estimates are then propagated from their respective observation times to the target analysis time along the motion vectors of the precipitating clouds. Motion vectors are first derived separately from the satellite IR based precipitation estimates and the GFS precipitation fields. These individually derived motion vectors are then combined through a 2D-VAR technique to form an analyzed field of cloud motion vectors over the entire globe. The propagated PMW and IR based precipitation estimates are finally integrated into a single field of global precipitation through the Kalman Filter framework. A set of procedures have been established to examine the performance of the CMORPH2 real-time production. CMORPH2 satellite precipitation estimates are compared against the CPC daily gauge analysis, Stage IV radar precipitation over the CONUS, and numerical model forecasts to discover potential shortcomings and quantify improvements against the first generation CMORPH.
Special attention has been focused on the CMORPH behavior over high-latitude areas beyond the coverage of the first

  8. Synthetic river flow time series generator for dispatch and spot price forecast

    Energy Technology Data Exchange (ETDEWEB)

    Flores, R.A. [Chalmers Univ. of Technology, Gothenburg (Sweden). Signal Processing Dept.; Szczupak, J. [Pontifical Catholic Univ., Rio de Janeiro (Brazil). Electrical Engineering Dept.; Pinto, L. [Engenho, Rio de Janeiro (Brazil)

    2007-07-01

    Decision-making in electricity markets is complicated by uncertainties in demand growth, power supplies and fuel prices. In Peru, where the electrical power system is highly dependent on water resources at dams and river flows, hydrological uncertainties play a primary role in planning, price and dispatch forecast. This paper proposes a signal processing method for generating new synthetic river flow time series as a support for planning and spot market price forecasting. River flow time series are natural phenomena representing a continuous-time domain process. As an alternative synthetic representation of the original river flow time series, the proposed signal processing method preserves correlations, basic statistics and seasonality. It takes into account deterministic, periodic and non-periodic components such as those due to the El Niño Southern Oscillation phenomenon. The new synthetic time series has many correlations with the original river flow time series, rendering it suitable for possible replacement of the classical method of sorting historical river flow time series. As a dispatch and planning approach to spot pricing, the proposed method offers higher accuracy modeling by decomposing the signal into deterministic, periodic, non-periodic and stochastic sub-signals. 4 refs., 4 tabs., 13 figs.
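The decomposition into deterministic, periodic and stochastic sub-signals can be illustrated with a least-squares harmonic fit on a synthetic flow series (a toy sketch, not the authors' method; the trend, amplitude and noise values below are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(240)   # e.g. 20 years of monthly river flow, hypothetical units

# Synthetic flow: deterministic trend + annual cycle + stochastic residue
flow = 100 + 0.05 * t + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, t.size)

# Least-squares fit of the deterministic and periodic sub-signals
design = np.column_stack([
    np.ones_like(t, dtype=float),   # mean level
    t.astype(float),                # linear trend (deterministic)
    np.sin(2 * np.pi * t / 12),     # annual periodic component
    np.cos(2 * np.pi * t / 12),
])
coef, *_ = np.linalg.lstsq(design, flow, rcond=None)
stochastic = flow - design @ coef   # what remains for stochastic modelling
```

New synthetic series can then be built by keeping the fitted deterministic and periodic parts and resampling or re-simulating only the stochastic residue.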

  9. Synthetic river flow time series generator for dispatch and spot price forecast

    International Nuclear Information System (INIS)

    Flores, R.A.

    2007-01-01

    Decision-making in electricity markets is complicated by uncertainties in demand growth, power supplies and fuel prices. In Peru, where the electrical power system is highly dependent on water resources at dams and river flows, hydrological uncertainties play a primary role in planning, price and dispatch forecast. This paper proposes a signal processing method for generating new synthetic river flow time series as a support for planning and spot market price forecasting. River flow time series are natural phenomena representing a continuous-time domain process. As an alternative synthetic representation of the original river flow time series, the proposed signal processing method preserves correlations, basic statistics and seasonality. It takes into account deterministic, periodic and non-periodic components such as those due to the El Niño Southern Oscillation phenomenon. The new synthetic time series has many correlations with the original river flow time series, rendering it suitable for possible replacement of the classical method of sorting historical river flow time series. As a dispatch and planning approach to spot pricing, the proposed method offers higher accuracy modeling by decomposing the signal into deterministic, periodic, non-periodic and stochastic sub-signals. 4 refs., 4 tabs., 13 figs

  10. Processing Binary and Fuzzy Logic by Chaotic Time Series Generated by a Hydrodynamic Photochemical Oscillator.

    Science.gov (United States)

    Gentili, Pier Luigi; Giubila, Maria Sole; Heron, B Mark

    2017-07-05

    This work demonstrates the computational power of a hydrodynamic photochemical oscillator based on a photochromic naphthopyran generating aperiodic time series. The chaotic character of the time series is tested by calculating its largest Lyapunov exponent and the correlation dimension of its attractor after building its phase space through Takens' theorem. Then, the chaotic dynamics is shown to be suitable to implement all the fundamental Boolean two-input-one-output logic gates. Finally, the strategy to implement fuzzy logic systems (FLSs) based on the time series is described. Such FLSs promise to be useful in the field of computational linguistics, which is concerned with the development of artificial intelligent systems able to transform collections of numerical data into natural language texts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
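The largest-Lyapunov-exponent test can be illustrated on a textbook chaotic series, the logistic map at r = 4, whose exact exponent is ln 2 (a stand-in for the oscillator's experimental time series; for measured data one would use an embedding-based estimator instead of the analytic derivative):

```python
import math

# Chaotic time series from the logistic map x -> r*x*(1-x) at r = 4
def logistic_series(x0=0.2, n=20000, r=4.0):
    xs = []
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

# Largest Lyapunov exponent: orbit average of log|f'(x)| = log|r - 2*r*x|.
# A positive value is the signature of chaos; for r = 4 the exact value is ln 2.
def lyapunov(xs, r=4.0):
    return sum(math.log(abs(r - 2.0 * r * x)) for x in xs) / len(xs)

series = logistic_series()
lam = lyapunov(series[1000:])   # discard the initial transient
```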

  11. Dynamic parabolic pulse generation using temporal shaping of wavelength to time mapped pulses.

    Science.gov (United States)

    Nguyen, Dat; Piracha, Mohammad Umar; Mandridis, Dimitrios; Delfyett, Peter J

    2011-06-20

    Self-phase modulation in fiber amplifiers can significantly degrade the quality of compressed pulses in chirped pulse amplification systems. Parabolic pulses with linear frequency chirp are suitable for suppressing nonlinearities, and to achieve high peak power pulses after compression. In this paper, we present an active time domain technique to generate parabolic pulses for chirped pulse amplification applications. Pulses from a mode-locked laser are temporally stretched and launched into an amplitude modulator, where the drive voltage is designed using the spectral shape of the input pulse and the transfer function of the modulator, resulting in the generation of parabolic pulses. Experimental results of pulse shaping with a pulse train from a mode-locked laser are presented, with a residual error of less than 5%. Moreover, an extinction ratio of 27 dB is achieved, which is ideal for chirped pulse amplification applications.
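The drive-voltage design can be sketched as follows, assuming a cos² Mach-Zehnder amplitude-modulator transfer function (an assumption for illustration; the paper designs the drive voltage from the measured spectral shape of the input pulse and the actual modulator transfer function, and the profiles and half-wave voltage below are hypothetical):

```python
import numpy as np

t = np.linspace(-1.0, 1.0, 2001)                    # normalized time axis
gaussian = np.exp(-t**2 / (2.0 * 0.5**2))           # stretched input power profile
parabola = np.clip(1.0 - (t / 0.8)**2, 0.0, None)   # target parabolic power profile

# Required intensity transmission, scaled so it never exceeds unity
ratio = parabola / gaussian
scale = ratio.max()
T = ratio / scale

# Drive voltage for an assumed transfer T = cos^2(pi*V / (2*V_pi))
V_pi = 5.0                                          # hypothetical half-wave voltage (V)
V = (2.0 * V_pi / np.pi) * np.arccos(np.sqrt(T))

# Sanity check: passing the input through the modulator yields the parabolic shape
shaped = gaussian * np.cos(np.pi * V / (2.0 * V_pi))**2
```

Inverting the transfer function in this way is what lets an amplitude modulator carve an arbitrary target profile, here a parabola, out of the stretched input pulse.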

  12. A SPIRAL-BASED DOWNSCALING METHOD FOR GENERATING 30 M TIME SERIES IMAGE DATA

    Directory of Open Access Journals (Sweden)

    B. Liu

    2017-09-01

    Full Text Available The spatial detail and updating frequency of land cover data are important factors influencing land surface dynamic monitoring applications at high spatial resolution. However, the fragmentized patches and seasonal variability of some land cover types (e.g., small crop fields, wetlands) make the generation of land cover data labor-intensive and difficult. Utilizing high spatial resolution multi-temporal image data is a possible solution. Unfortunately, the spatial and temporal resolution of available remote sensing data like Landsat or MODIS datasets can hardly satisfy the minimum mapping unit and frequency of current land cover mapping/updating at the same time. The generation of high resolution time series may be a compromise to cover the shortage in the land cover updating process. One popular way is to downscale multi-temporal MODIS data with other high spatial resolution auxiliary data like Landsat. But the usual manner of downscaling pixels based on a window may lead to an underdetermined problem in heterogeneous areas, resulting in uncertainty for some high spatial resolution pixels. Therefore, the downscaled multi-temporal data can hardly reach the high spatial resolution of Landsat data. A spiral based method is introduced to downscale low spatial, high temporal resolution image data to high spatial, high temporal resolution image data. By searching the similar pixels around the adjacent region based on the spiral, the pixel set is built up in the adjacent region pixel by pixel. The underdetermined problem is largely prevented when solving the linear system with the pixel set constructed in this way. With the help of ordinary least squares, the method inverts the endmember values of the linear system. The high spatial resolution image is reconstructed on the basis of the high spatial resolution class map and the endmember values band by band. Then, the high spatial resolution time series is formed with these
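The ordinary-least-squares inversion of endmember values from a pixel set can be sketched as follows (a toy example with synthetic class fractions and reflectances; the spiral-based neighbor search that assembles the pixel set is omitted):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical setup: each coarse MODIS-like pixel is a linear mix of k classes;
# the fractions come from a high-resolution class map, endmember values are unknown.
k, n_pix = 3, 40                                     # classes, pixels in the pixel set
fractions = rng.dirichlet(np.ones(k), size=n_pix)    # per-pixel class fractions
true_endmembers = np.array([0.12, 0.45, 0.80])       # true per-class reflectances
coarse = fractions @ true_endmembers + rng.normal(0, 0.005, n_pix)

# Ordinary least squares inverts the endmember values for this band
endmembers, *_ = np.linalg.lstsq(fractions, coarse, rcond=None)

# A fine-resolution pixel of class c is then assigned endmembers[c]
```

With enough well-mixed pixels in the set, the linear system is overdetermined and the inversion is stable; this is exactly what the spiral-based construction is meant to guarantee in heterogeneous areas.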

  13. Generation of high-dimensional energy-time-entangled photon pairs

    Science.gov (United States)

    Zhang, Da; Zhang, Yiqi; Li, Xinghua; Zhang, Dan; Cheng, Lin; Li, Changbiao; Zhang, Yanpeng

    2017-11-01

    High-dimensional entangled photon pairs have many excellent properties compared to two-dimensional entangled two-photon states, such as greater information capacity, stronger nonlocality, and higher security. Traditionally, the degree of freedom that can produce high-dimensional entanglement mainly consists of angular momentum and energy time. In this paper, we propose a type of high-dimensional energy-time-entangled qudit, which is different from the traditional model with an extended propagation path. In addition, our method mainly focuses on the generation with multiple frequency modes, while two- and three-dimensional frequency-entangled qudits are examined as examples in detail through the linear or nonlinear optical response of the medium. The generation of high-dimensional energy-time-entangled states can be verified by coincidence counts in the damped Rabi oscillation regime, where the paired Stokes-anti-Stokes wave packet is determined by the structure of resonances in the third-order nonlinearity. Finally, we extend the dimension to N in the sequential-cascade mode. Our results have potential applications in quantum communication and quantum computation.

  14. A space-time rainfall generator for highly convective Mediterranean rainstorms

    Directory of Open Access Journals (Sweden)

    S. Salsón

    2003-01-01

    Full Text Available Distributed hydrological models require fine resolution rainfall inputs, enhancing the practical interest of space-time rainfall models capable of generating, through numerical simulation, realistic space-time rainfall intensity fields. Among different mathematical approaches, those based on point processes and built upon a convenient analytical description of the raincell as the fundamental unit have shown to be particularly suitable and well adapted when extreme rainfall events of convective nature are considered. Starting from previous formulations, some analytical refinements have been considered, allowing practical generation of space-time rainfall intensity fields for that type of rainstorm event. Special attention is placed on the analytical description of the spatial and temporal evolution of the rainfall intensities produced by the raincells. After deriving the necessary analytical results, the seven parameters of the model have been estimated by the method of moments for each of the 30 selected rainfall events in the Jucar River Basin (Valencia, Spain for the period 1991 to 2000, using 5-min aggregated rainfall data series from an automatic raingauge network.

  15. A compact, low jitter, nanosecond rise time, high voltage pulse generator with variable amplitude.

    Science.gov (United States)

    Mao, Jiubing; Wang, Xin; Tang, Dan; Lv, Huayi; Li, Chengxin; Shao, Yanhua; Qin, Lan

    2012-07-01

    In this paper, a compact, low jitter, nanosecond rise time, command triggered, high peak power, gas-switch pulse generator system is developed for a high energy physics experiment. The main components of the system are a high voltage capacitor, the spark gap switch and an R = 50 Ω load resistance built into a structure to obtain a fast high power pulse. The pulse drive unit, comprised of a vacuum planar triode and a stack of avalanche transistors, is command triggered by a single or multiple TTL (transistor-transistor logic) level pulses generated by a trigger pulse control unit implemented using the 555 timer circuit. The control unit also accepts a user input TTL trigger signal. The vacuum planar triode in the pulse drive unit, which closes the first-stage switches, is used to drive the spark gap, reducing jitter. By adjusting the charge voltage of the high voltage capacitor charging power supply, the pulse amplitude varies from 5 kV to 10 kV, with a rise time of capacitor recovery time.

  16. Is the Aluminum Hypothesis Dead?

    Science.gov (United States)

    2014-01-01

    The Aluminum Hypothesis, the idea that aluminum exposure is involved in the etiology of Alzheimer disease, dates back to a 1965 demonstration that aluminum causes neurofibrillary tangles in the brains of rabbits. Initially the focus of intensive research, the Aluminum Hypothesis has gradually been abandoned by most researchers. Yet, despite this current indifference, the Aluminum Hypothesis continues to attract the attention of a small group of scientists and aluminum continues to be viewed with concern by some of the public. This review article discusses reasons that mainstream science has largely abandoned the Aluminum Hypothesis and explores a possible reason for some in the general public continuing to view aluminum with mistrust. PMID:24806729

  17. Modified mean generation time parameter in the neutron point kinetics equations

    Energy Technology Data Exchange (ETDEWEB)

    Diniz, Rodrigo C.; Gonçalves, Alessandro C.; Rosa, Felipe S.S., E-mail: alessandro@nuclear.ufrj.br, E-mail: frosa@if.ufrj.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    This paper proposes an approximation for the modified point kinetics equations proposed by Nunes et al. (2015) through the adjustment of a kinetic parameter. This approximation consists of analyzing the terms of the modified point kinetics equations in order to identify the least important ones for the solution, resulting in a modification of the mean generation time parameter that incorporates all influences of the additional terms of the modified kinetics. This approximation is applied to inverse kinetics, and the results are compared with the inverse kinetics obtained from the modified kinetics in order to validate the proposed model. (author)

  18. Wind farms generation limits and its impact in real-time voltage stability assessment

    DEFF Research Database (Denmark)

    Perez, Angel; Jóhannsson, Hjörtur; Østergaard, Jacob

    2015-01-01

    This paper proposes a method to consider the impact of wind farms' maximum current limits on real-time voltage stability assessment. The approach is based on a multi-port equivalent of the system, which makes it possible to assess the effect of each wind farm limit on the stability boundary; the approach indicates the distance to the limit activation and the effect of each load on such a limit. The wind farm control scheme includes voltage control, and the farm is represented as a constant current at its limit. A criterion to select the critical bus bar, based on the generator transformation coefficients, is presented.

  19. Self-Motion Perception: Assessment by Real-Time Computer Generated Animations

    Science.gov (United States)

    Parker, Donald E.

    1999-01-01

    Our overall goal is to develop materials and procedures for assessing vestibular contributions to spatial cognition. The specific objective of the research described in this paper is to evaluate computer-generated animations as potential tools for studying self-orientation and self-motion perception. Specific questions addressed in this study included the following. First, does a non-verbal perceptual reporting procedure using real-time animations improve assessment of spatial orientation? Are reports reliable? Second, do reports confirm expectations based on stimuli to vestibular apparatus? Third, can reliable reports be obtained when self-motion description vocabulary training is omitted?

  20. Fluorescent real-time quantitative measurements of intracellular peroxynitrite generation and inhibition.

    Science.gov (United States)

    Luo, Zhen; Zhao, Qin; Liu, Jixiang; Liao, Jinfang; Peng, Ruogu; Xi, Yunting; Diwu, Zhenjun

    2017-03-01

    Peroxynitrite (ONOO-), a strong oxidant species, is produced by the reaction of nitric oxide (NO) and superoxide (O2·-) radicals. It plays an important role as a biological regulator in a number of physiological and pathological processes. In this study, we developed fluorescence-based real-time quantitative measurements to detect intracellular ONOO-. The probe DAX-J2 PON Green showed high selectivity toward ONOO- over other competing species, and has been successfully applied in microplate reader and flow cytometer assays to quantitatively measure endogenous ONOO- production. Moreover, the results demonstrated the inhibitory effects of curcumin on intracellular ONOO- generation. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Modified mean generation time parameter in the neutron point kinetics equations

    International Nuclear Information System (INIS)

    Diniz, Rodrigo C.; Gonçalves, Alessandro C.; Rosa, Felipe S.S.

    2017-01-01

    This paper proposes an approximation for the modified point kinetics equations proposed by NUNES et. al, 2015, through the adjustment of a kinetic parameter. This approximation consists of analyzing the terms of the modified point kinetics equations in order to identify the least important ones for the solution, resulting in a modification of the mean generation time parameter that incorporates all influences of the additional terms of the modified kinetics. This approximation is applied on the inverse kinetics, to compare the results with the inverse kinetics from the modified kinetics in order to validate the proposed model. (author)

  2. Real-time dynamic PC image generation techniques for high performance and high dynamic range fidelity

    Science.gov (United States)

    Bunfield, Dennis H.; Trimble, Darian E.; Fronckowiak, Thomas, Jr.; Ballard, Gary; Morris, Joesph

    2008-04-01

    AMRDEC has developed and implemented new techniques for rendering real-time 32-bit floating point energy-conserved dynamic scenes using commercial-off-the-shelf (COTS) Personal Computer (PC) based hardware and high performance nVidia Graphics Processing Units (GPU). The AMRDEC IGStudio rendering framework with the real-time Joint Scientific Image Generator (JSIG) core has been integrated into numerous AMRDEC Hardware-in-the-loop (HWIL) facilities, successfully replacing the lower fidelity legacy SGI hardware and software. JSIG uses high dynamic range unnormalized radiometric 32-bit floating point rendering through the use of GPU frame buffer objects (FBOs). A high performance nested zoom anti-aliasing (NZAA) technique was developed to address performance and geometric errors of past zoom anti-aliasing (ZAA) implementations. The NZAA capability for multi-object and occluded object representations includes: cluster ZAA, object ZAA, sub-object ZAA, and point source generation for unresolved objects. This technique has an optimal 128x128 pixel asymmetrical field-of-view zoom. The current NZAA capability supports up to 8 objects in real-time with a near future capability of increasing to a theoretical 128 objects in real-time. JSIG performs other dynamic entity effects which are applied in vertex and fragment shaders. These effects include floating point dynamic signature application, dynamic model ablation heating models, and per-material thermal emissivity rolloff interpolated on a per-pixel zoomed window basis. JSIG additionally performs full scene per-pixel effects in a post render process. These effects include real-time convolutions, optical scene corrections, per-frame calibrations, and energy distribution blur used to compensate for projector element energy limitations.

  3. Validity of Real-Time Data Generated by a Wearable Microtechnology Device.

    Science.gov (United States)

    Weaving, Dan; Whitehead, Sarah; Till, Kevin; Jones, Ben

    2017-10-01

    The purpose of this study was to investigate the validity of global positioning system (GPS) and micro-electrical-mechanical-system (MEMS) data generated in real time through a dedicated receiver. Postsession data acted as the criterion, as they are used to plan the volume and intensity of future training and are downloaded directly from the device. Twenty-five professional rugby league players completed 2 training sessions wearing an MEMS device (Catapult S5, firmware version: 5.27). During sessions, real-time data were collected through the manufacturer receiver and dedicated software (Openfield v1.14), which was positioned outdoors at the same location for every session. The GPS variables included total-, low- (0-3 m·s⁻¹), moderate- (3.1-5 m·s⁻¹), high- (5.1-7 m·s⁻¹), and very high-speed (>7.1 m·s⁻¹) distances. Micro-electrical-mechanical-system data included total session PlayerLoad. When compared with postsession data, mean bias for total-, low-, moderate-, high-, and very high-speed distances was trivial in every case, with the typical error of the estimate (TEE) small, small, trivial, trivial, and small, respectively. Pearson correlation coefficients for total-, low-, moderate-, high-, and very high-speed distances were nearly perfect, nearly perfect, perfect, perfect, and nearly perfect, respectively. For PlayerLoad, mean bias was trivial, whereas the TEE was moderate and the correlation nearly perfect. Practitioners should be confident that, when interpreting real-time speed-derived metrics, the data generated in real time are comparable with those downloaded directly from the device postsession. However, practitioners should refrain from interpreting accelerometer-derived data (i.e., PlayerLoad), or should acknowledge the moderate error associated with this real-time measure.
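    The comparison statistics named above (mean bias, typical error of the estimate, and Pearson's r) can be sketched in a few lines. This is an illustrative calculation, not the study's analysis code; here the TEE is taken simply as the standard deviation of the real-time-minus-criterion differences, and the function name and inputs are hypothetical.

```python
import math

def validity_stats(real_time, post_session):
    """Compare real-time estimates against postsession criterion values.

    Returns mean bias (real-time minus criterion), a typical error of the
    estimate (here the SD of the differences), and Pearson's r.
    """
    n = len(real_time)
    diffs = [rt - ps for rt, ps in zip(real_time, post_session)]
    bias = sum(diffs) / n
    tee = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    mx = sum(real_time) / n
    my = sum(post_session) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(real_time, post_session))
    sxx = sum((x - mx) ** 2 for x in real_time)
    syy = sum((y - my) ** 2 for y in post_session)
    r = sxy / math.sqrt(sxx * syy)
    return bias, tee, r
```

    A constant offset between the two systems shows up entirely in the bias term (TEE of zero, r of one), which is why bias and TEE are reported separately.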

  4. Real-time transient stabilization and voltage regulation of power generators with unknown mechanical power input

    International Nuclear Information System (INIS)

    Kenne, Godpromesse; Goma, Raphael; Nkwawo, Homere; Lamnabhi-Lagarrigue, Francoise; Arzande, Amir; Vannier, Jean Claude

    2010-01-01

    A nonlinear adaptive excitation controller is proposed to enhance the transient stability and voltage regulation of synchronous generators with unknown power angle and mechanical power input. The proposed method is based on a standard third-order model of a synchronous generator, which requires only the physically available measurements of relative angular speed, active electric power, infinite-bus voltage and generator terminal voltage. The operating conditions are computed online using these measurements, the terminal voltage reference value and an estimate of the mechanical power input. The proposed design is therefore capable of providing satisfactory voltage regulation in the presence of unknown variations of the power system operating conditions. Using the concept of sliding mode equivalent control techniques, a robust decentralized adaptive controller is obtained which ensures the exponential convergence of the outputs to the desired values. Real-time experimental results are reported, comparing the performance of the proposed adaptive nonlinear control scheme with that of a conventional AVR/PSS controller. The main positive features of the proposed approach are the simplicity of the overall adaptive control scheme and its robustness to line impedance variation, including critical unbalanced operating conditions and temporary turbine faults.

  5. Interactive Light Stimulus Generation with High Performance Real-Time Image Processing and Simple Scripting

    Directory of Open Access Journals (Sweden)

    László Szécsi

    2017-12-01

    Full Text Available Light stimulation with precise and complex spatial and temporal modulation is demanded by a series of research fields like visual neuroscience, optogenetics, ophthalmology, and visual psychophysics. We developed a user-friendly and flexible stimulus-generating framework (GEARS: GPU-based Eye And Retina Stimulation Software), which offers access to GPU computing power, and allows interactive modification of stimulus parameters during experiments. Furthermore, it has built-in support for driving external equipment, as well as for synchronization tasks, via USB ports. The use of GEARS does not require elaborate programming skills. The necessary scripting is visually aided by an intuitive interface, while the details of the underlying software and hardware components remain hidden. Internally, the software is a C++/Python hybrid using OpenGL graphics. Computations are performed on the GPU, and are defined in the GLSL shading language. However, all GPU settings, including the GPU shader programs, are automatically generated by GEARS. This is configured through a method encountered in game programming, which allows high flexibility: stimuli are straightforwardly composed using a broad library of basic components. Stimulus rendering is implemented solely in C++, therefore intermediary libraries for interfacing could be omitted. This enables the program to perform computationally demanding tasks like en-masse random number generation or real-time image processing by local and global operations.

  6. In the time of significant generational diversity - surgical leadership must step up!

    Science.gov (United States)

    Money, Samuel R; O'Donnell, Mark E; Gray, Richard J

    2014-02-01

    The diverse attitudes and motivations of surgeons and surgical trainees within different age groups present an important challenge for surgical leaders and educators. These challenges to surgical leadership are not unique, and other industries have likewise needed to grapple with how best to manage these various age groups. The authors will herein explore management and leadership for surgeons in a time of age diversity, define generational variations within "Baby-Boomer", "Generation X" and "Generation Y" populations, and identify work ethos concepts amongst these three groups. The surgical community must understand and embrace these concepts in order to continue to attract a stellar pool of applicants from medical school. By not accepting the changing attitudes and motivations of young trainees and medical students, we may disenfranchise a high percentage of potential future surgeons. Surgical training programs will fill, but will they contain the highest quality trainees? Copyright © 2013 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  7. Real-Time Extended Interface Automata for Software Testing Cases Generation

    Directory of Open Access Journals (Sweden)

    Shunkun Yang

    2014-01-01

    Full Text Available Testing and verification of the interface between software components are particularly important due to the large number of complex interactions; traditional modeling languages fall short in describing temporal information and in controlling software testing inputs. This paper presents real-time extended interface automata (RTEIA), which add clearer and more detailed temporal information description through the use of time words. We also establish an input interface automaton for every input, in order to handle input control and interface coverage flexibly when applied in the software testing field. Detailed definitions of the RTEIA and the test-case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system.

  8. Effects of electrolysis time and electric potential on chlorine generation of electrolyzed deep ocean water.

    Science.gov (United States)

    Hsu, Guoo-Shyng Wang; Lu, Yi-Fa; Hsu, Shun-Yao

    2017-10-01

    Electrolyzed water is a sustainable disinfectant, which can comply with food safety regulations and is environmentally friendly. A two-factor central composite design was adopted for studying the effects of electrolysis time and electric potential on the chlorine generation efficiency of electrolyzed deep ocean water (DOW). DOW was electrolyzed in a glass electrolyzing cell equipped with a platinum-plated titanium anode and cathode. The results showed that chlorine concentration reached a maximum during the batch process. Prolonged electrolysis reduced the chlorine concentration in the electrolyte and was detrimental to electrolysis efficiency, especially under high electric potential conditions. Therefore, the optimal choice of electrolysis time depends on the electrolyzable chloride in the DOW and the cell potential adopted for electrolysis. The higher the electric potential, the faster the chlorine level reaches its maximum, but the lower the electrical efficiency will be. Copyright © 2016. Published by Elsevier B.V.
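    A toy first-order kinetic model reproduces the qualitative behaviour described: chlorine accumulates while chloride is abundant, peaks, and then declines under prolonged electrolysis as loss reactions outpace generation. The model structure and rate constants below are illustrative assumptions, not values fitted to the paper's data.

```python
def simulate_chlorine(cl0=1.0, k_gen=0.5, k_loss=0.2, dt=0.01, t_end=20.0):
    """Euler-integrate a toy batch-electrolysis model.

    Chloride (cl) is consumed to generate chlorine (c), which itself
    decays; the chlorine curve therefore rises to a maximum and then
    falls with prolonged electrolysis. Returns a list of (t, c) pairs.
    """
    cl, c = cl0, 0.0
    series = []
    t = 0.0
    while t <= t_end:
        series.append((t, c))
        dcl = -k_gen * cl * dt             # chloride oxidized away
        dc = (k_gen * cl - k_loss * c) * dt  # generation minus losses
        cl += dcl
        c += dc
        t += dt
    return series
```

    With these constants the peak falls near t ≈ ln(k_gen/k_loss)/(k_gen − k_loss) ≈ 3.05, after which continuing electrolysis only wastes charge, mirroring the paper's conclusion about choosing electrolysis time.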

  9. Real-time extended interface automata for software testing cases generation.

    Science.gov (United States)

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interface between software components are particularly important due to the large number of complex interactions; traditional modeling languages fall short in describing temporal information and in controlling software testing inputs. This paper presents real-time extended interface automata (RTEIA), which add clearer and more detailed temporal information description through the use of time words. We also establish an input interface automaton for every input, in order to handle input control and interface coverage flexibly when applied in the software testing field. Detailed definitions of the RTEIA and the test-case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system.
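    As a rough sketch of the idea (not the RTEIA algorithm itself, whose definitions are given in the paper), a test-case generator can enumerate bounded paths through a toy interface automaton whose transitions carry time-interval annotations. The state names, inputs, and time bounds below are hypothetical.

```python
def generate_test_cases(transitions, start, max_depth=3):
    """Enumerate time-annotated input sequences through a toy automaton.

    transitions maps state -> [(input, (t_min, t_max), next_state)].
    Every non-empty prefix of a path up to max_depth is emitted as a
    candidate test case of (input, t_min, t_max) steps.
    """
    cases = []

    def dfs(state, path):
        if path:
            cases.append(list(path))
        if len(path) >= max_depth:
            return
        for inp, (lo, hi), nxt in transitions.get(state, []):
            path.append((inp, lo, hi))
            dfs(nxt, path)
            path.pop()

    dfs(start, [])
    return cases
```

    For a hypothetical two-state braking interface (`idle` ⇄ `braking`) with depth 2, this yields the single-step and round-trip test sequences with their timing windows attached.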

  10. Real-time Trading Strategies for Proactive Distribution Company with Distributed Generation and Demand Response

    DEFF Research Database (Denmark)

    Wang, Qi

    Distributed energy resources (DERs), such as distributed generation (DG) and demand response (DR), have been recognized worldwide as valuable resources. High integration of DG and DR in the distribution network inspires a potential deregulated environment for the distribution company (DISCO) directly procuring capacities from local DG and DR. In this situation, a hierarchical market structure is achieved comprising the transmission-level (TL) and distribution-level (DL) markets. Focusing on the real-time process, as the interface actor, the DISCO's behavior covers downwardly procuring DL DG and DR resources, and upwardly trading in the TL real-time market, resulting in a proactive manner. The DL aggregator (DA) is defined to manage these small-scale and dispersed DGs and DRs. A methodology is proposed in this thesis for a proactive DISCO (PDISCO) to strategically trade with DAs...

  11. Probabilistic analysis of degradation incubation time of steam generator tubing materials

    International Nuclear Information System (INIS)

    Pandey, M.D.; Jyrkama, M.I.; Lu, Y.; Chi, L.

    2012-01-01

    The prediction of the degradation-free lifetime of steam generator (SG) tubing material is an important step in life cycle management and in the decision to replace steam generators during the refurbishment of a nuclear station. Therefore, an extensive experimental research program has been undertaken by the Canadian nuclear industry to investigate the degradation of widely used SG tubing alloys, namely Alloy 600 TT, Alloy 690 TT, and Alloy 800. Corrosion-related degradation of passive metals, such as pitting, crevice corrosion and stress corrosion cracking (SCC), is assumed to start with the breakdown of the passive film at the tube-environment interface, which is characterized by the incubation time for passivity breakdown and then by the degradation growth rate; both are influenced by the chemical environment and coolant temperature. Since the incubation time and growth rate exhibit significant variability in the laboratory tests used to simulate these degradation processes, the use of probabilistic modeling is warranted. A pit is initiated with the breakdown of the passive film on the SG tubing surface. Upon exposure to aggressive environments, pitting corrosion may not initiate immediately, or may initiate and then re-passivate. The time required to initiate pitting corrosion is called the pitting incubation time, and it can be used to characterize the corrosion resistance of a material under specific test conditions. Pitting may be the precursor to other corrosion degradation mechanisms, such as environmentally-assisted cracking. This paper provides an overview of the results of the first stage of the experimental program, in which samples of Alloy 600 TT, Alloy 690 TT, and Alloy 800 were tested under various temperatures and potentials and simulated crevice environments. The testing environment was chosen to represent layup, startup, and full operating conditions of the steam generators. Degradation incubation times for over 80 samples were

  12. Real-time Walking Pattern Generation for a Biped Robot with Hybrid CPG-ZMP Algorithm

    Directory of Open Access Journals (Sweden)

    Bin He

    2014-10-01

    Full Text Available Biped robots have better mobility than conventional wheeled robots. The bio-inspired method based on a central pattern generator (CPG) can be used to control biped robot walking in a manner like human beings. However, to achieve stable locomotion, it is difficult to modulate the parameters for the neural networks to coordinate every degree of freedom of the walking robot. The zero moment point (ZMP) method is very popular for the stability control of biped robot walking. However, the reference trajectories have low energy efficiency, lack naturalness and need significant offline calculation. This paper presents a new method for biped real-time walking generation using a hybrid CPG-ZMP control algorithm. The method can realize a stable walking pattern by combining the ZMP criterion with rhythmic motion control. The CPG component is designed to generate the desired motion for each robot joint, which is modulated by phase resetting according to foot contact information. By introducing the ZMP location, the activity of the CPG output signal is adjusted to coordinate the limbs’ motion and allow the robot to maintain balance during the process of locomotion. The numerical simulation results show that, compared with the CPG method, the new hybrid CPG-ZMP algorithm can enhance the robustness of the CPG parameters and improve the stability of the robot. In addition, the proposed algorithm is more energy efficient than the ZMP method. The results also demonstrate that the control system can generate an adaptive walking pattern through interactions between the robot, the CPG and the environment.
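    The phase-resetting rule can be illustrated with a minimal two-oscillator sketch: two Kuramoto-style leg oscillators coupled toward anti-phase, with a foot-contact event resetting that leg's phase to stance onset. This is a simplified stand-in for the paper's CPG network, not its actual equations; all parameter values are assumptions.

```python
import math

def cpg_step(phases, omega, coupling, dt, contact=(False, False)):
    """Advance two anti-phase-coupled leg oscillators by one Euler step.

    phases   -- current phases of the two legs (rad)
    omega    -- intrinsic stepping frequency (rad/s)
    coupling -- gain pulling the legs toward anti-phase (pi apart)
    contact  -- per-leg foot-contact flags; contact resets that leg's
                phase to 0, the phase-resetting rule described above
    """
    new = []
    for i, phi in enumerate(phases):
        j = 1 - i
        dphi = omega + coupling * math.sin(phases[j] - phi - math.pi)
        phi = (phi + dphi * dt) % (2.0 * math.pi)
        if contact[i]:
            phi = 0.0
        new.append(phi)
    return tuple(new)
```

    Left to itself, the pair converges to a phase difference of pi (alternating legs); a contact event snaps the corresponding leg back to stance onset so the rhythm re-synchronizes with the actual gait.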

  13. A Decentralized Framework for Real-Time Energy Trading in Distribution Networks with Load and Generation Uncertainty

    OpenAIRE

    Bahrami, Shahab; Amini, M. Hadi

    2017-01-01

    The proliferation of small-scale renewable generators and price-responsive loads makes it a challenge for distribution network operators (DNOs) to schedule the controllable loads of the load aggregators and the generation of the generators in real-time. Additionally, the high computational burden and violation of the entities' (i.e., load aggregators' and generators') privacy make a centralized framework impractical. In this paper, we propose a decentralized energy trading algorithm that can ...

  14. Generational and time period differences in American adolescents' religious orientation, 1966-2014.

    Directory of Open Access Journals (Sweden)

    Jean M Twenge

    Full Text Available In four large, nationally representative surveys (N = 11.2 million), American adolescents and emerging adults in the 2010s (Millennials) were significantly less religious than previous generations (Boomers, Generation X) at the same age. The data are from the Monitoring the Future studies of 12th graders (1976-2013), 8th and 10th graders (1991-2013), and the American Freshman survey of entering college students (1966-2014). Although the majority of adolescents and emerging adults are still religiously involved, twice as many 12th graders and college students, and 20%-40% more 8th and 10th graders, never attend religious services. Twice as many 12th graders and entering college students in the 2010s (vs. the 1960s-70s) give their religious affiliation as "none," as do 40%-50% more 8th and 10th graders. Recent birth cohorts report less approval of religious organizations, are less likely to say that religion is important in their lives, report being less spiritual, and spend less time praying or meditating. Thus, declines in religious orientation reach beyond affiliation to religious participation and religiosity, suggesting a movement toward secularism among a growing minority. The declines are larger among girls, Whites, lower-SES individuals, and in the Northeastern U.S., very small among Blacks, and non-existent among political conservatives. Religious affiliation is lower in years with more income inequality, higher median family income, higher materialism, more positive self-views, and lower social support. Overall, these results suggest that the lower religious orientation of Millennials is due to time period or generation, and not to age.

  15. Generational and Time Period Differences in American Adolescents’ Religious Orientation, 1966–2014

    Science.gov (United States)

    Twenge, Jean M.; Exline, Julie J.; Grubbs, Joshua B.; Sastry, Ramya; Campbell, W. Keith

    2015-01-01

    In four large, nationally representative surveys (N = 11.2 million), American adolescents and emerging adults in the 2010s (Millennials) were significantly less religious than previous generations (Boomers, Generation X) at the same age. The data are from the Monitoring the Future studies of 12th graders (1976–2013), 8th and 10th graders (1991–2013), and the American Freshman survey of entering college students (1966–2014). Although the majority of adolescents and emerging adults are still religiously involved, twice as many 12th graders and college students, and 20%–40% more 8th and 10th graders, never attend religious services. Twice as many 12th graders and entering college students in the 2010s (vs. the 1960s–70s) give their religious affiliation as “none,” as do 40%–50% more 8th and 10th graders. Recent birth cohorts report less approval of religious organizations, are less likely to say that religion is important in their lives, report being less spiritual, and spend less time praying or meditating. Thus, declines in religious orientation reach beyond affiliation to religious participation and religiosity, suggesting a movement toward secularism among a growing minority. The declines are larger among girls, Whites, lower-SES individuals, and in the Northeastern U.S., very small among Blacks, and non-existent among political conservatives. Religious affiliation is lower in years with more income inequality, higher median family income, higher materialism, more positive self-views, and lower social support. Overall, these results suggest that the lower religious orientation of Millennials is due to time period or generation, and not to age. PMID:25962174

  16. Generational and time period differences in American adolescents' religious orientation, 1966-2014.

    Science.gov (United States)

    Twenge, Jean M; Exline, Julie J; Grubbs, Joshua B; Sastry, Ramya; Campbell, W Keith

    2015-01-01

    In four large, nationally representative surveys (N = 11.2 million), American adolescents and emerging adults in the 2010s (Millennials) were significantly less religious than previous generations (Boomers, Generation X) at the same age. The data are from the Monitoring the Future studies of 12th graders (1976-2013), 8th and 10th graders (1991-2013), and the American Freshman survey of entering college students (1966-2014). Although the majority of adolescents and emerging adults are still religiously involved, twice as many 12th graders and college students, and 20%-40% more 8th and 10th graders, never attend religious services. Twice as many 12th graders and entering college students in the 2010s (vs. the 1960s-70s) give their religious affiliation as "none," as do 40%-50% more 8th and 10th graders. Recent birth cohorts report less approval of religious organizations, are less likely to say that religion is important in their lives, report being less spiritual, and spend less time praying or meditating. Thus, declines in religious orientation reach beyond affiliation to religious participation and religiosity, suggesting a movement toward secularism among a growing minority. The declines are larger among girls, Whites, lower-SES individuals, and in the Northeastern U.S., very small among Blacks, and non-existent among political conservatives. Religious affiliation is lower in years with more income inequality, higher median family income, higher materialism, more positive self-views, and lower social support. Overall, these results suggest that the lower religious orientation of Millennials is due to time period or generation, and not to age.

  17. Wolf (Canis lupus) Generation Time and Proportion of Current Breeding Females by Age.

    Directory of Open Access Journals (Sweden)

    L David Mech

    Full Text Available Information is sparse about aspects of female wolf (Canis lupus) breeding in the wild, including age of first reproduction, mean age of primiparity, generation time, and proportion of each age that breeds in any given year. We studied these subjects in 86 wolves (113 captures) in the Superior National Forest (SNF), Minnesota (MN), during 1972-2013, where wolves were legally protected for most of the period, and in 159 harvested wolves from throughout MN wolf range during 2012-2014. Breeding status of SNF wolves was assessed via nipple measurements, and that of wolves from throughout MN wolf range, by placental scars. In the SNF, proportions of currently breeding females (those breeding in the year sampled) ranged from 19% at age 2 to 80% at age 5, and from throughout wolf range, from 33% at age 2 to 100% at age 7. Excluding pups and yearlings, only 33% to 36% of SNF females and 58% of females from throughout MN wolf range bred in any given year. Generation time for SNF wolves was 4.3 years and for MN wolf range, 4.7 years. These findings will be useful in modeling wolf population dynamics and in wolf genetic and dog-domestication studies.

  18. Wolf (Canis lupus) Generation Time and Proportion of Current Breeding Females by Age.

    Science.gov (United States)

    Mech, L David; Barber-Meyer, Shannon M; Erb, John

    2016-01-01

    Information is sparse about aspects of female wolf (Canis lupus) breeding in the wild, including age of first reproduction, mean age of primiparity, generation time, and proportion of each age that breeds in any given year. We studied these subjects in 86 wolves (113 captures) in the Superior National Forest (SNF), Minnesota (MN), during 1972-2013, where wolves were legally protected for most of the period, and in 159 harvested wolves from throughout MN wolf range during 2012-2014. Breeding status of SNF wolves was assessed via nipple measurements, and that of wolves from throughout MN wolf range, by placental scars. In the SNF, proportions of currently breeding females (those breeding in the year sampled) ranged from 19% at age 2 to 80% at age 5, and from throughout wolf range, from 33% at age 2 to 100% at age 7. Excluding pups and yearlings, only 33% to 36% of SNF females and 58% of females from throughout MN wolf range bred in any given year. Generation time for SNF wolves was 4.3 years and for MN wolf range, 4.7 years. These findings will be useful in modeling wolf population dynamics and in wolf genetic and dog-domestication studies.

  19. Wolf (Canis lupus) generation time and proportion of current breeding females by age

    Science.gov (United States)

    Mech, L. David; Barber-Meyer, Shannon M.; Erb, John

    2016-01-01

    Information is sparse about aspects of female wolf (Canis lupus) breeding in the wild, including age of first reproduction, mean age of primiparity, generation time, and proportion of each age that breeds in any given year. We studied these subjects in 86 wolves (113 captures) in the Superior National Forest (SNF), Minnesota (MN), during 1972–2013, where wolves were legally protected for most of the period, and in 159 harvested wolves from throughout MN wolf range during 2012–2014. Breeding status of SNF wolves was assessed via nipple measurements, and that of wolves from throughout MN wolf range, by placental scars. In the SNF, proportions of currently breeding females (those breeding in the year sampled) ranged from 19% at age 2 to 80% at age 5, and from throughout wolf range, from 33% at age 2 to 100% at age 7. Excluding pups and yearlings, only 33% to 36% of SNF females and 58% of females from throughout MN wolf range bred in any given year. Generation time for SNF wolves was 4.3 years and for MN wolf range, 4.7 years. These findings will be useful in modeling wolf population dynamics and in wolf genetic and dog-domestication studies.
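    Under the simplifying assumption that litter sizes are similar across ages, generation time can be approximated as the mean age of currently breeding females, weighting each age class by its count and its breeding proportion. The function and the sample numbers used below are illustrative, not the study's estimator or data.

```python
def mean_age_of_breeding(counts_by_age, breeding_prop_by_age):
    """Weighted mean age of currently breeding females.

    counts_by_age        -- {age: number of females sampled at that age}
    breeding_prop_by_age -- {age: proportion of that age currently breeding}
    A simple proxy for generation time when fecundity per breeder is
    roughly constant across ages.
    """
    num = sum(age * n * breeding_prop_by_age[age]
              for age, n in counts_by_age.items())
    den = sum(n * breeding_prop_by_age[age]
              for age, n in counts_by_age.items())
    return num / den
```

    With hypothetical counts of ten 2-year-olds (19% breeding) and ten 5-year-olds (80% breeding), the weighted mean lands near 4.4 years, in the same range as the 4.3-4.7-year estimates reported above.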

  20. Hypothesis Validity of Clinical Research.

    Science.gov (United States)

    Wampold, Bruce E.; And Others

    1990-01-01

    Describes hypothesis validity as extent to which research results reflect theoretically derived predictions about relations between or among constructs. Discusses role of hypotheses in theory testing. Presents four threats to hypothesis validity: (1) inconsequential research hypotheses; (2) ambiguous research hypotheses; (3) noncongruence of…

  1. The pyrophilic primate hypothesis.

    Science.gov (United States)

    Parker, Christopher H; Keefe, Earl R; Herzog, Nicole M; O'connell, James F; Hawkes, Kristen

    2016-01-01

    Members of genus Homo are the only animals known to create and control fire. The adaptive significance of this unique behavior is broadly recognized, but the steps by which our ancestors evolved pyrotechnic abilities remain unknown. Many hypotheses attempting to answer this question attribute hominin fire to serendipitous, even accidental, discovery. Using recent paleoenvironmental reconstructions, we present an alternative scenario in which, 2 to 3 million years ago in tropical Africa, human fire dependence was the result of adapting to progressively fire-prone environments. The extreme and rapid fluctuations between closed canopy forests, woodland, and grasslands that occurred in tropical Africa during that time, in conjunction with reductions in atmospheric carbon dioxide levels, changed the fire regime of the region, increasing the occurrence of natural fires. We use models from optimal foraging theory to hypothesize benefits that this fire-altered landscape provided to ancestral hominins and link these benefits to steps that transformed our ancestors into a genus of active pyrophiles whose dependence on fire for survival contributed to its rapid expansion out of Africa. © 2016 Wiley Periodicals, Inc.

  2. Generator estimation of Markov jump processes based on incomplete observations nonequidistant in time

    Science.gov (United States)

    Metzner, Philipp; Horenko, Illia; Schütte, Christof

    2007-12-01

    Markov jump processes can be used to model the effective dynamics of observables in applications ranging from molecular dynamics to finance. In this paper we present a method which allows the inverse modeling of Markov jump processes based on incomplete observations in time: we consider the case of a given time series of the discretely observed jump process. We show how to compute efficiently the maximum likelihood estimator of its infinitesimal generator and demonstrate in detail that the method allows us to handle observations nonequidistant in time. The method is based on the work of Bladt and Sørensen [J. R. Stat. Soc. Ser. B (Stat. Methodol.) 67, 395 (2005)] but scales much more favorably with the length of the time series and with the dimension and size of the state space of the jump process. We illustrate its performance on a toy problem as well as on data arising from simulations of the biochemical kinetics of a genetic toggle switch.
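    For the simpler case of a completely observed jump process (the discretely observed case treated in the paper requires an iterative procedure built on top of this), the maximum likelihood estimator of the generator has the closed form G_ij = N_ij / R_i, where N_ij counts observed i-to-j jumps and R_i is the total holding time in state i. A minimal sketch, with a hypothetical input format:

```python
def generator_mle(path):
    """MLE of the infinitesimal generator from a fully observed jump process.

    path -- list of (state, holding_time) pairs in visit order.
    Off-diagonal entries are jump counts divided by total holding time in
    the origin state; diagonals are set so every row sums to zero.
    Assumes each state in `path` is visited with positive holding time.
    """
    states = sorted({s for s, _ in path})
    idx = {s: k for k, s in enumerate(states)}
    n = len(states)
    jumps = [[0.0] * n for _ in range(n)]
    hold = [0.0] * n
    for (s, tau), (s_next, _) in zip(path, path[1:]):
        hold[idx[s]] += tau
        jumps[idx[s]][idx[s_next]] += 1.0
    hold[idx[path[-1][0]]] += path[-1][1]  # holding time of final visit
    G = [[jumps[i][j] / hold[i] if i != j else 0.0 for j in range(n)]
         for i in range(n)]
    for i in range(n):
        G[i][i] = -sum(G[i])
    return states, G
```

    This complete-data estimator is the building block that discrete-observation methods iterate over, replacing the unobserved jump counts and holding times with their conditional expectations.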

  3. Multiple hypothesis tracking for the cyber domain

    Science.gov (United States)

    Schwoegler, Stefan; Blackman, Sam; Holsopple, Jared; Hirsch, Michael J.

    2011-09-01

    This paper discusses how methods used for conventional multiple hypothesis tracking (MHT) can be extended to domain-agnostic tracking of entities from non-kinematic constraints, such as those imposed by cyber attacks, in a potentially dense false-alarm background. MHT is widely recognized as the premier method for avoiding the corruption of tracks with spurious data in the kinematic domain, but it has not been extensively applied to other problem domains. The traditional approach is to tightly couple track maintenance (prediction, gating, filtering, probabilistic pruning, and target confirmation) with hypothesis management (clustering, incompatibility maintenance, hypothesis formation, and N-association pruning). However, by separating the domain-specific track maintenance portion from the domain-agnostic hypothesis management piece, we can begin to apply the wealth of knowledge gained from ground and air tracking solutions to the cyber (and other) domains. These realizations led to the creation of Raytheon's Multiple Hypothesis Extensible Tracking Architecture (MHETA). In this paper, we showcase MHETA for the cyber domain, plugging in a well-established method, CUBRC's INFormation Engine for Real-time Decision making (INFERD), for the association portion of the MHT. The result is a CyberMHT. We demonstrate the power of MHETA-INFERD using simulated data. Using metrics from both the tracking and cyber domains, we show that, while no tracker is perfect, MHETA-INFERD captures advanced non-kinematic tracks in an automated way, performs better than non-MHT approaches, and decreases analyst response time to cyber threats.

  4. Memory in astrocytes: a hypothesis

    Directory of Open Access Journals (Sweden)

    Caudle Robert M

    2006-01-01

    Full Text Available Background: Recent work has indicated an increasingly complex role for astrocytes in the central nervous system. Astrocytes are now known to exchange information with neurons at synaptic junctions and to alter the information processing capabilities of the neurons. As an extension of this trend a hypothesis was proposed that astrocytes function to store information. To explore this idea the ion channels in biological membranes were compared to models known as cellular automata. These comparisons were made to test the hypothesis that ion channels in the membranes of astrocytes form a dynamic information storage device. Results: Two dimensional cellular automata were found to behave similarly to ion channels in a membrane when they function at the boundary between order and chaos. The length of time information is stored in this class of cellular automata is exponentially related to the number of units. Therefore the length of time biological ion channels store information was plotted versus the estimated number of ion channels in the tissue. This analysis indicates that there is an exponential relationship between memory and the number of ion channels. Extrapolation of this relationship to the estimated number of ion channels in the astrocytes of a human brain indicates that memory can be stored in this system for an entire life span. Interestingly, this information is not affixed to any physical structure, but is stored as an organization of the activity of the ion channels. Further analysis of two dimensional cellular automata also demonstrates that these systems have both associative and temporal memory capabilities. Conclusion: It is concluded that astrocytes may serve as a dynamic information sink for neurons. The memory in the astrocytes is stored by organizing the activity of ion channels and is not associated with a physical location such as a synapse. In order for this form of memory to be of significant duration it is necessary
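    A concrete member of the family of two-dimensional cellular automata invoked here is Conway's Game of Life, where a small oscillating pattern stores information as persistent activity rather than as a fixed structure. This is an illustrative sketch, not the specific automaton analysed in the paper.

```python
from collections import Counter

def life_step(live):
    """One step of Conway's Game of Life on a set of (x, y) live cells.

    A dead cell with exactly 3 live neighbours is born; a live cell
    survives with 2 or 3 live neighbours. Everything else dies.
    """
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}
```

    A "blinker" (three cells in a row) oscillates with period 2, so the information "a blinker is here" persists indefinitely as a pattern of activity, with no single cell permanently holding it, which is the flavour of dynamic storage the hypothesis attributes to ion-channel activity.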

  5. Using electromagnetic conductivity imaging to generate time-lapse soil moisture estimates.

    Science.gov (United States)

    Huang, Jingyi; Scuderio, Elia; Corwin, Dennis; Triantafilis, John

    2015-04-01

    Irrigated agriculture is crucial to the agricultural productivity of the Moreno valley. To maintain profitability, irrigators will need to do more with less water, owing to competing demands from rapidly expanding urbanisation in southern California. In this regard, irrigators need to understand the spatial and temporal variation of soil moisture to discern inefficiencies. However, soil moisture is difficult to measure and monitor unless a large bank of soil sensors is installed at various depths in the profile. To add value to the limited amount of information, geophysical techniques such as direct current resistivity (DCR) arrays are used to develop electrical resistivity images (ERI). Whilst successful, the approach is time-consuming and labour-intensive. In this research we describe how equivalent data can be collected using a proximal sensing electromagnetic (EM) induction instrument (i.e. DUALEM-421) and inversion software (EM4Soil) to generate EM conductivity images (EMCI). Figure 1 shows the EMCI generated from DUALEM-421 data acquired on various days of a time-lapse experiment, including day a) 0, b) 1, c) 2, d) 3, e) 5, f) 7 and g) 11. We calibrate the estimates of true electrical conductivity (sigma - mS/m) against volumetric moisture content and show with good accuracy the spatial and temporal variation of soil moisture status over a 12-day period. The results show clearly that the pivot sprinkler irrigation system is effective at providing sufficient amounts of water to the top 0.5 m of a Lucerne crop (i.e. red shaded areas of high sigma). However, in some places faulty sprinklers are evident owing to the lack of wetting (i.e. blue shaded areas of low sigma). In addition, our approach shows clearly the effect the Lucerne crop has, over time, in drying the soil profile as it uses the soil moisture.
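The calibration step, regressing volumetric moisture content on the inverted true electrical conductivity (sigma), can be sketched as a simple least-squares fit; the sigma/theta pairs below are hypothetical values for illustration, not the study's data.

```python
import numpy as np

# Hypothetical calibration pairs: inverted conductivity sigma (mS/m) from the
# EMCI vs. volumetric moisture theta (m^3/m^3) at co-located soil sensors.
sigma = np.array([20.0, 35.0, 50.0, 65.0, 80.0])
theta = np.array([0.10, 0.17, 0.24, 0.31, 0.38])

# Least-squares line: theta = a * sigma + b
a, b = np.polyfit(sigma, theta, 1)

def moisture_from_sigma(s):
    """Convert an EMCI conductivity estimate to volumetric moisture."""
    return a * s + b
```

Applying the fitted line to each cell of the inverted EMCI for each survey day then yields the time-lapse moisture estimates.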

  6. Time synchronization of new-generation BDS satellites using inter-satellite link measurements

    Science.gov (United States)

    Pan, Junyang; Hu, Xiaogong; Zhou, Shanshi; Tang, Chengpan; Guo, Rui; Zhu, Lingfeng; Tang, Guifeng; Hu, Guangming

    2018-01-01

    Autonomous satellite navigation is based on the ability of a Global Navigation Satellite System (GNSS), such as Beidou, to estimate orbits and clock parameters onboard satellites using Inter-Satellite Link (ISL) measurements instead of tracking data from a ground monitoring network. This paper focuses on the time synchronization of new-generation Beidou Navigation Satellite System (BDS) satellites equipped with an ISL payload. Two modes of Ka-band ISL measurements, Time Division Multiple Access (TDMA) mode and the continuous link mode, were used onboard these BDS satellites. Using a mathematical formulation for each measurement mode along with a derivation of the satellite clock offsets, geometric ranges from the dual one-way measurements were introduced. Then, pseudoranges and clock offsets were evaluated for the new-generation BDS satellites. The evaluation shows that the ranging accuracies of TDMA ISL and the continuous link are approximately 4 cm and 1 cm (root mean square, RMS), respectively. Both lead to ISL clock offset residuals of less than 0.3 ns (RMS). For further validation, time synchronization between these satellites and a ground control station maintaining the BDS system time (BDT) was conducted using L-band Two-way Satellite Time Frequency Transfer (TWSTFT). System errors in the ISL measurements were calibrated by comparing the derived clock offsets with the TWSTFT. The standard deviations of the estimated ISL system errors are less than 0.3 ns, and the calibrated ISL clock parameters are consistent with those of the L-band TWSTFT. For the regional BDS network, the addition of ISL measurements for medium Earth orbit (MEO) BDS satellites increased the clock tracking coverage by more than 40% for each orbital revolution. As a result, the clock prediction error for satellite M1S improved from 3.59 to 0.86 ns (RMS), and that for satellite M2S improved from 1.94 to 0.57 ns (RMS), a significant improvement by a factor of 3-4.
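The dual one-way separation of geometry and clock offset mentioned above reduces to a half-sum and half-difference of the two pseudoranges; the sketch below is idealized (aligned measurement epochs, hardware delays already calibrated) and is not the operational BDS processing.

```python
C = 299_792_458.0  # speed of light, m/s

def dual_one_way(p_ab, p_ba):
    """Split dual one-way ISL pseudoranges (in meters) into geometry and clock.

    p_ab: pseudorange measured at satellite B for the signal sent by A
    p_ba: pseudorange measured at satellite A for the signal sent by B
    Returns (geometric_range_m, clock_offset_s), offset = clock(B) - clock(A).
    """
    geometric_range = 0.5 * (p_ab + p_ba)   # clock terms cancel in the sum
    clock_offset = 0.5 * (p_ab - p_ba) / C  # geometry cancels in the difference
    return geometric_range, clock_offset
```

With p_ab = R + c(dt_B - dt_A) and p_ba = R + c(dt_A - dt_B), the sum isolates the range R and the difference isolates the relative clock offset, which is what makes onboard time synchronization possible without ground tracking.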

  7. Runoff Generation Mechanisms and Mean Transit Time in a High-Elevation Tropical Ecosystem

    Science.gov (United States)

    Mosquera, G.

    2015-12-01

    Runoff generation processes in tropical mountainous regions remain poorly understood, particularly in ecosystems above the tree line. Here, we provide insights into the processes dominating the ecohydrology of the tropical alpine biome (i.e., páramo) of the Zhurucay River Ecohydrological Observatory. The study site is located in south Ecuador between 3400 and 3900 m in elevation. We used a nested monitoring system with eight catchments (20-753 ha) to measure hydrometric data since December 2010. Biweekly samples of rainfall, streamflow, and soil water at low tension were collected for three years (May 2011-May 2014) and analyzed for water stable isotopes. We conducted an isotopic characterization of rainfall, streamflow, and soil waters to investigate runoff generation. These data were also integrated into a lumped model to estimate the mean transit time (MTT) and to investigate landscape features that control its variability. The isotopic characterization showed that the water stored in the shallow organic horizon of the Histosol soils (Andean wetlands) located near the streams is the major contributor of water to the streams year-round, whereas the water draining through the hillslope soils, the Andosols, regulates discharge by recharging the wetlands at the valley bottoms. The MTT evaluation indicated relatively short MTTs (0.15-0.73 yr) linked to short subsurface flow paths of water. We also found evidence for topographic controls on the MTT variability. These results reveal that: 1) the ecohydrology of this ecosystem is dominated by shallow subsurface flow in the organic horizon of the soils and 2) the combination of the high storage capacity of the Andean wetlands and the slope of the catchments controls runoff generation and the high water regulation capacity of the ecosystem.
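A lumped-parameter transit-time model of the kind referenced above convolves the precipitation isotope input with an assumed transit-time distribution (TTD); the sketch below uses an exponential TTD, a common lumped choice, rather than the study's calibrated model.

```python
import numpy as np

def lumped_response(input_signal, mtt):
    """Convolve an input tracer signal with an exponential transit-time
    distribution whose mean transit time is `mtt` (in time steps)."""
    n = len(input_signal)
    tau = np.arange(n)
    ttd = np.exp(-tau / mtt) / mtt  # exponential TTD, discretized
    ttd /= ttd.sum()                # renormalize after truncation
    return np.convolve(input_signal, ttd)[:n]
```

A longer MTT damps the variability of the input signal more strongly; matching the observed damping of the streamflow isotope signal relative to rainfall is what constrains the MTT estimate.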

  8. Evaluation of 'period-generated' control laws for the time-optimal control of reactor power

    International Nuclear Information System (INIS)

    Bernard, J.A.

    1988-01-01

    Time-optimal control of neutronic power has recently been achieved by developing control laws that determine the actuator mechanism velocity necessary to produce a specified reactor period. These laws are designated the 'MIT-SNL Period-Generated Minimum Time Control Laws'. Relative to time-optimal response, they function by altering the rate of change of reactivity so that the instantaneous period is stepped from infinity to its minimum allowed value, held at that value until the desired power level is attained, and then stepped back to infinity. The results of a systematic evaluation of these laws are presented. The behavior of each term in the control laws is shown and the capability of these laws to properly control the reactor power is demonstrated. Factors affecting the implementation of these laws, such as the prompt neutron lifetime and the differential reactivity worth of the actuators, are discussed. Finally, the results of an experimental study in which these laws were used to adjust the power of the 5 MWt MIT Research Reactor are shown. The information presented should be of interest to those designing high-performance control systems for test, spacecraft, or, in certain instances, commercial reactors

  9. Magnitude and sign of long-range correlated time series: Decomposition and surrogate signal generation.

    Science.gov (United States)

    Gómez-Extremera, Manuel; Carpena, Pedro; Ivanov, Plamen Ch; Bernaola-Galván, Pedro A

    2016-04-01

    We systematically study the scaling properties of the magnitude and sign of the fluctuations in correlated time series, which is a simple and useful approach to distinguish between systems with different dynamical properties but the same linear correlations. First, we decompose artificial long-range power-law linearly correlated time series into magnitude and sign series derived from the consecutive increments in the original series, and we study their correlation properties. We find analytical expressions for the correlation exponent of the sign series as a function of the exponent of the original series. Such expressions are necessary for modeling surrogate time series with desired scaling properties. Next, we study linear and nonlinear correlation properties of series composed as products of independent magnitude and sign series. These surrogate series can be considered as a zero-order approximation to the analysis of the coupling of magnitude and sign in real data, a problem still open in many fields. We find analytical results for the scaling behavior of the composed series as a function of the correlation exponents of the magnitude and sign series used in the composition, and we determine the ranges of magnitude and sign correlation exponents leading to either single scaling or to crossover behaviors. Finally, we obtain how the linear and nonlinear properties of the composed series depend on the correlation exponents of their magnitude and sign series. Based on this information we propose a method to generate surrogate series with controlled correlation exponent and multifractal spectrum.
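The decomposition described above is straightforward to state in code; this is a minimal sketch of the magnitude/sign split of the increment series and the zero-order composed surrogate, not the authors' full scaling analysis.

```python
import numpy as np

def magnitude_and_sign(x):
    """Split a series into the magnitude and sign series of its increments."""
    dx = np.diff(np.asarray(x, dtype=float))
    return np.abs(dx), np.sign(dx)

def compose(magnitude, sign):
    """Zero-order surrogate: product of (independently generated) magnitude
    and sign series, as in the composed-series analysis."""
    return magnitude * sign
```

The correlation exponents of the two output series can then be estimated separately (e.g., by detrended fluctuation analysis) and compared against the analytical expressions derived in the paper.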

  10. Generation of Stationary Non-Gaussian Time Histories with a Specified Cross-spectral Density

    Directory of Open Access Journals (Sweden)

    David O. Smallwood

    1997-01-01

    Full Text Available The paper reviews several methods for the generation of stationary realizations of sampled time histories with non-Gaussian distributions and introduces a new method which can be used to control the cross-spectral density matrix and the probability density functions (pdfs) of the multiple-input problem. Discussed first are two methods for the specialized case of matching the auto (power) spectrum, the skewness, and the kurtosis, using generalized shot noise and polynomial functions. It is then shown that the skewness and kurtosis can also be controlled by the phase of a complex frequency-domain description of the random process. The general case of matching a target probability density function using a zero-memory nonlinear (ZMNL) function is then covered. Next, methods for generating vectors of random variables with a specified covariance matrix for a class of spherically invariant random vectors (SIRV) are discussed. Finally, the general case of matching the cross-spectral density matrix of a vector of inputs with non-Gaussian marginal distributions is presented.
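The ZMNL step can be sketched as a rank-preserving (monotone) mapping of a correlated Gaussian series onto the target distribution; this simplified empirical version assumes the target pdf is represented by a sample rather than an analytic form.

```python
import numpy as np

def zmnl(gaussian_series, target_samples):
    """Zero-memory nonlinear transform: remap each Gaussian value onto the
    target distribution while preserving rank order (and hence, approximately,
    the correlation structure of the Gaussian input)."""
    gaussian_series = np.asarray(gaussian_series)
    ranks = np.argsort(np.argsort(gaussian_series))  # rank of each sample
    return np.sort(np.asarray(target_samples))[ranks]
```

Because the mapping is memoryless and monotone, the output has exactly the target marginal distribution, while the linear correlation of the input is only approximately preserved, which is the trade-off the paper's more general method addresses.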

  11. GPU-based real-time generation of large ultrasound volumes from freehand 3D sweeps

    Directory of Open Access Journals (Sweden)

    Jauer Philipp

    2015-09-01

    Full Text Available In the recent past, 3D ultrasound has been gaining relevance in many biomedical applications. One main limitation, however, is that typical ultrasound volumes are either very poorly resolved or only cover small areas. We have developed a GPU-accelerated method for live fusion of freehand 3D ultrasound sweeps to create one large volume. The method has been implemented in CUDA and is capable of generating an output volume with 0.5 mm resolution in real time while processing more than 45 volumes per second, with more than 300,000 voxels per volume. First experiments indicate that large structures like a whole forearm, or high-resolution volumes of smaller structures like the hand, can be combined efficiently. It is anticipated that this technology will be helpful in pediatric surgery where X-ray or CT imaging is not always possible.

  12. Particle swarm optimization for discrete-time inverse optimal control of a doubly fed induction generator.

    Science.gov (United States)

    Ruiz-Cruz, Riemann; Sanchez, Edgar N; Ornelas-Tellez, Fernando; Loukianov, Alexander G; Harley, Ronald G

    2013-12-01

    In this paper, the authors propose a particle swarm optimization (PSO) for a discrete-time inverse optimal control scheme of a doubly fed induction generator (DFIG). For the inverse optimal scheme, a control Lyapunov function (CLF) is proposed to obtain an inverse optimal control law in order to achieve trajectory tracking. A posteriori, it is established that this control law minimizes a meaningful cost function. The CLFs depend on matrix selection in order to achieve the control objectives; this matrix is determined by two mechanisms: initially, fixed parameters are proposed for this matrix by a trial-and-error method and then by using the PSO algorithm. The inverse optimal control scheme is illustrated via simulations for the DFIG, including the comparison between both mechanisms.
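The PSO-based matrix selection can be sketched with a generic particle swarm minimizer; the DFIG tracking cost is replaced here by a placeholder sphere function, and the inertia and acceleration constants are illustrative, not the paper's settings.

```python
import numpy as np

def pso(f, dim, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimizer: minimize f over R^dim.

    Illustrates the tuning mechanism only; in the paper, f would be the
    tracking-error cost evaluated by simulating the closed-loop DFIG.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))  # particle positions
    v = np.zeros_like(x)                            # particle velocities
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # inertia + cognitive + social terms (0.7 / 1.5 / 1.5 are illustrative)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = fx[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest
```

The swarm replaces the trial-and-error matrix selection: each particle encodes one candidate parameterization of the CLF matrix, and the cost ranks candidates by closed-loop tracking performance.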

  13. Numerical study of fourth-harmonic generation of a picosecond laser pulse with time predelay

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, T.; Kato, Y.; Daido, H. [Institute of Laser Engineering, Osaka University, Yamada-oka 2-6, Suita, Osaka 565 (Japan)

    1996-06-01

    We describe fourth-harmonic generation of a picosecond laser pulse with KDP crystals. The coupled nonlinear equations for the parametric process including the third-order nonlinear susceptibility have been solved. Applying a time predelay in the doubling crystal between the extraordinary and the ordinary waves of the fundamental pulse causes the group-velocity mismatch and the nonlinear phase shift in the doubling crystal to compensate for each other, resulting in pulse-duration compression at the fourth-harmonic wavelength. It is shown that the reduction from a 1-ps fundamental pulse to a 0.25-ps fourth-harmonic pulse can be achieved at an incident intensity of 50 GW/cm². © 1996 Optical Society of America.

  14. Multi-Objective Planning of Multi-Type Distributed Generation Considering Timing Characteristics and Environmental Benefits

    Directory of Open Access Journals (Sweden)

    Yajing Gao

    2014-09-01

    Full Text Available This paper presents a novel approach to multi-type distributed generation (DG planning based on the analysis of investment and income brought by grid-connected DG. Firstly, the timing characteristics of loads and DG outputs, as well as the environmental benefits of DG are analyzed. Then, on the basis of the classification of daily load sequences, the typical daily load sequence and the typical daily output sequence of DG per unit capacity can be computed. The proposed planning model takes the location, capacity and types of DG into account as optimization variables. An improved adaptive genetic algorithm is proposed to solve the model. Case studies have been carried out on the IEEE 14-node distribution system to verify the feasibility and effectiveness of the proposed method and model.

  15. LCFM - LIVING COLOR FRAME MAKER: PC GRAPHICS GENERATION AND MANAGEMENT TOOL FOR REAL-TIME APPLICATIONS

    Science.gov (United States)

    Truong, L. V.

    1994-01-01

    Computer graphics are often applied for better understanding and interpretation of data under observation. These graphics become more complicated when animation is required during "run-time", as found in many typical modern artificial intelligence and expert systems. Living Color Frame Maker is a solution to many of these real-time graphics problems. Living Color Frame Maker (LCFM) is a graphics generation and management tool for IBM or IBM compatible personal computers. To eliminate graphics programming, the graphic designer can use LCFM to generate computer graphics frames. The graphical frames are then saved as text files, in a readable and disclosed format, which can be easily accessed and manipulated by user programs for a wide range of "real-time" visual information applications. For example, LCFM can be implemented in a frame-based expert system for visual aids in management of systems. For monitoring, diagnosis, and/or controlling purposes, circuit or systems diagrams can be brought to "life" by using designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). Thus status of the system itself can be displayed. The Living Color Frame Maker is user friendly with graphical interfaces, and provides on-line help instructions. All options are executed using mouse commands and are displayed on a single menu for fast and easy operation. LCFM is written in C++ using the Borland C++ 2.0 compiler for IBM PC series computers and compatible computers running MS-DOS. The program requires a mouse and an EGA/VGA display. A minimum of 77K of RAM is also required for execution. The documentation is provided in electronic form on the distribution medium in WordPerfect format. A sample MS-DOS executable is provided on the distribution medium. The standard distribution medium for this program is one 5.25 inch 360K MS-DOS format diskette. 
The contents of the diskette are compressed using the PKWARE archiving tools.

  16. Multi-GHz Synchronous Waveform Acquisition With Real-Time Pattern-Matching Trigger Generation

    Science.gov (United States)

    Kleinfelder, Stuart A.; Chiang, Shiuh-hua Wood; Huang, Wei

    2013-10-01

    A transient waveform capture and digitization circuit with continuous synchronous 2-GHz sampling capability and real-time programmable windowed trigger generation has been fabricated and tested. Designed in 0.25 μm CMOS, the digitizer contains a circular array of 128 sample-and-hold circuits for continuous sample acquisition, and attains 2-GHz sample speeds with over 800-MHz analog bandwidth. Sample clock generation is synchronous, combining a phase-locked loop for high-speed clock generation and a high-speed fully-differential shift register for distributing clocks to all 128 sample circuits. Using two comparators per sample, the sampled voltage levels are compared against two reference levels, a high threshold and a low threshold, that are set via per-comparator digital-to-analog converters (DACs). The 256 per-comparator 5-bit DACs compensate for comparator offsets and allow fine reference-level adjustment. The comparator results are matched in 8-sample-wide windows against up to 72 programmable patterns in real time using an on-chip programmable logic array. Each 8-sample trigger window is equivalent to 4 ns of acquisition, overlapped sample by sample in a circular fashion through the entire 128-sample array. The 72 pattern-matching trigger criteria can be programmed to be any combination of High (above the high threshold), Low (below the low threshold), Middle (between the two thresholds), or "Don't Care" (any state is accepted). A trigger pattern of "HLHLHLHL," for example, watches for a waveform that is oscillating at about 1 GHz given the 2-GHz sample rate. A trigger is flagged in under 20 ns if there is a match, after which sampling is stopped, and on-chip digitization can proceed via 128 parallel 10-bit converters, or off-chip conversion can proceed via an analog readout. The chip exceeds 11 bits of dynamic range, nets over 800-MHz -3-dB bandwidth in a realistic system, and jitter in the PLL-based sampling clock has been measured to be about 1 part
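The windowed trigger logic described above can be modeled in software; this sketch classifies each sample into the H/L/M comparator states and slides an 8-sample pattern (with 'X' for don't-care) over the circular buffer. It illustrates the matching rule only, not the gate-level implementation.

```python
def classify(sample, lo_thresh, hi_thresh):
    """Two-comparator state: 'H' above hi, 'L' below lo, 'M' in between."""
    if sample > hi_thresh:
        return 'H'
    if sample < lo_thresh:
        return 'L'
    return 'M'

def trigger(samples, pattern, lo_thresh, hi_thresh):
    """Slide the pattern over the circular sample buffer; 'X' means don't-care.
    Returns the first matching window start, or -1 if no window matches."""
    states = [classify(s, lo_thresh, hi_thresh) for s in samples]
    n, w = len(states), len(pattern)
    for start in range(n):
        window = [states[(start + k) % n] for k in range(w)]
        if all(p == 'X' or p == s for p, s in zip(pattern, window)):
            return start
    return -1
```

For instance, a buffer alternating above and below the thresholds at the sample rate matches the "HLHLHLHL" pattern, which at 2 GHz sampling corresponds to an input oscillating near 1 GHz.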

  17. Femtosecond timing distribution and control for next generation accelerators and light sources

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Li -Jin [Idesta Quantum Electronics, LLC, Newton, NJ (United States)

    2014-03-31

    Femtosecond Timing Distribution at LCLS: Free-electron lasers (FELs) have the capability of producing high photon flux from the IR to the hard x-ray wavelength range and of emitting femtosecond and eventually even attosecond pulses. This makes them an ideal tool for fundamental as well as applied research. Timing precision at the Linac Coherent Light Source (LCLS) between the x-ray FEL (XFEL) and ultrafast optical lasers is currently no better than 100 fs RMS. Ideally this precision should be much better, limited only by the x-ray pulse duration, which can be as short as a few femtoseconds. An increasing variety of science problems involving electron and nuclear dynamics in chemical and material systems will become accessible as the timing improves to a few femtoseconds. Advanced methods of electron beam conditioning or pulse injection could allow the FEL to achieve pulse durations of less than one femtosecond. The objective of the work described in this proposal is to set up an optical timing distribution system based on mode-locked erbium-doped fiber lasers at the LCLS facility to improve the timing precision in the facility and allow time stamping with 10 fs precision. The primary commercial applications for optical timing distribution systems are in the worldwide accelerator facilities and next-generation light sources community. It is reasonable to expect that at least three major XFELs will be built in the next decade. In addition there will be up to 10 smaller machines, such as FERMI in Italy and Maxlab in Sweden, plus the market for upgrading already existing facilities like Jefferson Lab. The total market is estimated to be on the order of 100 million US dollars. The company owns the exclusive rights to the IP covering the technology enabling sub-10 fs synchronization systems. Testing this technology, which has set records in a lab environment, at LCLS, hence in a real-world scenario, is an important cornerstone of bringing the

  18. An approach for generating synthetic fine temporal resolution solar radiation time series from hourly gridded datasets

    Directory of Open Access Journals (Sweden)

    Matthew Perry

    2017-06-01

    Full Text Available A tool has been developed to statistically increase the temporal resolution of solar irradiance time series. Fine temporal resolution time series are an important input into the planning process for solar power plants, and lead to increased understanding of the likely short-term variability of solar energy. The approach makes use of the spatial variability of hourly gridded datasets around a location of interest to make inferences about the temporal variability within the hour. The unique characteristics of solar irradiance data are modelled by classifying each hour into a typical weather situation. Low variability situations are modelled using an autoregressive process which is applied to ramps of clear-sky index. High variability situations are modelled as a transition between states of clear sky conditions and different levels of cloud opacity. The methods have been calibrated to Australian conditions using 1 min data from four ground stations for a 10 year period. These stations, together with an independent dataset, have also been used to verify the quality of the results using a number of relevant metrics. The results show that the method generates realistic fine resolution synthetic time series. The synthetic time series correlate well with observed data on monthly and annual timescales as they are constrained to the nearest grid-point value on each hour. The probability distributions of the synthetic and observed global irradiance data are similar, with Kolmogorov-Smirnov test statistic less than 0.04 at each station. The tool could be useful for the estimation of solar power output for integration studies.
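The low-variability branch described above (an autoregressive process applied to ramps of clear-sky index) can be sketched as follows; phi and sigma are illustrative, not the calibrated Australian values, and the real tool additionally classifies each hour into a weather situation before choosing a model.

```python
import numpy as np

def low_variability_minutes(hourly_kt, n=60, phi=0.9, sigma=0.005, seed=0):
    """Generate n within-hour clear-sky index values by integrating AR(1)
    ramps, recentred so the hour's mean matches the hourly gridded value."""
    rng = np.random.default_rng(seed)
    ramps = np.zeros(n)
    for i in range(1, n):
        ramps[i] = phi * ramps[i - 1] + rng.normal(0.0, sigma)
    walk = np.cumsum(ramps)                 # integrate ramps into an index path
    return hourly_kt + walk - walk.mean()   # constrain to the hourly value
```

Multiplying the synthetic clear-sky index by a clear-sky irradiance model then gives the fine-resolution irradiance series whose monthly and annual statistics remain anchored to the hourly grid.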

  19. PriLive: Privacy-preserving real-time filtering for Next Generation Sequencing.

    Science.gov (United States)

    Loka, Tobias P; Tausch, Simon H; Dabrowski, Piotr Wojciech; Radonic, Aleksandar; Nitsche, Andreas; Renard, Bernhard Y

    2018-03-06

    In Next Generation Sequencing (NGS), re-identification of individuals and other privacy-breaching strategies can be applied even to anonymized data. This also holds true for applications in which human DNA is acquired as a by-product, e.g. for viral or metagenomic samples from a human host. Conventional data protection strategies, including cryptography and post-hoc filtering, are only appropriate for the final, processed sequencing data. This can result in an insufficient level of data protection and a considerable time delay in the further analysis workflow. We present PriLive, a novel tool for the automated removal of sensitive data while the sequencing machine is running. Thereby, human sequence information can be detected and removed before it is completely produced. This facilitates compliance with strict data protection regulations. The unique characteristic of causing almost no time delay for further analyses is also a clear benefit for applications other than data protection. Especially if the sequencing data are dominated by known background signals, PriLive considerably accelerates subsequent analyses because only fractions of the input data remain. Besides these conceptual advantages, PriLive achieves filtering results at least as accurate as conventional post-hoc filtering tools. PriLive is open-source software available at https://gitlab.com/rki_bioinformatics/PriLive. Contact: RenardB@rki.de. Supplementary data are available at Bioinformatics online.

  20. Self-motion perception: assessment by real-time computer-generated animations

    Science.gov (United States)

    Parker, D. E.; Phillips, J. O.

    2001-01-01

    We report a new procedure for assessing complex self-motion perception. In three experiments, subjects manipulated a 6 degree-of-freedom magnetic-field tracker which controlled the motion of a virtual avatar so that its motion corresponded to the subjects' perceived self-motion. The real-time animation created by this procedure was stored using a virtual video recorder for subsequent analysis. Combined real and illusory self-motion and vestibulo-ocular reflex eye movements were evoked by cross-coupled angular accelerations produced by roll and pitch head movements during passive yaw rotation in a chair. Contrary to previous reports, illusory self-motion did not correspond to expectations based on semicircular canal stimulation. Illusory pitch head-motion directions were as predicted for only 37% of trials; whereas, slow-phase eye movements were in the predicted direction for 98% of the trials. The real-time computer-generated animations procedure permits use of naive, untrained subjects who lack a vocabulary for reporting motion perception and is applicable to basic self-motion perception studies, evaluation of motion simulators, assessment of balance disorders and so on.

  1. New generation of magnetic and luminescent nanoparticles for in vivo real-time imaging.

    Science.gov (United States)

    Lacroix, Lise-Marie; Delpech, Fabien; Nayral, Céline; Lachaize, Sébastien; Chaudret, Bruno

    2013-06-06

    A new generation of optimized contrast agents is emerging, based on metallic nanoparticles (NPs) and semiconductor nanocrystals for, respectively, magnetic resonance imaging (MRI) and near-infrared (NIR) fluorescent imaging techniques. Compared with established contrast agents, such as iron oxide NPs or organic dyes, these NPs benefit from several advantages: their magnetic and optical properties can be tuned through size, shape and composition engineering; their efficiency can exceed that of clinically used contrast agents by several orders of magnitude; their surface can be modified to incorporate specific targeting agents and antifouling polymers to increase blood circulation time and tumour recognition; and they can be integrated in complex architectures to yield multi-modal imaging agents. In this review, we report the materials of choice based on an understanding of the basic physics of NIR and MRI techniques and their corresponding syntheses as NPs. Surface engineering, water transfer and specific targeting are highlighted prior to their first use for in vivo real-time imaging. Highly efficient NPs that are safer and target-specific are likely to enter clinical application in the near future.

  2. The Generation of Near-Real Time Data Products for MODIS

    Science.gov (United States)

    Teague, M.; Schmaltz, J. E.; Ilavajhala, S.; Ye, G.; Masuoka, E.; Murphy, K. J.; Michael, K.

    2010-12-01

    The GSFC Terrestrial Information Systems Branch (614.5) operates the Land and Atmospheres Near-real-time Capability for EOS (LANCE-MODIS) system. Other LANCE elements include -AIRS, -MLS, -OMI, and -AMSR-E. LANCE-MODIS incorporates the former Rapid Response system and will, in early 2011, include the Fire Information for Resource Management System (FIRMS). The purpose of LANCE is to provide applications users with a variety of products on a near-real-time basis. The LANCE-MODIS data products include Level 1 (L1), L2 fire, snow, sea ice, cloud mask/profiles, aerosols, clouds, land surface reflectance, land surface temperature, and L2G and L3 gridded, daily, land surface reflectance products. Data are available either by ftp access (pull) or by subscription (push), and the L1 and L2 data products are available within an average of 2.5 hours of the observation time. The use of ancillary data products input to the standard science algorithms has been modified in order to obtain these latencies. The resulting products have been approved for applications use by the MODIS Science Team. The http://lance.nasa.gov site provides registration information and extensive information concerning the MODIS data products and imagery, including a comparison between the LANCE-MODIS and the standard science-quality products generated by the MODAPS system. The LANCE-MODIS system includes a variety of tools that enable users to manipulate the data products, including: parameter, band, and geographic subsetting; re-projection; mosaicking; and generation of data in the GeoTIFF format. In most instances the data resulting from use of these tools have a latency of less than 3 hours. Access to these tools is available through a Web Coverage Service. A Google Earth/Web Mapping Service is available to access image products. LANCE-MODIS supports a wide variety of applications users in civilian, military, and foreign agencies as well as universities and the private sector.
Examples of applications are

  3. Transmission and generation maintenance scheduling with different time scales in power systems

    Science.gov (United States)

    Marwali, Muhammad Kemala Cita

    Typically, maintenance scheduling is considered part of the long-term scheduling (LTS) problem in power systems. However, LTS is not an independent problem. In maintenance scheduling, LTS is used to determine specific windows for transmission or generation maintenance. Using the window given by LTS, short-term scheduling (STS) determines the commitment of available units. In this dissertation, LTS and STS, taking into account their interdependency, have been designed to perform integrated scheduling. We begin our formulation with unit maintenance scheduling, which includes a maintenance cost as the objective function, and prevailing network and maintenance constraints such as crew constraints and resource availability as well as maintenance windows. We then extend our formulation to include transmission maintenance in an integrated maintenance scheduling (IMS) problem in which the network is modeled as a probabilistic entity. We apply a double decomposition to solve the LTS problem, which consists of IMS, fuel dispatch and emission constraints. The first decomposition is a relaxation of the original problem in that it contains only IMS and emission constraints. This first decomposition is treated as the master problem, and fuel dispatch is treated as the sub-problem for the second decomposition. In coordinating STS and LTS, we solve the LTS and STS problems independently using dynamic maintenance scheduling. We employ a double decomposition for LTS and Lagrangian relaxation for STS. The constraint relationships and input/output variables between LTS and STS are explored. We use a two-state continuous-time Markov model for generating units and transmission lines. To test the proposed dynamic scheduling approach, various randomly generated scenarios, reflecting future uncertainty, are considered using Monte-Carlo simulation. A wide range of tests on the modified IEEE-RTS system and IEEE-118 bus system is presented to demonstrate the efficiency of the proposed
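The two-state continuous-time Markov model for units and lines can be sampled directly when generating Monte-Carlo scenarios; the failure and repair rates below are illustrative, not the IEEE-RTS values.

```python
import numpy as np

def simulate_availability(fail_rate, repair_rate, horizon, seed=0):
    """Sample one up/down trajectory of a two-state continuous-time Markov
    model (exponential dwell times) and return the fraction of `horizon`
    spent in the up state."""
    rng = np.random.default_rng(seed)
    t, up, up_time = 0.0, True, 0.0
    while t < horizon:
        rate = fail_rate if up else repair_rate
        dwell = rng.exponential(1.0 / rate)
        if up:
            up_time += min(dwell, horizon - t)
        t += dwell
        up = not up   # alternate between up and down states
    return up_time / horizon
```

Over a long horizon the sampled availability converges to the steady-state value repair_rate / (fail_rate + repair_rate), which is the quantity the probabilistic network model uses for each unit and line.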

  4. 5-HTP hypothesis of schizophrenia.

    Science.gov (United States)

    Fukuda, K

    2014-01-01

    To pose a new hypothesis of schizophrenia that affirms and unifies conventional hypotheses. Outside the brain, there are 5-HTP-containing argyrophil cells that have tryptophan hydroxylase 1 without l-aromatic amino acid decarboxylase. Monoamine oxidase in the liver and lung metabolizes 5-HT, rather than 5-HTP, and 5-HTP freely crosses the blood-brain barrier, converting to 5-HT in the brain. Therefore, I postulate that hyperfunction of 5-HTP-containing argyrophil cells may be a cause of schizophrenia. I investigate the consistency of this hypothesis with other hypotheses using a deductive method. Overactive 5-HTP-containing argyrophil cells produce excess amounts of 5-HTP. Abundant 5-HTP increases 5-HT within the brain (linking to the 5-HT hypothesis) and leads to negative feedback of 5-HT synthesis at the rate-limiting step catalysed by tryptophan hydroxylase 2. Owing to this negative feedback, brain tryptophan is further metabolized via the kynurenine pathway. Increased kynurenic acid contributes to deficiencies of glutamate function and dopamine activity, known causes of schizophrenia. The 5-HTP hypothesis affirms conventional hypotheses, as the metabolic condition caused by acceleration of tryptophan hydroxylase 1 and suppression of tryptophan hydroxylase 2 activates both 5-HT and kynurenic acid. In order to empirically test the theory, it will be useful to monitor serum 5-HTP and match it to different phases of schizophrenia. This hypothesis may signal a new era in which schizophrenia is treated as a brain-gut interaction. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Test of Taylor's Hypothesis with Distributed Temperature

    Science.gov (United States)

    Cheng, Y.; Gentine, P.; Sayde, C.; Tanner, E.; Ochsner, T. E.; Dong, J.

    2016-12-01

    Taylor's hypothesis [Taylor, 1938] assumes that the mean wind speed carries the spatial pattern of turbulent motion past a fixed point in a "frozen" way; it is widely used to relate the streamwise wavenumber k and the angular frequency ω through k = ω/U, where U is the mean advection speed. Experiments [Fisher, 1964; Tong, 1996] have shown some deviation from Taylor's hypothesis in flows with high turbulence intensity and at high wavenumbers. However, the velocity or scalar measurements have always been taken at a few fixed spatial points rather than distributed in space. This experiment was designed, for the first time, to directly compare the temporal and spatial spectra of temperature as a test of Taylor's hypothesis, measuring temperature with high resolution in both time and space by Distributed Temperature Sensing, which exploits the attenuation difference of Raman scattering in an optical fiber, at the MOISST site in Oklahoma. The transect is 233 meters long, aligned with the dominant wind direction. The temperature sampling distance is 0.127 m and the sampling frequency is 1 Hz. The four fiber cables, parallel to the ground, are at heights of 1 m, 1.254 m, 1.508 m and 1.762 m, respectively. An eddy covariance instrument was also set up near the Distributed Temperature Sensing system for comparison of temperature data. The spatial spectrum of temperature can be obtained at a fixed time point, while the temporal spectrum can be obtained at a fixed spatial point in the middle of the transect. Preliminary results will be presented at the AGU Fall Meeting. References: Fisher, M. J., and Davies, P. O. A. L. (1964), Correlation measurements in a non-frozen pattern of turbulence, Journal of Fluid Mechanics, 18(1), 97-116. Taylor, G. I. (1938), The spectrum of turbulence, Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, 164(919), 476-490. Tong, C. (1996), Taylor's Hypothesis and Two-point Coherence Measurements, Boundary-Layer Meteorology, 81(3), 399-410.
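The frozen-turbulence mapping can be sketched numerically: advect a sinusoidal temperature pattern past a fixed sensor and check that the temporal spectral peak sits at f = U·k (a toy illustration only, not the experiment's data; the speed U, wavenumber k and grid sizes below are arbitrary assumptions):

```python
import numpy as np

# Toy check of Taylor's frozen-turbulence mapping f = U * k (cyclic units).
# All parameters are illustrative, not from the field experiment.
U = 5.0                  # mean advection speed, m/s (assumed)
L, N = 200.0, 4000       # transect length (m) and spatial samples
Tt, M = 40.0, 4000       # record length (s) and temporal samples

x = np.linspace(0.0, L, N, endpoint=False)
t = np.linspace(0.0, Tt, M, endpoint=False)
k = 0.5                                          # test wavenumber, cycles/m
pattern = np.sin(2 * np.pi * k * x)              # "frozen" spatial pattern
series = np.sin(2 * np.pi * k * (0.0 - U * t))   # pattern swept past x0 = 0

# Spatial spectrum peak (cycles/m) and temporal spectrum peak (Hz)
kx = np.fft.rfftfreq(N, d=L / N)
fr = np.fft.rfftfreq(M, d=Tt / M)
k_peak = kx[np.argmax(np.abs(np.fft.rfft(pattern))[1:]) + 1]
f_peak = fr[np.argmax(np.abs(np.fft.rfft(series))[1:]) + 1]
print(f_peak, U * k_peak)   # equal under the frozen hypothesis
```

The experiment's contribution is precisely to test where this equality breaks down when both spectra are measured simultaneously.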

  6. Digital Generation of Noise-Signals with Arbitrary Constant or Time-Varying Spectra (A noise generation software package and its application)

    CERN Document Server

    Tückmantel, Joachim

    2008-01-01

    Artificial creation of arbitrary noise signals is used in accelerator physics to reproduce a measured perturbation spectrum for simulations, but also to generate real-time shaped noise spectra for controlled emittance blow-up, giving tailored properties to the final bunch shape. It is demonstrated here how one can produce numerically what is, for all practical purposes, an unlimited quantity of non-periodic noise data having any predefined spectral density. This spectral density may be constant or varying with time. The noise output never repeats and has excellent statistical properties, important for very long-term applications. Such flexibility and spectral cleanliness are difficult to obtain using analogue techniques. This algorithm was applied both in computer simulations of bunch behaviour in the presence of RF noise in the PS, SPS and LHC, and also to generate real-time noise, tracking the synchrotron frequency change during the energy ramp of the SPS and producing controlled longitudinal emittance blow-up.
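A common block-based variant of spectrally shaped noise generation (simpler than the non-periodic streaming algorithm described above, which this sketch does not reproduce; band edges and sample rate are arbitrary assumptions) assigns each frequency bin the target amplitude with a uniformly random phase and inverse-FFTs:

```python
import numpy as np

# Sketch: shape noise to a prescribed spectral density via random phases.
# The paper's algorithm additionally yields unlimited, non-repeating streams;
# this block-based version is periodic with period N samples.
rng = np.random.default_rng(0)
N, fs = 8192, 1000.0                     # samples and sample rate, Hz (assumed)
freqs = np.fft.rfftfreq(N, d=1.0 / fs)

band = (freqs > 100.0) & (freqs < 200.0)
target_psd = np.where(band, 1.0, 0.0)    # flat band-limited target spectrum

phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
spectrum = np.sqrt(target_psd) * np.exp(1j * phases)
spectrum[0] = 0.0                        # no DC offset
noise = np.fft.irfft(spectrum, n=N)      # real-valued noise record

# Check: essentially all energy lies inside the 100-200 Hz band
S = np.abs(np.fft.rfft(noise)) ** 2
print(round(S[band].sum() / S.sum(), 3))  # → 1.0
```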

  7. Is carbon / CO2 taxes implementation timely for electricity and heat generation in Romania?

    International Nuclear Information System (INIS)

    Tutuianu, O.; Fulger, E.D.; Vieru, A.; Feher, M.

    1996-01-01

    Lately, carbon/CO2 taxes are very much discussed in Europe and in many countries of the world as economic and financial instruments for reducing CO2 emissions. Some countries have already introduced such taxes, while in other countries or international organisations they are under study, especially concerning the moment, the way of implementation and the amount of the taxes. CO2 emissions in Romania, in absolute and specific values (per capita, per kWh equivalent), are lower than in other countries. This can be justified by the low level of electricity and heat output owing to the recent economic restructuring, and by the energy sector characteristics: the major contribution of natural gas, hydroelectric power, cogeneration and nuclear power implementation. We can also mention, as a positive factor, the CO2 absorption potential of the Romanian forests. Introduction of carbon/CO2 taxes would have severe economic and social impacts, such as blockage of domestic coal extraction, increases in electricity and heat prices, decreased competitiveness of Romanian export products and a reduced standard of living. Therefore, the authors consider that the introduction of carbon/CO2 taxes is not timely by the year 2000 for Romanian electricity and heat generation. (author). 3 figs. 2 tabs. 10 refs

  8. A generative spike train model with time-structured higher order correlations.

    Science.gov (United States)

    Trousdale, James; Hu, Yu; Shea-Brown, Eric; Josić, Krešimir

    2013-01-01

    Emerging technologies are revealing the spiking activity in ever larger neural ensembles. Frequently, this spiking is far from independent, with correlations in the spike times of different cells. Understanding how such correlations impact the dynamics and function of neural ensembles remains an important open problem. Here we describe a new, generative model for correlated spike trains that can exhibit many of the features observed in data. Extending prior work in mathematical finance, this generalized thinning and shift (GTaS) model creates marginally Poisson spike trains with diverse temporal correlation structures. We give several examples which highlight the model's flexibility and utility. For instance, we use it to examine how a neural network responds to highly structured patterns of inputs. We then show that the GTaS model is analytically tractable, and derive cumulant densities of all orders in terms of model parameters. The GTaS framework can therefore be an important tool in the experimental and theoretical exploration of neural dynamics.
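A toy construction in the spirit of "thinning and shift" can make the idea concrete (an illustrative simplification, not the paper's exact scheme: the GTaS model draws cell subsets from a general distribution, which is reduced here to independent thinning per cell):

```python
import numpy as np

# Each cell keeps a random subset of a "mother" Poisson process and shifts
# its copies by a fixed per-cell delay, so every marginal train is Poisson
# while spike times remain correlated across cells with a set time structure.
rng = np.random.default_rng(1)
rate, T = 20.0, 1000.0                   # mother rate (Hz), duration (s)
p_keep = [0.5, 0.5, 0.5]                 # thinning probability per cell
delays = [0.0, 0.005, 0.010]             # per-cell time shifts, s (assumed)

n_mother = rng.poisson(rate * T)
mother = rng.uniform(0.0, T, n_mother)   # mother Poisson spike times

trains = []
for p, d in zip(p_keep, delays):
    keep = rng.random(n_mother) < p              # thin: keep spike with prob p
    trains.append(np.sort(mother[keep] + d))     # shift the surviving copies

for i, tr in enumerate(trains):
    print(i, round(len(tr) / T, 1))      # marginal rate ≈ p * rate = 10 Hz
```

Because every train inherits spikes from the same mother process, pairwise and higher-order correlations appear at the lags set by the delay differences.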

  9. Flavour generation during commercial barley and malt roasting operations: a time course study.

    Science.gov (United States)

    Yahya, Hafiza; Linforth, Robert S T; Cook, David J

    2014-02-15

    The roasting of barley and malt products generates colour and flavour, controlled principally by the time course of product temperature and moisture content. Samples were taken throughout the industrial manufacture of three classes of roasted product (roasted barley, crystal malt and black malt) and analysed for moisture content, colour and flavour volatiles. Despite having distinct flavour characteristics, the three products contained many compounds in common. The product concentrations through manufacture of 15 flavour compounds are used to consider the mechanisms (Maillard reaction, caramelisation, pyrolysis) by which they were formed. The use of water sprays resulted in transient increases in formation of certain compounds (e.g., 2-cyclopentene-1,4-dione) and a decrease in others (e.g., pyrrole). The study highlights rapid changes in colour and particularly flavour which occur at the end of roasting and onwards to the cooling floor. This highlights the need for commercial maltsters to ensure consistency of procedures from batch to batch. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Mobile Charge Generation Dynamics in P3HT:PCBM Observed by Time-Resolved Terahertz Spectroscopy

    DEFF Research Database (Denmark)

    Cooke, D. G.; Krebs, Frederik C; Jepsen, Peter Uhd

    2012-01-01

    Ultra-broadband time-resolved terahertz spectroscopy is used to examine the sub-ps conductivity dynamics of a conjugated polymer bulk heterojunction film P3HT:PCBM. We directly observe mobile charge generation dynamics on a sub-100 fs time scale.

  11. The atomic hypothesis: physical consequences

    International Nuclear Information System (INIS)

    Rivas, Martin

    2008-01-01

    The hypothesis that matter is made of some ultimate and indivisible objects, together with the restricted relativity principle, establishes a constraint on the kind of variables we are allowed to use for the variational description of elementary particles. We consider that the atomic hypothesis not only states the indivisibility of elementary particles, but also that these ultimate objects, if not annihilated, cannot be modified by any interaction, so that all allowed states of an elementary particle are only kinematical modifications of any one of them. Therefore, an elementary particle cannot have excited states. In this way, the kinematical group of spacetime symmetries not only defines the symmetries of the system, but also the variables in terms of which the mathematical description of the elementary particles can be expressed in either the classical or the quantum mechanical description. When considering the interaction of two Dirac particles, the atomic hypothesis restricts the interaction Lagrangian to a kind of minimal coupling interaction.

  12. Discussion of the Porter hypothesis

    International Nuclear Information System (INIS)

    1999-11-01

    In reaction to the long-range vision of RMNO, published in 1996, the Dutch government posed the question of whether a far-reaching and progressive modernization policy would lead to competitive advantages for high-quality products in partly new markets. This question is connected to the so-called Porter hypothesis: 'By stimulating innovation, strict environmental regulations can actually enhance competitiveness', from which it can be concluded that environment and economy can work together quite well. A literature study was carried out to determine under which conditions this hypothesis is endorsed in the scientific literature and in policy documents. Recommendations are given for further studies. refs

  13. The thrifty phenotype hypothesis revisited

    DEFF Research Database (Denmark)

    Vaag, A A; Grunnet, L G; Arora, G P

    2012-01-01

    Twenty years ago, Hales and Barker along with their co-workers published some of their pioneering papers proposing the 'thrifty phenotype hypothesis' in Diabetologia (4;35:595-601 and 3;36:62-67). Their postulate that fetal programming could represent an important player in the origin of type 2...... of the underlying molecular mechanisms. Type 2 diabetes is a multiple-organ disease, and developmental programming, with its idea of organ plasticity, is a plausible hypothesis for a common basis for the widespread organ dysfunctions in type 2 diabetes and the metabolic syndrome. Only two among the 45 known type 2...

  14. Effects of generation time on spray aerosol transport and deposition in models of the mouth-throat geometry.

    Science.gov (United States)

    Worth Longest, P; Hindle, Michael; Das Choudhuri, Suparna

    2009-06-01

    For most newly developed spray aerosol inhalers, the generation time is a potentially important variable that can be fully controlled. The objective of this study was to determine the effects of spray aerosol generation time on transport and deposition in a standard induction port (IP) and more realistic mouth-throat (MT) geometry. Capillary aerosol generation (CAG) was selected as a representative system in which spray momentum was expected to significantly impact deposition. Sectional and total depositions in the IP and MT geometries were assessed at a constant CAG flow rate of 25 mg/sec for aerosol generation times of 1, 2, and 4 sec using both in vitro experiments and a previously developed computational fluid dynamics (CFD) model. Both the in vitro and numerical results indicated that extending the generation time of the spray aerosol, delivered at a constant mass flow rate, significantly reduced deposition in the IP and more realistic MT geometry. Specifically, increasing the generation time of the CAG system from 1 to 4 sec reduced the deposition fraction in the IP and MT geometries by approximately 60 and 33%, respectively. Furthermore, the CFD predictions of deposition fraction were found to be in good agreement with the in vitro results for all times considered in both the IP and MT geometries. The numerical results indicated that the reduction in deposition fraction over time was associated with temporal dissipation of what was termed the spray aerosol "burst effect." Based on these results, increasing the spray aerosol generation time, at a constant mass flow rate, may be an effective strategy for reducing deposition in the standard IP and in more realistic MT geometries.

  15. Reverse hypothesis machine learning a practitioner's perspective

    CERN Document Server

    Kulkarni, Parag

    2017-01-01

    This book introduces the paradigm of reverse hypothesis machines (RHM), focusing on knowledge innovation and machine learning. Knowledge-acquisition-based learning is constrained by large volumes of data and is time consuming; hence, knowledge-innovation-based learning is needed. Since under-learning results in cognitive inabilities and over-learning compromises freedom, there is a need for optimal machine learning. All existing learning techniques rely on mapping input and output and establishing mathematical relationships between them. Though methods change, the paradigm remains the same: the forward hypothesis machine paradigm, which tries to minimize uncertainty. The RHM, on the other hand, makes use of uncertainty for creative learning. The approach uses limited data to help identify new and surprising solutions. It focuses on improving learnability, unlike traditional approaches, which focus on accuracy. The book is useful as a reference book for machine learning researchers and professionals as ...

  16. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
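As background, the single-agent building block that such multi-agent frameworks generalize is Wald's sequential probability ratio test; the Gaussian hypotheses, thresholds and data below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

# Wald's SPRT: accumulate the log-likelihood ratio sample by sample and
# stop as soon as it crosses an upper (accept H1) or lower (accept H0) bar.
rng = np.random.default_rng(3)

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0,
         a=np.log(99.0), b=-np.log(99.0)):
    """Return (decision, n_used): decision 1 accepts H1, 0 accepts H0."""
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood ratio increment for N(mu1, sigma) vs N(mu0, sigma)
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= a:                 # strong evidence for H1
            return 1, n
        if llr <= b:                 # strong evidence for H0
            return 0, n
    return int(llr > 0.0), len(samples)   # forced decision at the horizon

data = rng.normal(1.0, 1.0, 500)          # truth: H1, mean 1.0
decision, n_used = sprt(data)
print(decision, n_used)                   # 1 expected, after a few samples
```

The thresholds log(99) and -log(99) correspond to roughly 1% error probabilities under Wald's approximation; the multi-agent setting layers coordination and delay costs on top of this stopping problem.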

  17. Exploring heterogeneous market hypothesis using realized volatility

    Science.gov (United States)

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

    This study investigates the heterogeneous market hypothesis using high-frequency data. The cascaded heterogeneous trading activities with different time durations are modelled in the heterogeneous autoregressive framework. The empirical study indicates the presence of long-memory behaviour and predictability in the financial time series, which supports the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocate two power-variation realized volatilities for forecast evaluation and risk measurement, in order to overcome the possible abrupt jumps during the credit crisis. Finally, the empirical results are used to determine market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies and risk management.
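The heterogeneous autoregressive idea can be sketched as the standard HAR-RV regression: next-day realized volatility regressed on daily, weekly (5-day) and monthly (22-day) averages of past RV, one term per trader horizon (the data below are synthetic placeholders, not the study's series):

```python
import numpy as np

# HAR-RV sketch: RV_{t+1} ~ const + RV_t + mean(RV, 5d) + mean(RV, 22d).
rng = np.random.default_rng(2)
n = 2000
rv = np.abs(rng.standard_normal(n)) + 1.0    # stand-in daily RV series

d = rv[21:-1]                                                 # RV_t
wk = np.array([rv[t - 4:t + 1].mean() for t in range(21, n - 1)])   # weekly
mo = np.array([rv[t - 21:t + 1].mean() for t in range(21, n - 1)])  # monthly
y = rv[22:]                                                   # RV_{t+1}

X = np.column_stack([np.ones_like(d), d, wk, mo])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta.round(3))   # [intercept, beta_daily, beta_weekly, beta_monthly]
```

With real high-frequency data, significant weekly and monthly coefficients are the signature of cascaded heterogeneous trading horizons.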

  18. Timely diagnosis of sitosterolemia by next generation sequencing in two children with severe hypercholesterolemia.

    Science.gov (United States)

    Buonuomo, Paola Sabrina; Iughetti, Lorenzo; Pisciotta, Livia; Rabacchi, Claudio; Papadia, Francesco; Bruzzi, Patrizia; Tummolo, Albina; Bartuli, Andrea; Cortese, Claudio; Bertolini, Stefano; Calandra, Sebastiano

    2017-07-01

    Severe hypercholesterolemia, associated or not with xanthomas in a child, may suggest the diagnosis of homozygous autosomal dominant hypercholesterolemia (ADH), autosomal recessive hypercholesterolemia (ARH) or sitosterolemia, depending on the transmission of hypercholesterolemia in the patient's family. Sitosterolemia is a recessive disorder characterized by high plasma levels of cholesterol and plant sterols due to mutations in the ABCG5 or the ABCG8 gene, leading to a loss of function of the ATP-binding cassette (ABC) heterodimer transporter G5-G8. We aimed to perform the molecular characterization of two children with severe primary hypercholesterolemia. Case #1 was a 2-year-old girl with high LDL-cholesterol (690 mg/dl) and tuberous and intertriginous xanthomas. Case #2 was a 7-year-old boy with elevated LDL-C (432 mg/dl) but no xanthomas. In both cases, at least one parent had elevated LDL-cholesterol levels. For the molecular diagnosis, we applied targeted next generation sequencing (NGS), which unexpectedly revealed that both patients were compound heterozygous for nonsense mutations: Case #1 in the ABCG5 gene [p.(Gln251*)/p.(Arg446*)] and Case #2 in the ABCG8 gene [p.(Ser107*)/p.(Trp361*)]. Both children had extremely high serum sitosterol and campesterol levels, thus confirming the diagnosis of sitosterolemia. A low-fat/low-sterol diet was promptly adopted, with and without the addition of ezetimibe for Case #1 and Case #2, respectively. In both patients, serum total and LDL-cholesterol decreased dramatically in two months and progressively normalized. Targeted NGS allows the rapid diagnosis of sitosterolemia in children with severe hypercholesterolemia, even when their family history does not unequivocally suggest a recessive transmission of hypercholesterolemia. A timely diagnosis is crucial to avoid delays in treatment. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Generation of Long-time Complex Signals for Testing the Instruments for Detection of Voltage Quality Disturbances

    Science.gov (United States)

    Živanović, Dragan; Simić, Milan; Kokolanski, Zivko; Denić, Dragan; Dimcev, Vladimir

    2018-04-01

    A software-supported procedure for the generation of long-time complex test sequences, suitable for testing instruments for the detection of standard voltage quality (VQ) disturbances, is presented in this paper. This solution for test signal generation includes significant improvements to the computer-based signal generator presented and described in the previously published paper [1]. The generator is based on virtual instrumentation software for defining the basic signal parameters, the data acquisition card NI 6343, and a power amplifier that amplifies the output voltage to the nominal RMS value of 230 V. Definition of the basic signal parameters in the LabVIEW application software is supported using script files, which allows simple repetition of specific test signals and combinations of different test sequences in a complex composite test waveform. The basic advantage of this generator over similar signal-generation solutions is the possibility of long-time test sequence generation according to predefined complex test scenarios, including various combinations of VQ disturbances defined in accordance with the European standard EN50160. Experimental verification of the presented signal generator's capability is performed by testing the commercial power quality analyzer Fluke 435 Series II. Characteristic complex test signals with various disturbances, and the data logged by the tested power quality analyzer, are shown in this paper.

  20. Linearly chirped waveform generation with large time-bandwidth product using sweeping laser and dual-polarization modulator

    Science.gov (United States)

    Li, Xuan; Zhao, Shanghong; Li, Yongjun; Zhu, Zihang; Qu, Kun; Li, Tao; Hu, Dapeng

    2018-03-01

    A method for photonic generation of a linearly chirped microwave waveform using a frequency-sweeping laser and a dual-polarization modulator is proposed and investigated. A frequency-sweeping continuous-wave light is generated by the laser and sent to the modulator. In the modulator, one part of the light is modulated with an RF signal to generate a frequency-shifted optical signal, while the other part passes through a polarization rotator that rotates its polarization to the orthogonal direction. At the output of the modulator, the two optical signals are combined with orthogonal polarizations and then injected into a polarization delay device to introduce a time delay. After the two optical signals are combined for heterodyning, a linearly chirped waveform is generated. The bandwidth, time duration, chirp rate and sign, and central frequency of the generated waveform can be tuned independently and flexibly; furthermore, frequency doubling of the central frequency can be achieved in the waveform generation. A simulation is presented to verify the proposed scheme: a linearly chirped microwave pulse with up or down chirp, a central frequency of 20 or 40 GHz, a bandwidth of 20 GHz, a time duration of 500 ns, and a time-bandwidth product (TBWP) of 10000 is obtained.
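Numerically, a linear chirp with the reported headline parameters is straightforward to write down (the sample rate is an assumption for this sketch; the paper generates the waveform photonically, not by direct synthesis):

```python
import numpy as np

# Linear chirp with the abstract's parameters: 20 GHz bandwidth, 500 ns
# duration, 20 GHz central frequency, hence TBWP = B * Td = 10000.
B = 20e9                     # bandwidth, Hz
Td = 500e-9                  # time duration, s
f0 = 20e9                    # central frequency, Hz
fs = 120e9                   # sample rate, Hz (assumed, > 2*(f0 + B/2))

n = int(round(Td * fs))
t = np.arange(n) / fs
k_chirp = B / Td             # chirp rate, Hz/s (positive sign: up-chirp)
phase = 2 * np.pi * ((f0 - B / 2) * t + 0.5 * k_chirp * t ** 2)
waveform = np.cos(phase)     # instantaneous frequency sweeps 10 -> 30 GHz

print(int(round(B * Td)))    # time-bandwidth product: 10000
```

A down-chirp corresponds to a negative chirp rate, and doubling f0 models the frequency-doubled operating point.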

  1. Electrochemical sensing using comparison of voltage-current time differential values during waveform generation and detection

    Energy Technology Data Exchange (ETDEWEB)

    Woo, Leta Yar-Li; Glass, Robert Scott; Fitzpatrick, Joseph Jay; Wang, Gangqiang; Henderson, Brett Tamatea; Lourdhusamy, Anthoniraj; Steppan, James John; Allmendinger, Klaus Karl

    2018-01-02

    A device for signal processing. The device includes a signal generator, a signal detector, and a processor. The signal generator generates an original waveform. The signal detector detects an affected waveform. The processor is coupled to the signal detector. The processor receives the affected waveform from the signal detector. The processor also compares at least one portion of the affected waveform with the original waveform. The processor also determines a difference between the affected waveform and the original waveform. The processor also determines a value corresponding to a unique portion of the determined difference between the original and affected waveforms. The processor also outputs the determined value.

  2. The Japanese attitude towards nuclear power generation. Changes as seen through time series

    International Nuclear Information System (INIS)

    Kitada, Atsuko; Hayashi, Chikio

    1999-01-01

    This study is intended to determine people's attitudes toward nuclear power generation, shedding light on the changed and unchanged structures of attitudes by comparing data on nuclear power generation for 1993 and 1998. Although some nuclear facility accidents occurred during those five years, public attitudes toward nuclear power generation remained almost the same. Regarding the utilization of nuclear power generation, there was a slight increase in passive affirmation. The percentage of active affirmation was less than 10 percent, but if passive affirmation is included, a high percentage exceeding 70 percent acknowledged the utilization of nuclear power. People's attitudes toward the utilization of nuclear power thus became slightly more positive between 1993 and 1998. The difference was found in the general measure of attitudes based on many questions about nuclear power generation, and in the importance and utility of nuclear power generation, including its purpose. People are not conscious of anxiety about nuclear power generation in ordinary life. However, when people are made to think about nuclear power generation, the degree of anxiety increases even when they are provided with data that prove its safety. On the other hand, the degree of anxiety about nuclear facility accidents remained the same over the five years, that is, it did not increase, although a growing interest in the disposal and treatment of radioactive wastes was seen. A comparison of the structure of attitudes, based on the study by Hayashi (1994), found that the group with no interest in nuclear power generation showed the most distinctive answering patterns in both 1993 and 1998. Moreover, this group of respondents was characterized by little opportunity to obtain information.
A similarity in the relationship between people's attitudes toward nuclear power generation

  3. Note: A 10 Gbps real-time post-processing free physical random number generator chip

    Science.gov (United States)

    Qian, Yi; Liang, Futian; Wang, Xinzhe; Li, Feng; Chen, Lian; Jin, Ge

    2017-09-01

    A random number generator with high data rate, small size, and low power consumption is essential for a certain quantum key distribution (QKD) system. We designed a 10 Gbps random number generator ASIC, TRNG2016, for the QKD system. With a 6 mm × 6 mm QFN48 package, TRNG2016 has 10 independent physical random number generation channels, and each channel can work at a fixed frequency up to 1 Gbps. The random number generated by TRNG2016 can pass the NIST statistical tests without any post-processing. With 3.3 V IO power supply and 1.2 V core power supply, the typical power consumption of TRNG2016 is 773 mW with 10 channels on and running at 1 Gbps data rate.

  4. Real Time Monitoring and Test Vector Generation for Improved Flight Safety, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — As the complexity of flight controllers grows so does the cost associated with verification and validation (V&V). Current-generation controllers are reaching...

  5. Real Time Monitoring and Test Vector Generation for Improved Flight Safety Project

    Data.gov (United States)

    National Aeronautics and Space Administration — As the complexity of flight controllers grows so does the cost associated with verification and validation (V&V). Current-generation controllers are reaching a...

  6. Industrial Use of Distributed Generation in Real-Time Energy and Ancillary Service Markets

    Energy Technology Data Exchange (ETDEWEB)

    Hudson, C.R.

    2001-10-24

    Industrial consumers of energy now have the opportunity to participate directly in electricity generation. This report seeks to give the reader (1) insights into the various types of generation services that distributed generation (DG) units could provide, (2) a mechanism to evaluate the economics of using DG, (3) an overview of the status of DG deployment in selected states, and (4) a summary of the communication technologies involved with DG and what testing activities are needed to encourage industrial application of DG. Section 1 provides details on electricity markets and the types of services that can be offered. Subsequent sections in the report address the technical requirements for participating in such markets, the economic decision process that an industrial energy user should go through in evaluating distributed generation, the status of current deployment efforts, and the requirements for test-bed or field demonstration projects.

  7. Incorporating time and income constraints in dynamic agent-based models of activity generation and time use : Approach and illustration

    NARCIS (Netherlands)

    Arentze, Theo; Ettema, D.F.; Timmermans, Harry

    Existing theories and models in economics and transportation treat households’ decisions regarding allocation of time and income to activities as a resource-allocation optimization problem. This stands in contrast with the dynamic nature of day-by-day activity-travel choices. Therefore, in the

  8. Inspiring the Next Generation through Real Time Access to Ocean Exploration

    Science.gov (United States)

    Bell, K. L.; Ballard, R. D.; Witten, A. B.; O'Neal, A.; Argenta, J.

    2011-12-01

    Using live-access exposure to actual shipboard research activities where exciting discoveries are made can be a key contributor to engaging students and their families in learning about Earth science and STEM subjects. The number of bachelor's degrees awarded annually in the Earth sciences peaked at nearly 8000 in 1984 and has since declined more than 50%; for the last several years, the number of bachelor's degrees issued in U.S. schools in the geosciences has hovered around 2500 (AGI, 2009). In 2008, the last year for which the data are published, only 533 Ph.D.s were awarded in Earth, Atmospheric and Ocean sciences (NSF, 2009). By 2030, the supply of geoscientists for the petroleum industry is expected to fall short of demand by 30,000 scientists (AGI, 2009). The National Science Foundation (NSF) reports that minority students earn approximately 15% of all bachelor's degrees in science and engineering, but only 4.6% of degrees in the geosciences. Both of these percentages are very low in comparison to national and state populations, where Hispanics and African-Americans make up 29% of the U.S. overall. The Ocean Exploration Trust (OET) is a non-profit organization whose mission is to explore the world's ocean and to capture the excitement of that exploration for audiences of all ages, but primarily to inspire and motivate the next generation of explorers. The flagship of OET's exploratory programs is the Exploration Vessel Nautilus, on which annual expeditions are carried out in support of our mission. The ship is equipped with state-of-the-art satellite telecommunications "telepresence" technology that enables 24/7, worldwide, real-time access to the data being collected by the ship's remotely operated vehicles. It is this "live" access that affords OET and its partners the opportunity to engage and inspire audiences across the United States and abroad.
OET has formed partnerships with a wide-range of educational organizations that collectively offer life-time

  9. Real-time simulation of a Doubly-Fed Induction Generator based wind power system on eMEGASimRTM Real-Time Digital Simulator

    Science.gov (United States)

    Boakye-Boateng, Nasir Abdulai

    The growing demand for wind power integration into the generation mix prompts the need to subject these systems to stringent performance requirements. This study sought to identify the tools and procedures needed to perform real-time simulation studies of Doubly-Fed Induction Generator (DFIG) based wind generation systems, as a basis for more practical tests of reliability and performance for both grid-connected and islanded wind generation systems. The author focused on developing a platform for wind generation studies and, in addition, tested the performance of two DFIG models on the platform's real-time simulation model: an average SimpowerSystemsRTM DFIG wind turbine, and a detailed DFIG based wind turbine using ARTEMiSRTM components. The platform model implemented here consists of a high-voltage transmission system with four integrated wind farm models comprising a total of 65 DFIG based wind turbines; it was developed and tested on OPAL-RT's eMEGASimRTM Real-Time Digital Simulator.

  10. Reproductive timing and larval dispersal of intertidal crabs: the predator avoidance hypothesis Sincronía reproductiva y de dispersión larval en cangrejos intermareales: la hipótesis anti-depredador

    Directory of Open Access Journals (Sweden)

    JOHN H. CHRISTY

    2003-06-01

    Full Text Available Many intertidal and shallow water crabs have strong reproductive cycles and migratory larvae. Females release larvae near the time of high water of the larger-amplitude nocturnal tides during the semilunar or lunar cycles. Newly hatched larvae move quickly at night toward and into the sea where, weeks later, they develop into megalopae that then ride nocturnal flood tides inshore and up estuaries to settle in adult habitats. It was first thought that crabs might time larval release so that larvae would become megalopae when they could ride the larger-amplitude spring flood tides to adult habitats. This idea was rejected when it was found that there was no change in the timing of hatching during the breeding season by several estuarine species that would compensate for the decrease in the larval development period as the water temperature increased. In addition, megalopae moved upstream at night but not on the largest spring flood tides. Attention shifted to the possible value to larvae of leaving the estuary quickly to avoid high temperatures, low salinities or stranding. This idea was not supported when it was found that species on open coasts exhibit the same reproductive patterns as do estuarine species. Alternatively, by moving quickly to the ocean at night larvae may best escape visual planktivorous fishes that are especially abundant in shallow areas. This predator avoidance hypothesis has been broadly supported: species with larvae that are cryptic, spiny and better protected from predation lack both strong reproductive cycles and larval migration. The mechanisms that promote precise reproductive timing have been little studied. Evidence is presented that female fiddler crabs may adjust the timing of fertilization to compensate for variation in incubation temperatures that would otherwise induce timing errors. However, crabs on colder coasts, as in Chile, apparently do not exhibit biweekly or monthly cycles of larval release. 
The consequences

  11. A stochastic space-time rainfall forecasting system for real time flow forecasting I: Development of MTB conditional rainfall scenario generator

    Directory of Open Access Journals (Sweden)

    D. Mellor

    2000-01-01

    Full Text Available The need for the development of a method for generating an ensemble of rainfall scenarios, which are conditioned on the observed rainfall, and its place in the HYREX programme is discussed. A review of stochastic models for rainfall, and of rainfall forecasting techniques, is followed by a justification for the choice of the Modified Turning Bands (MTB) model in this context. This is a stochastic model of rainfall which is continuous over space and time, and which reproduces features of real rainfall fields at four distinct scales: raincells, cluster potential regions, rainbands and the overall outline of a storm at the synoptic scale. The model can be used to produce synthetic data sets in the same format as data from a radar. An inversion procedure for inferring a construction of the MTB model which generates a given sequence of radar images is described. This procedure is used to generate an ensemble of future rainfall scenarios which are consistent with a currently observed storm. The combination of deterministic modelling at the large scales and stochastic modelling at smaller scales, within the MTB model, makes the system particularly suitable for short-term forecasts. As the lead time increases, so too does the variability across the set of generated scenarios. Keywords: MTB model, space-time rainfall field model, rainfall radar, HYREX, real-time flow forecasting

  12. Unsupervised SBAS-DInSAR time series generation: a small brick for building a Supersites ecosystem

    Science.gov (United States)

    Casu, F.; De Luca, C.; Elefante, S.; Imperatore, P.; Lanari, R.; Manunta, M.; Zinno, I.; Farres, J.; Lengert, W.

    2013-12-01

    Differential SAR Interferometry (DInSAR) is an effective tool to detect and monitor ground displacements with centimeter accuracy. The geoscience communities, as well as those related to hazard monitoring and risk mitigation, make extensive use of DInSAR. They take advantage of the current huge amount of SAR data and will benefit from the incoming big data stream of the Sentinel-1 system. The availability of this information makes possible the generation of Earth's surface displacement maps and time series with large spatial coverage and long time spans and, often in conjunction with in-situ data, fosters advances in science. However, the management, processing and analysis of such a huge amount of data is expected to be the major bottleneck, particularly when crisis phases occur. The emerging need to create a common ecosystem in which data (spaceborne and in-situ), results and processing tools are shared is envisaged to be a successful way to address this problem and to contribute to the spread of information and knowledge. The Supersites initiative, as well as the ESA SuperSites Exploitation Platform (SSEP) through the ESA Grid Processing On Demand (G-POD) and Cloud Computing Operational Pilot (CIOP) projects, provides effective answers to this need. The existing tools for querying and analysing SAR data need to be redesigned both for processing big data and for quickly replying to simultaneous user requests, mainly during emergency situations. These requirements push for the development of automatic and unsupervised processing tools as well as of scalable, widely accessible and high-performance computing capabilities. The cloud-computing environment successfully responds to these objectives, particularly in the case of spike and peak requests for processing resources linked to disaster events. 
In this work we present a parallel computational model for the Small BAseline Subset (SBAS) DInSAR algorithm as it was implemented within the computing environment provided by the

  13. Testing competing forms of the Milankovitch hypothesis

    DEFF Research Database (Denmark)

    Kaufmann, Robert K.; Juselius, Katarina

    2016-01-01

    We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical...... ice volume and solar insolation. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship...... mechanisms postulated to drive glacial cycles. They show that the climate variables are driven partly by solar insolation, determining the timing and magnitude of glaciations and terminations, and partly by internal feedback dynamics, pushing the climate variables away from equilibrium. We argue...

  14. Eigenstate Thermalization Hypothesis and Quantum Thermodynamics

    Science.gov (United States)

    Olshanii, Maxim

    2009-03-01

    One of the open questions in quantum thermodynamics reads: how can linear quantum dynamics provide the chaos necessary for thermalization of an isolated quantum system? To this end, we perform an ab initio numerical analysis of a system of hard-core bosons on a lattice and show [Marcos Rigol, Vanja Dunjko & Maxim Olshanii, Nature 452, 854 (2008)] that the above controversy can be resolved via the Eigenstate Thermalization Hypothesis, suggested independently by Deutsch [J. M. Deutsch, Phys. Rev. A 43, 2046 (1991)] and Srednicki [M. Srednicki, Phys. Rev. E 50, 888 (1994)]. According to this hypothesis, in quantum systems thermalization happens in each individual eigenstate of the system separately, but it is hidden initially by coherences between them. In the course of the time evolution, the thermal properties become revealed through (linear) decoherence, which need not be chaotic.

  15. A high-fidelity weather time series generator using the Markov Chain process on a piecewise level

    Science.gov (United States)

    Hersvik, K.; Endrerud, O.-E. V.

    2017-12-01

    A method is developed for generating a set of unique weather time series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series. Statistical qualities here refer mainly to the distribution of weather windows available for work, including the durations and frequencies of such weather windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov process is used: small pieces of random-length time series are joined together, rather than individual weather states, each from a single time step, which is the common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
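A minimal sketch of the piecewise idea described above, assuming the weather series has been discretized into integer states (the function and parameter names are illustrative, not from the paper): random-length pieces are copied from the original series, each piece starting at a position whose state matches the end of the series built so far, so transitions remain ones observed in the data.

```python
import random

def piecewise_markov_series(original, n_out, min_len=6, max_len=24, seed=0):
    """Generate a synthetic series by joining random-length pieces of the
    original series; each piece starts where a copy of the current end
    state occurs, so every transition is one observed in the data."""
    rng = random.Random(seed)
    # index each state by the positions where it occurs (last one excluded,
    # since a piece must continue past its starting position)
    index = {}
    for i, s in enumerate(original[:-1]):
        index.setdefault(s, []).append(i)
    out = [original[0]]
    while len(out) < n_out:
        starts = index.get(out[-1])
        if not starts:                         # state never seen: restart anywhere
            start = rng.randrange(len(original) - 1)
        else:
            start = rng.choice(starts)
        length = rng.randint(min_len, max_len)  # random piece length
        out.extend(original[start + 1:start + 1 + length])
    return out[:n_out]
```

Because whole pieces are copied, short-range weather-window structure inside each piece is preserved exactly, which is the motivation for joining pieces rather than single time steps.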

  16. Effect of recovery time of fault current limiter on over current from distributed generator in micro grid after voltage sag

    Directory of Open Access Journals (Sweden)

    Daisuke Iioka

    2016-01-01

    Full Text Available This paper describes the effect of the recovery time of a fault current limiter on the over-current from a micro grid system interconnected to a power distribution system. We assumed that a semiconductor-type fault current limiter is installed between the micro grid (containing a synchronous generator) and the power distribution system, and measured, in laboratory experiments, the over-current following a voltage sag in the distribution system and the subsequent recovery of the fault current limiter. We found that introducing a recovery time for the fault current limiter after a voltage sag is useful for suppressing the over-current from the distributed generator.

  17. Real-time Distributed Economic Dispatch forDistributed Generation Based on Multi-Agent System

    DEFF Research Database (Denmark)

    Luo, Kui; Wu, Qiuwei; Nielsen, Arne Hejde

    2015-01-01

    The distributed economic dispatch for distributed generation is formulated as an optimization problem with equality and inequality constraints. An effective distributed approach based on a multi-agent system is proposed for solving the economic dispatch problem in this paper. The proposed approach...... consists of two stages. In the first stage, an adjacency average allocation algorithm is proposed to ensure the generation-demand equality. In the second stage, a local replicator dynamics algorithm is applied to achieve Nash equilibrium for the power dispatch game. The approach is implemented in a fully...
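The neighbour-averaging building block that an adjacency-based allocation stage relies on can be illustrated with a standard discrete-time average-consensus iteration; this is a generic sketch of that building block, not the paper's two-stage algorithm, and all names are illustrative.

```python
import numpy as np

def average_consensus(x0, adjacency, steps=200, eps=0.2):
    """Each agent repeatedly moves toward its neighbours' values,
    x <- x - eps * L x, with L the graph Laplacian. For a connected
    graph and eps below 2/lambda_max(L), all agents converge to the
    average of the initial values."""
    x = np.asarray(x0, dtype=float).copy()
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A        # graph Laplacian
    for _ in range(steps):
        x = x - eps * (L @ x)
    return x
```

Since the iteration preserves the sum of the agents' values, agreement on the average is exactly the kind of distributed bookkeeping needed to share a generation-demand mismatch without a central coordinator.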

  18. Vehicle Detection Based on Probability Hypothesis Density Filter

    Directory of Open Access Journals (Sweden)

    Feihu Zhang

    2016-04-01

    Full Text Available In the past decade, vehicle detection has improved significantly. By utilizing cameras, vehicles can be detected in Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR-based vehicle detection approach is proposed using the Probability Hypothesis Density (PHD) filter. The proposed approach consists of two phases: the hypothesis generation phase, to detect potential objects, and the hypothesis verification phase, to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state-of-the-art.

  19. Conditions and timing of moderate and radical diaspora mobilization: evidence from conflict-generated diasporas

    NARCIS (Netherlands)

    Koinova, M.

    2009-01-01

    Based on extensive research among conflict-generated diasporas — Albanians, Armenians, Lebanese, Serbians, Ukrainians, and Chechens predominantly living in the U.S. — I argue here that academics and policy-makers alike need to revisit the notion that diasporas are not likely agents of moderate

  20. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    Science.gov (United States)

    Dunn, H. J.

    1981-01-01

    A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power-density spectrum (PDS) of time history data. The program interfaces with the Advanced Continuous Simulation Language (ACSL) so that a frequency analysis may be performed on ACSL-generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
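The FFT-based computation the report describes can be sketched generically; the following periodogram-style estimator is a stand-in for the core calculation, not the ACSL-interfaced program itself.

```python
import numpy as np

def power_density_spectrum(x, dt):
    """One-sided periodogram estimate of the power-density spectrum
    of a uniformly sampled time history x with sample spacing dt."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    X = np.fft.rfft(x - x.mean())           # remove mean, transform
    psd = (2.0 * dt / n) * np.abs(X) ** 2   # scale to power per unit frequency
    psd[0] /= 2.0                           # DC bin is not doubled
    freqs = np.fft.rfftfreq(n, dt)
    return freqs, psd
```

For a pure 5 Hz sine sampled at 100 Hz over an integer number of cycles, the estimate peaks exactly at 5 Hz, and integrating the PSD over frequency recovers the signal variance of 0.5.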

  1. A test of the orthographic recoding hypothesis

    Science.gov (United States)

    Gaygen, Daniel E.

    2003-04-01

    The Orthographic Recoding Hypothesis [D. E. Gaygen and P. A. Luce, Percept. Psychophys. 60, 465-483 (1998)] was tested. According to this hypothesis, listeners recognize spoken words heard for the first time by mapping them onto stored representations of the orthographic forms of the words. Listeners have a stable orthographic representation of words, but no phonological representation, when those words have been read frequently but never heard or spoken. Such may be the case for low frequency words such as jargon. Three experiments using visually and auditorily presented nonword stimuli tested this hypothesis. The first two experiments were explicit tests of memory (old-new tests) for words presented visually. In the first experiment, the recognition of auditorily presented nonwords was facilitated when they previously appeared on a visually presented list. The second experiment was similar, but included a concurrent articulation task during a visual word list presentation, thus preventing covert rehearsal of the nonwords. The results were similar to the first experiment. The third experiment was an indirect test of memory (auditory lexical decision task) for visually presented nonwords. Auditorily presented nonwords were identified as nonwords significantly more slowly if they had previously appeared on the visually presented list accompanied by a concurrent articulation task.

  2. Antiaging therapy: a prospective hypothesis

    Directory of Open Access Journals (Sweden)

    Shahidi Bonjar MR

    2015-01-01

    Full Text Available Mohammad Rashid Shahidi Bonjar,1 Leyla Shahidi Bonjar2 1School of Dentistry, Kerman University of Medical Sciences, Kerman, Iran; 2Department of Pharmacology, College of Pharmacy, Kerman University of Medical Sciences, Kerman, Iran Abstract: This hypothesis proposes a new prospective approach to slow the aging process in older humans. The hypothesis could lead to developing new treatments for age-related illnesses and help humans to live longer. This hypothesis has no previous documentation in scientific media and has no protocol. Scientists have presented evidence that systemic aging is influenced by peculiar molecules in the blood. Researchers at Albert Einstein College of Medicine, New York, and Harvard University in Cambridge discovered an elevated titer of aging-related molecules (ARMs) in blood, which trigger a cascade of aging processes in mice; they also indicated that the process can be reduced or even reversed. By inhibiting the production of ARMs, they could reduce age-related cognitive and physical declines. The present hypothesis offers a new approach to translate these findings into medical treatment: extracorporeal adjustment of ARMs would lead to slower rates of aging. A prospective “antiaging blood filtration column” (AABFC) is a nanotechnological device that would fulfill the central role in this approach. An AABFC would set a near-youth homeostatic titer of ARMs in the blood. In this regard, the AABFC immobilizes ARMs from the blood while blood passes through the column. The AABFC harbors antibodies against ARMs. ARM antibodies would be conjugated irreversibly to ARMs on the contact surfaces of the reaction platforms inside the AABFC until near-youth homeostasis is attained. The treatment is performed with the aid of a blood-circulating pump. Similar to a renal dialysis machine, blood would circulate from the body to the AABFC and from there back to the body in a closed circuit until ARMs were sufficiently depleted from the blood. The

  3. A durkheimian hypothesis on stress.

    Science.gov (United States)

    Mestrovic, S; Glassner, B

    1983-01-01

    Commonalities among the events that appear on life events lists and among the types of social supports which have been found to reduce the likelihood of illness are reviewed in the life events literature in an attempt to find a context within sociological theory. Social integration seems to underlie the stress-illness process. In seeking a tradition from which to understand these facts, we selected Durkheim's works in the context of the homo duplex concept wherein social integration involves the interplay of individualism and social forces. After presenting a specific hypothesis for the stress literature, the paper concludes with implications and suggestions for empirical research.

  4. ARIMA-Based Time Series Model of Stochastic Wind Power Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Pedersen, Troels; Bak-Jensen, Birgitte

    2010-01-01

    This paper proposes a stochastic wind power model based on an autoregressive integrated moving average (ARIMA) process. The model takes into account the nonstationarity and physical limits of stochastic wind power generation. The model is constructed based on wind power measurement of one year from...... the Nysted offshore wind farm in Denmark. The proposed limited-ARIMA (LARIMA) model introduces a limiter and characterizes the stochastic wind power generation by mean level, temporal correlation and driving noise. The model is validated against the measurement in terms of temporal correlation...... and probability distribution. The LARIMA model outperforms a first-order transition matrix based discrete Markov model in terms of temporal correlation, probability distribution and model parameter number. The proposed LARIMA model is further extended to include the monthly variation of the stochastic wind power...
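The limiter idea can be illustrated with a toy first-order autoregressive process hard-limited to the physical power range. This is a simplified stand-in for the paper's LARIMA model, not its actual specification; the per-unit parameters (mean level, AR coefficient, noise level) are illustrative.

```python
import numpy as np

def simulate_limited_ar1(n, mean_level, phi, sigma, p_max, seed=0):
    """Simulate wind power as an AR(1) process around a mean level,
    hard-limited to the physical range [0, p_max] (cf. the LARIMA idea
    of adding a limiter to an ARIMA process)."""
    rng = np.random.default_rng(seed)
    p = np.empty(n)
    x = 0.0                                       # deviation from mean level
    for t in range(n):
        x = phi * x + rng.normal(0.0, sigma)      # temporal correlation + noise
        p[t] = min(max(mean_level + x, 0.0), p_max)  # physical limits
    return p
```

The limiter keeps every sample inside the feasible power range while the AR term preserves the strong lag-one correlation that distinguishes wind power from white noise.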

  5. A low noise clock generator for high-resolution time-to-digital convertors

    International Nuclear Information System (INIS)

    Prinzie, J.; Leroux, P.; Christiaensen, J.; Moreira, P.; Steyaert, M.

    2016-01-01

    A robust PLL clock generator has been designed for the harsh environment in high-energy physics applications. The PLL operates with a reference clock frequency of 40 MHz to 50 MHz and performs a multiplication by 64. An LC tank VCO with low internal phase noise can generate a frequency from 2.2 GHz up to 3.2 GHz with internal discrete bank switching. The PLL includes an automatic bank selection algorithm to correctly select the correct range of the oscillator. The PLL has been fabricated in a 65 nm CMOS technology and consumes less than 30 mW. The additive jitter of the PLL has been measured to be less than 400 fs RMS.

  6. A low noise clock generator for high-resolution time-to-digital convertors

    Science.gov (United States)

    Prinzie, J.; Christiaensen, J.; Moreira, P.; Steyaert, M.; Leroux, P.

    2016-02-01

    A robust PLL clock generator has been designed for the harsh environment in high-energy physics applications. The PLL operates with a reference clock frequency of 40 MHz to 50 MHz and performs a multiplication by 64. An LC tank VCO with low internal phase noise can generate a frequency from 2.2 GHz up to 3.2 GHz with internal discrete bank switching. The PLL includes an automatic bank selection algorithm to correctly select the correct range of the oscillator. The PLL has been fabricated in a 65 nm CMOS technology and consumes less than 30 mW. The additive jitter of the PLL has been measured to be less than 400 fs RMS.

  7. A generation lost? Prolonged effects of labour market entry in times of high unemployment in the Netherlands

    NARCIS (Netherlands)

    Wolbers, M.H.J.

    2016-01-01

    After the economic crisis of the 1980s, concerns arose about whether the high youth unemployment at that time would produce a 'lost generation' of young people in the Netherlands. The same concerns have recently arisen about the potential effects of the current high rate of youth unemployment. The

  8. Forecast generation for real-time control of urban drainage systems using greybox modelling and radar rainfall

    DEFF Research Database (Denmark)

    Löwe, Roland; Mikkelsen, Peter Steen; Madsen, Henrik

    2012-01-01

    We present stochastic flow forecasts to be used in a real-time control setup for urban drainage systems. The forecasts are generated using greybox models with rain gauge and radar rainfall observations as input. Predictions are evaluated as intervals rather than just mean values. We obtain...

  9. Time-courses of lung function and respiratory muscle pressure generating capacity after spinal cord injury: A prospective cohort study.

    NARCIS (Netherlands)

    Mueller, G.; de Groot, S.; van der Woude, L.H.V.; Hopman, M.T.

    2008-01-01

    Objective: To investigate the time-courses of lung function and respiratory muscle pressure generating capacity after spinal cord injury. Design: Multi-centre, prospective cohort study. Subjects: One hundred and nine subjects with recent, motor complete spinal cord injury. Methods: Lung function and

  10. Time-courses of lung function and respiratory muscle pressure generating capacity after spinal cord injury : a prospective cohort study

    NARCIS (Netherlands)

    Mueller, Gabi; de Groot, Sonja; van der Woude, Lucas; Hopman, Maria T E

    OBJECTIVE: To investigate the time-courses of lung function and respiratory muscle pressure generating capacity after spinal cord injury. DESIGN: Multi-centre, prospective cohort study. SUBJECTS: One hundred and nine subjects with recent, motor complete spinal cord injury. METHODS: Lung function and

  11. Time-courses of lung function and respiratory muscle pressure generating capacity after spinal cord injury: a prospective cohort study.

    NARCIS (Netherlands)

    Muller, G.; Groot, S. de; Woude, L.H.V. van der; Hopman, M.T.E.

    2008-01-01

    OBJECTIVE: To investigate the time-courses of lung function and respiratory muscle pressure generating capacity after spinal cord injury. DESIGN: Multi-centre, prospective cohort study. SUBJECTS: One hundred and nine subjects with recent, motor complete spinal cord injury. METHODS: Lung function and

  12. Customised Column Generation for Rostering Problems: Using Compile-time Customisation to create a Flexible C++ Engine for Staff Rostering

    DEFF Research Database (Denmark)

    Mason, Andrew J.; Ryan, David; Hansen, Anders Dohn

    2009-01-01

    This paper describes a new approach for easily creating customised staff rostering column generation programs. In previous work, we have built a large very flexible software system which is tailored at run time to meet the particular needs of a client. This system has proven to be very capable, b...

  13. Form-Aware, Real-Time Adaptive Music Generation for Interactive Experiences

    OpenAIRE

    Aspromallis, C.; Gold, N. E.

    2016-01-01

    Many experiences offered to the public through interactive theatre, theme parks, video games, and virtual environments use music to complement the participants’ activity. There is a range of approaches to this, from straightforward playback of ‘stings’, to looped phrases, to on-the-fly note generation. Within the latter, traditional genres and forms are often not represented, with the music instead being typically loose in form and structure. We present work in progress on a new method for re...

  14. Mood Expression in Real-Time Computer Generated Music using Pure Data

    OpenAIRE

    Scirea, Marco; Nelson, Mark; Cheong, Yun-Gyung; Bae, Byung Chull

    2014-01-01

    This paper presents an empirical study that investigated if procedurally generated music based on a set of musical features can elicit a target mood in the music listener. Drawn from the two-dimensional affect model proposed by Russell, the musical features that we have chosen to express moods are intensity, timbre, rhythm, and dissonances. The eight types of mood investigated in this study are being bored, content, happy, miserable, tired, fearful, peaceful, and alarmed. We created 8 short m...

  15. A New Generation of Real-Time Systems in the JET Tokamak

    Science.gov (United States)

    Alves, Diogo; Neto, Andre C.; Valcarcel, Daniel F.; Felton, Robert; Lopez, Juan M.; Barbalace, Antonio; Boncagni, Luca; Card, Peter; De Tommasi, Gianmaria; Goodyear, Alex; Jachmich, Stefan; Lomas, Peter J.; Maviglia, Francesco; McCullen, Paul; Murari, Andrea; Rainford, Mark; Reux, Cedric; Rimini, Fernanda; Sartori, Filippo; Stephen, Adam V.; Vega, Jesus; Vitelli, Riccardo; Zabeo, Luca; Zastrow, Klaus-Dieter

    2014-04-01

    Recently, a new recipe for developing and deploying real-time systems has been increasingly adopted in the JET tokamak. Powered by the advent of x86 multi-core technology and the reliability of JET's well-established Real-Time Data Network (RTDN) to handle all real-time I/O, an official Linux vanilla kernel has been demonstrated to be able to provide real-time performance to user-space applications that are required to meet stringent timing constraints. In particular, a careful rearrangement of the Interrupt ReQuest (IRQ) affinities together with the kernel's CPU isolation mechanism allows one to obtain either soft or hard real-time behavior depending on the synchronization mechanism adopted. Finally, the Multithreaded Application Real-Time executor (MARTe) framework is used for building applications particularly optimised for exploiting multi-core architectures. In the past year, four new systems based on this philosophy have been installed and are now part of JET's routine operation. The focus of the present work is on the configuration aspects that enable these new systems' real-time capability. Details are given about the common real-time configuration of these systems, followed by a brief description of each system together with results regarding their real-time performance. A cycle-time jitter analysis of a user-space MARTe based application synchronizing over a network is also presented. The goal is to compare its deterministic performance while running on a vanilla and on a Messaging Real-time Grid (MRG) Linux kernel.

  16. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.

  17. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    International Nuclear Information System (INIS)

    Arndt, S.A.

    1997-01-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities

  18. Diamond detectors for time-of-flight measurements in laser-generated plasmas

    Czech Academy of Sciences Publication Activity Database

    Torrisi, L.; Margarone, D.; Milani, E.; Verona-Rinati, G.; Prestopino, G.; Tuvè, C.; Potenza, R.; Láska, Leoš; Krása, Josef; Ullschmied, Jiří

    2009-01-01

    Roč. 164, 5-6 (2009), s. 369-375 ISSN 1042-0150. [Workshop on European Collaboration for Higher Education and Research in Nuclear Engineering and Radiological Protection /4./. Favignana, 26.05.2008-28.05.2008] R&D Projects: GA MŠk(CZ) LC528 Institutional research plan: CEZ:AV0Z10100523; CEZ:AV0Z20430508 Keywords : diamond detector * laser-generated plasma * x-ray detection Subject RIV: BL - Plasma and Gas Discharge Physics Impact factor: 0.550, year: 2009

  19. Mood Expression in Real-Time Computer Generated Music using Pure Data

    DEFF Research Database (Denmark)

    Scirea, Marco; Nelson, Mark; Cheong, Yun-Gyung

    2014-01-01

This paper presents an empirical study that investigated if procedurally generated music based on a set of musical features can elicit a target mood in the music listener. Drawn from the two-dimensional affect model proposed by Russell, the musical features that we have chosen to express moods are intensity, timbre, rhythm, and dissonances. The eight types of mood investigated in this study are being bored, content, happy, miserable, tired, fearful, peaceful, and alarmed. We created 8 short music clips using the PD (Pure Data) programming language, each of which represents a particular mood. We carried out a pilot study and present a preliminary result.

  20. Simulation of light generation in cholesteric liquid crystals using kinetic equations: Time-independent solution

    Energy Technology Data Exchange (ETDEWEB)

    Shtykov, N. M., E-mail: nshtykov@mail.ru; Palto, S. P.; Umanskii, B. A. [Russian Academy of Sciences, Shubnikov Institute of Crystallography (Russian Federation)

    2013-08-15

We report the results of calculating the conditions for light generation in cholesteric liquid crystals doped with fluorescent dyes using kinetic equations. Specific features of the spectral properties of the chiral cholesteric medium as a photonic structure and the spatially distributed feedback in the active medium are taken into account. An expression is derived for the threshold pump radiation intensity as a function of the dye concentration and sample thickness. The importance of taking into account the distributed loss level in the active medium for calculating the optimal parameters of the medium and for matching the calculated values with the results of experiments is demonstrated.

  1. Optimal performance at arbitrary power of minimally nonlinear irreversible thermoelectric generators with broken time-reversal symmetry

    Science.gov (United States)

    Zhang, Rong; Liu, Wei; Li, Qianwen; Zhang, Lei; Bai, Long

    2018-01-01

    We investigate the performance at arbitrary power of minimally nonlinear irreversible thermoelectric generators (MNITGs) with broken time-reversal symmetry within linear irreversible thermodynamics, and the efficiency of MNITGs at arbitrary power is analytically derived. Furthermore, a universal bound on the efficiency of thermoelectric generators (TGs) with broken time-reversal symmetry and the arbitrary power is obtained. Some system-specific characteristics are discussed and uncovered. A large efficiency at arbitrary power can also be achieved via the cooperative mechanism between the system parameters. Our results indicate that the broken time-reversal symmetry provides the physically allowed degrees of freedom for tuning the performance of thermoelectric devices, and the physical trade-off region between the efficiency and the power output can also offer the appropriate space for optimizing the performance of TGs.

  2. Photonic generation of FCC-compliant UWB pulses based on modified Gaussian quadruplet and incoherent wavelength-to-time conversion

    Science.gov (United States)

    Mu, Hongqian; Wang, Muguang; Tang, Yu; Zhang, Jing; Jian, Shuisheng

    2018-03-01

    A novel scheme for the generation of FCC-compliant UWB pulse is proposed based on modified Gaussian quadruplet and incoherent wavelength-to-time conversion. The modified Gaussian quadruplet is synthesized based on linear sum of a broad Gaussian pulse and two narrow Gaussian pulses with the same pulse-width and amplitude peak. Within specific parameter range, FCC-compliant UWB with spectral power efficiency of higher than 39.9% can be achieved. In order to realize the designed waveform, a UWB generator based on spectral shaping and incoherent wavelength-to-time mapping is proposed. The spectral shaper is composed of a Gaussian filter and a programmable filter. Single-mode fiber functions as both dispersion device and transmission medium. Balanced photodetection is employed to combine linearly the broad Gaussian pulse and two narrow Gaussian pulses, and at same time to suppress pulse pedestals that result in low-frequency components. The proposed UWB generator can be reconfigured for UWB doublet by operating the programmable filter as a single-band Gaussian filter. The feasibility of proposed UWB generator is demonstrated experimentally. Measured UWB pulses match well with simulation results. FCC-compliant quadruplet with 10-dB bandwidth of 6.88-GHz, fractional bandwidth of 106.8% and power efficiency of 51% is achieved.
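The waveform construction the abstract describes, a linear sum of one broad Gaussian and two narrow Gaussians of equal width and peak amplitude, with balanced photodetection combining them and suppressing pedestals, can be sketched numerically. This is only an illustration: the pulse widths, the symmetric offsets, and the subtraction sign convention below are assumptions, not the paper's parameters.

```python
import math

def gaussian(t, t0, width):
    """Unit-amplitude Gaussian centered at t0."""
    return math.exp(-((t - t0) ** 2) / (2.0 * width ** 2))

def modified_quadruplet(t, broad_w=0.35, narrow_w=0.08, offset=0.25, a=1.0):
    """Hypothetical modified Gaussian quadruplet: one broad Gaussian
    minus two narrow Gaussians of equal width and peak amplitude,
    symmetrically offset. The subtraction mimics balanced detection."""
    return (gaussian(t, 0.0, broad_w)
            - a * gaussian(t, -offset, narrow_w)
            - a * gaussian(t, +offset, narrow_w))

# Sample the waveform on t in [-1, 1] (arbitrary time units)
samples = [modified_quadruplet(-1.0 + 0.01 * i) for i in range(201)]
```

With these illustrative parameters the waveform has a positive central lobe flanked by two negative lobes, the overall shape a quadruplet-type UWB pulse needs before spectral shaping.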

  3. The conscious access hypothesis: Explaining the consciousness.

    Science.gov (United States)

    Prakash, Ravi

    2008-01-01

The phenomenon of conscious awareness or consciousness is complicated but fascinating. Although this concept has intrigued mankind since antiquity, exploration of consciousness from a scientific perspective is not very old. Among the myriad of theories regarding the nature, functions and mechanism of consciousness, of late, cognitive theories have received wider acceptance. One of the most exciting hypotheses in recent times has been the "conscious access hypothesis" based on the "global workspace model of consciousness". It underscores an important property of consciousness, the global access of information in the cerebral cortex. The present article reviews the "conscious access hypothesis" in terms of its theoretical underpinnings as well as the experimental support it has received.

  4. Cooperating Expert Systems for the Next Generation of Real-time Monitoring Applications

    Science.gov (United States)

    Schwuttke, U.; Veregge, J.; Quan, A.

    1995-01-01

    A distributed monitoring and diagnosis system has been developed and successfully applied to real-time monitoring of interplanetary spacecraft at NASA's Jet Propulsion Laboratory. This system uses a combination of conventional processing and artificial intelligence.

  5. Preventive maintenance: optimization of time - based discard decisions at the bruce nuclear generating station

    International Nuclear Information System (INIS)

    Doyle, E.K.; Jardine, A.K.S.

    2001-01-01

The use of various maintenance optimization techniques at Bruce has led to cost-effective preventive maintenance applications for complex systems. As previously reported at ICONE 6 in New Orleans, 1996, several innovative practices reduced Reliability Centered Maintenance costs while maintaining the accuracy of the analysis. The optimization strategy has undergone further evolution, and at present an Integrated Maintenance Program (IMP) is in place in which an Expert Panel consisting of all players/experts proceeds through each system in a disciplined fashion and reaches agreement on all items within a rigorous time frame. It is well known that there are essentially 3 maintenance-based actions that can flow from a Maintenance Optimization Analysis: condition based maintenance, time based maintenance and time based discard. The present effort deals with time based discard decisions. Maintenance data from the Remote On-Power Fuel Changing System was used. (author)

  6. A methodology for the stochastic generation of hourly synthetic direct normal irradiation time series

    Science.gov (United States)

    Larrañeta, M.; Moreno-Tejera, S.; Lillo-Bravo, I.; Silva-Pérez, M. A.

    2018-02-01

Many of the available solar radiation databases only provide global horizontal irradiance (GHI), while there is a growing need for extensive databases of direct normal irradiance (DNI), mainly for the development of concentrated solar power and concentrated photovoltaic technologies. In the present work, we propose a methodology for the generation of synthetic DNI hourly data from hourly average GHI values by dividing the irradiance into a deterministic and a stochastic component, intending to emulate the dynamics of the solar radiation. The deterministic component is modeled through a simple classical model. The stochastic component is fitted to measured data in order to maintain the consistency of the synthetic data with the state of the sky, generating statistically significant DNI data with a cumulative frequency distribution very similar to that of the measured data. The adaptation and application of the model to the location of Seville shows significant improvements in terms of frequency distribution over the classical models. The proposed methodology, applied to other locations with different climatological characteristics, yields better results than the classical models in terms of frequency distribution, reaching a reduction of 50% in the Finkelstein-Schafer (FS) and Kolmogorov-Smirnov test integral (KSI) statistics.
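The deterministic-plus-stochastic decomposition described above can be sketched in a few lines. This is a minimal illustration, not the paper's model: the clear-sky profile, the residual spread `sigma`, and all numeric values are hypothetical stand-ins for components that would, in practice, be fitted to measured data.

```python
import math
import random

def clearsky_dni(hour, dni_max=900.0):
    """Hypothetical deterministic component: a simple clear-sky DNI
    profile (W/m^2) peaking at solar noon, zero at night."""
    if hour < 6 or hour > 18:
        return 0.0
    return dni_max * math.sin(math.pi * (hour - 6) / 12.0)

def synthetic_dni_series(hours, sigma=120.0, seed=0):
    """Deterministic component plus a stochastic residual whose spread
    (sigma, W/m^2) would be fitted to the measured sky-state data."""
    rng = random.Random(seed)
    series = []
    for h in hours:
        det = clearsky_dni(h)
        sto = rng.gauss(0.0, sigma) if det > 0 else 0.0
        series.append(max(0.0, det + sto))  # DNI cannot be negative
    return series

dni = synthetic_dni_series(range(24))  # one synthetic day, hourly values
```

The point of the split is that the deterministic part carries the solar geometry while the stochastic part carries the day-to-day variability that shapes the cumulative frequency distribution.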

  7. Hypothesis-driven physical examination curriculum.

    Science.gov (United States)

    Allen, Sharon; Olson, Andrew; Menk, Jeremiah; Nixon, James

    2017-12-01

Medical students traditionally learn physical examination skills as a rote list of manoeuvres. Alternatives like hypothesis-driven physical examination (HDPE) may promote students' understanding of the contribution of the physical examination to diagnostic reasoning. We sought to determine whether first-year medical students can effectively learn to perform a physical examination using an HDPE approach, and then tailor the examination to specific clinical scenarios. First-year medical students at the University of Minnesota were taught both traditional and HDPE approaches during a required 17-week clinical skills course in their first semester. The end-of-course evaluation assessed HDPE skills: students were assigned one of two cardiopulmonary cases. Each case included two diagnostic hypotheses. During an interaction with a standardised patient, students were asked to select physical examination manoeuvres in order to make a final diagnosis. Items were weighted and selection order was recorded. First-year students with minimal pathophysiology knowledge performed well. All students selected the correct diagnosis. Importantly, students varied the order when selecting examination manoeuvres depending on the diagnoses under consideration, demonstrating early clinical decision-making skills. An early introduction to HDPE may reinforce physical examination skills for hypothesis generation and testing, and can foster early clinical decision-making skills. This has important implications for further research in physical examination instruction. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  8. Propensity score estimation to address calendar time-specific channeling in comparative effectiveness research of second generation antipsychotics.

    Directory of Open Access Journals (Sweden)

    Stacie B Dusetzina

Full Text Available Channeling occurs when a medication and its potential comparators are selectively prescribed based on differences in underlying patient characteristics. Drug safety advisories can provide new information regarding the relative safety or effectiveness of a drug product, which might increase selective prescribing. In particular, when reported adverse effects vary among drugs within a therapeutic class, clinicians may channel patients toward or away from a drug based on the patient's underlying risk for an adverse outcome. If channeling is not identified and appropriately managed, it might lead to confounding in observational comparative effectiveness studies. To demonstrate channeling among new users of second generation antipsychotics following a Food and Drug Administration safety advisory, and to evaluate the impact of channeling on cardiovascular risk estimates over time, we used Florida Medicaid data from 2001-2006 and a retrospective cohort of adults initiating second generation antipsychotics. We used propensity scores to match olanzapine initiators with other second generation antipsychotic initiators. To evaluate channeling away from olanzapine following an FDA safety advisory, we estimated calendar time-specific propensity scores. We compare the performance of these calendar time-specific propensity scores with conventionally-estimated propensity scores on estimates of cardiovascular risk. Increased channeling away from olanzapine was evident for some, but not all, cardiovascular risk factors and corresponded with the timing of the FDA advisory. Covariate balance was optimized within period and across all periods when using the calendar time-specific propensity score. Hazard ratio estimates for cardiovascular outcomes did not differ across models (conventional PS: 0.97, 95% CI: 0.81-3.18 versus calendar time-specific PS: 0.93, 95% CI: 0.77-3.04). Changes in channeling over time were evident for several covariates but had limited impact on cardiovascular risk estimates.
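The core idea of calendar time-specific propensity scores, fitting a separate propensity model within each calendar period so that covariate-treatment associations can shift after an advisory, can be sketched as follows. The toy gradient-descent logistic model, the single covariate, and the simulated records are all hypothetical stand-ins for the study's actual estimation.

```python
import math
import random

def fit_logistic(X, y, lr=0.5, steps=300):
    """Minimal logistic regression via gradient descent, standing in
    for a proper propensity-score model; returns [intercept, *weights]."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            grad[0] += p - yi
            for j, xj in enumerate(xi):
                grad[j + 1] += (p - yi) * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

def propensity(w, xi):
    """Predicted probability of treatment given covariates xi."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Calendar time-specific estimation: one propensity model per period,
# so covariate-treatment associations may differ pre/post advisory.
rng = random.Random(1)
records = [(rng.choice([0, 1]), [rng.random()], period)
           for period in ("pre", "post") for _ in range(50)]
models = {}
for period in ("pre", "post"):
    sub = [(t, x) for t, x, p in records if p == period]
    models[period] = fit_logistic([x for _, x in sub], [t for t, _ in sub])
ps = propensity(models["post"], [0.5])  # score for a hypothetical patient
```

Matching would then pair treated and comparator patients on these within-period scores, which is what restores covariate balance "within period and across all periods" in the abstract's terms.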

  9. Applying the cost of generating force hypothesis to uphill running

    Directory of Open Access Journals (Sweden)

    Wouter Hoogkamer

    2014-07-01

Full Text Available Historically, several different approaches have been applied to explain the metabolic cost of uphill human running. Most of these approaches result in unrealistically high values for the efficiency of performing vertical work during running uphill, or are only valid for running up steep inclines. The purpose of this study was to reexamine the metabolic cost of uphill running, based upon our understanding of level running energetics and ground reaction forces during uphill running. In contrast to the vertical efficiency approach, we propose that during incline running at a certain velocity, the forces (and hence metabolic energy) required for braking and propelling the body mass parallel to the running surface are less than during level running. Based on this idea, we propose that the metabolic rate during uphill running can be predicted by a model, which posits that (1) the metabolic cost of perpendicular bouncing remains the same as during level running, (2) the metabolic cost of running parallel to the running surface decreases with incline, (3) the delta efficiency of producing mechanical power to lift the COM vertically is constant, independent of incline and running velocity, and (4) the costs of leg and arm swing do not change with incline. To test this approach, we collected ground reaction force (GRF) data for eight runners who ran thirty 30-second trials (velocity: 2.0–3.0 m/s; incline: 0–9°). We also measured the metabolic rates of eight different runners for seventeen 7-minute trials (velocity: 2.0–3.0 m/s; incline: 0–8°). During uphill running, parallel braking GRF approached zero for the 9° incline trials. Thus, we modeled the metabolic cost of parallel running as exponentially decreasing with incline. With that assumption, best-fit parameters for the metabolic rate data indicate that the efficiency of producing mechanical power to lift the center of mass vertically was independent of incline and running velocity, with a value of ∼29%.
The metabolic cost of uphill running is not simply equal to the sum of the cost of level running and the cost of performing work to lift the body mass against gravity. Rather, it reflects a constant cost of perpendicular bouncing, decreased costs of parallel braking and propulsion, and of course the cost of lifting the body mass against gravity.
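The four-part cost model above can be sketched as a single function. Only the ~29% delta efficiency comes from the abstract; the cost coefficients, the exponential decay rate, and the units convention are hypothetical placeholders for the study's best-fit parameters.

```python
import math

def uphill_metabolic_rate(v, incline_deg, c_bounce=3.0, c_parallel=2.0,
                          k=0.3, delta_eff=0.29):
    """Sketch of the abstract's model (per kg of body mass), with
    hypothetical coefficients:
    (1) perpendicular bouncing: constant, as in level running
    (2) parallel braking/propulsion: decays exponentially with incline
    (3) vertical lifting: mechanical power / delta efficiency (~29%)
    (4) leg/arm swing: folded into the constants (incline-independent)
    """
    theta = math.radians(incline_deg)
    bounce = c_bounce * v
    parallel = c_parallel * v * math.exp(-k * incline_deg)
    vertical_power = 9.81 * v * math.sin(theta)  # mechanical power, W/kg
    return bounce + parallel + vertical_power / delta_eff

level = uphill_metabolic_rate(2.5, 0)   # level running baseline
steep = uphill_metabolic_rate(2.5, 9)   # parallel cost nearly gone
```

Even with the parallel cost nearly vanishing at 9°, the vertical lifting term divided by the ~29% efficiency dominates, so predicted cost still rises with incline, consistent with the abstract's conclusion.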

  10. Supporting hypothesis generation by learners exploring an interactive computer simulation

    NARCIS (Netherlands)

    van Joolingen, Wouter; de Jong, Anthonius J.M.

    1992-01-01

    Computer simulations provide environments enabling exploratory learning. Research has shown that these types of learning environments are promising applications of computer assisted learning but also that they introduce complex learning settings, involving a large number of learning processes. This

  11. Oncology In Vivo Data Integration for Hypothesis Generation

    Directory of Open Access Journals (Sweden)

    Wei Jia

    2012-06-01

Full Text Available AstraZeneca’s Oncology in vivo data integration platform brings together multidimensional data from animal model efficacy, pharmacokinetic and pharmacodynamic studies with animal model profiling data and public in vivo studies. Using this platform, scientists can cluster model efficacy and model profiling data together, quickly identify responder profiles and correlate molecular characteristics to pharmacological response. Through meta-analysis, scientists can compare pharmacology between single and combination treatments, and between different drug schedules and administration routes.

  12. Hypothesis Generation in Quality Improvement Projects: Approaches for Exploratory Studies

    NARCIS (Netherlands)

    de Mast, J.; Bergman, M.

    2006-01-01

    In quality improvement projects - such as Six Sigma projects - an exploratory phase can be discerned, during which possible causes, influence factors or variation sources are identified. In a subsequent confirmatory phase the effects of these possible causes are experimentally verified. Whereas the

  13. Assessing Threat Detection Scenarios through Hypothesis Generation and Testing

    Science.gov (United States)

    2015-12-01

    task of playing chess with that of diagnosing medical conditions. Whereas both chess players and physicians must recognize patterns among a fixed...additional cues, all while maintaining these details in memory. Thus, the common research finding that chess experts can hold and process more information...Experiment Building Language (PEBL; Mueller, 2009) program was used to control the computer exercises. Soldiers completed the experiment by viewing

  14. Generation time, net reproductive rate, and growth in stage-age-structured populations

    DEFF Research Database (Denmark)

    Steiner, Uli; Tuljapurkar, Shripad; Coulson, Tim

    2014-01-01

…to age-structured populations. Here we generalize this result to populations structured by stage and age by providing a new, unique measure of reproductive timing (Tc) that, along with net reproductive rate (R0), has a direct mathematical relationship to and approximates growth rate (r). We use simple examples to show how reproductive timing Tc and level R0 are shaped by stage dynamics (individual trait changes), selection on the trait, and parent-offspring phenotypic correlation. We also show how population structure can affect dispersion in reproduction among ages and stages. These macroscopic features of the life history determine population growth rate r and reveal a complex interplay of trait dynamics, timing, and level of reproduction. Our results contribute to a new framework of population and evolutionary dynamics in stage-and-age-structured populations.
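The relationship the abstract invokes is, in its simplest classical form, r ≈ ln(R0)/T, where T is a generation-time measure. A minimal sketch of that approximation follows; the scalar `Tc` here is just a stand-in generation time, not the paper's stage-age-structured measure, and the numbers are illustrative.

```python
import math

def approx_growth_rate(R0, Tc):
    """Classic approximation r ≈ ln(R0) / T linking net reproductive
    rate R0 and a generation-time measure T to population growth rate r.
    The paper's Tc is a specific stage-age measure; any reproductive
    timing measure is accepted here as a stand-in."""
    return math.log(R0) / Tc

# A population that doubles each generation (R0 = 2) with a
# generation time of 5 time units:
r = approx_growth_rate(R0=2.0, Tc=5.0)
```

The approximation is exact only when all reproduction is concentrated at age T; dispersion of reproduction among ages and stages, which the abstract emphasizes, is exactly what perturbs it.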

  15. Time-of-flight isotope separator for a second-generation ISOL facility

    CERN Document Server

    Jacquot, B

    2003-01-01

We focus on the study of a low-energy, high-resolving-power separator dedicated to an exotic isotope accelerator facility. The approach is based on the use of a time-of-flight technique in a long isochronous section. Different ion species are bunched and then separated in time in an energy-isochronous section. We then transform the time shift into a transverse shift using a chopper, in order to eliminate the unwanted ions with slits. A mass-resolving power of R_M = 10,000 seems feasible for low-energy, multi-charged or mono-charged beams with a transverse acceptance up to 50π mm·mrad.
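The quoted mass-resolving power follows from standard time-of-flight arithmetic: at fixed kinetic energy the flight time scales as the square root of the mass, so R_M = m/Δm = t/(2Δt). A small sketch with illustrative numbers (not the paper's beam parameters):

```python
def tof_mass_resolving_power(flight_time_us, time_spread_us):
    """For energy-isochronous time-of-flight separation, t scales as
    sqrt(m), so m / delta_m = t / (2 * delta_t). Inputs in microseconds
    are illustrative units only."""
    return flight_time_us / (2.0 * time_spread_us)

# e.g. a 2000 us flight time with a 0.1 us bunch width gives
# a mass-resolving power of about 10,000:
R_M = tof_mass_resolving_power(2000.0, 0.1)
```

The sketch makes the design trade-off visible: resolving power grows with the length of the isochronous section (longer flight time) and shrinks with the bunch time spread the buncher can achieve.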

  16. Self-regulation across time of first-generation online learners

    Directory of Open Access Journals (Sweden)

    Lucy Barnard-Brak

    2010-12-01

Full Text Available Self-regulatory skills have been associated with positive outcomes for learners. In the current study, we examined the self-regulatory skills of students who are first-generation online learners over the course of their first semester of online instruction. The purpose of this study is to determine whether the online self-regulatory skills of learners changed across time as associated with being immersed in their first online learning environment. The results of the current study indicate no significant differences in the online self-regulatory skills of learners across time. Results suggest that environmental factors such as being immersed in an online learning environment for the first time are not, in and of themselves, associated with the development of self-regulatory skills of online learners. We conclude that the design of online courses needs to consider ways of developing self-regulatory skills, as these skills are not automatically developed with students' online learning experiences.

  17. Water contaminated with Didymosphenia geminata generates changes in Salmo salar spermatozoa activation times.

    Science.gov (United States)

    Olivares, Pamela; Orellana, Paola; Guerra, Guillermo; Peredo-Parada, Matías; Chavez, Viviana; Ramirez, Alfredo; Parodi, Jorge

    2015-06-01

Didymosphenia geminata ("didymo") has become a powerful and devastating river plague in Chile. A system of D. geminata channels was developed with the purpose of evaluating the effects of water polluted with didymo on the activation of Atlantic salmon (Salmo salar) spermatozoa. Results indicate that semen, when activated with uncontaminated river water, had an average motility time of 60±21 s. When using Powermilt (a commercial activator), times of 240±21 s are achieved, while water from rivers contaminated with D. geminata achieves a motility time of 30±12 s. Interestingly, the kinetic parameters VSL, VCL and VAP showed no significant changes under any of the conditions. Furthermore, the presence of D. geminata reduces the activation time of the samples as the cells age, indicating increased effects in spermatozoa that are conserved for more than 5 days. D. geminata has antioxidant content, represented by polyphenols; 200 ppm of polyphenols were obtained in this study per 10 g of microalgae. Spermatozoa exposed to these extracts showed a reduction in motility time in a dose-dependent manner, with an IC50 of 15 ppm. The results suggest an effect on spermatozoa activation, possibly due to the release of polyphenols present in contaminated rivers, facilitating the alteration of sperm motility times without affecting the viability or kinetics of the cells. These findings have important implications for current policy regarding the control of the algae. Current control measures focus on the number of visible species, and not on the compounds that they release, which, as this study shows, also have a problematic effect on salmon production. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Backup key generation model for one-time password security protocol

    Science.gov (United States)

    Jeyanthi, N.; Kundu, Sourav

    2017-11-01

The use of one-time passwords (OTP) has ushered new life into the existing authentication protocols used by the software industry. It introduced a second layer of security to the traditional username-password authentication, thus coining the term "two-factor authentication". One of the drawbacks of this protocol is the unreliability of the hardware token at the time of authentication. This paper proposes a simple backup key model that can be associated with a real-world application's user database, which would allow a user to circumvent the second authentication stage in the event of unavailability of the hardware token.
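A minimal sketch of how such a backup-key fallback could look follows. The hashing scheme, the key count, and the single-use policy are illustrative assumptions on my part, not the paper's protocol: the server stores only hashes of the issued keys in the user database and burns each key after one successful use.

```python
import hashlib
import secrets

def generate_backup_keys(n=5):
    """Hypothetical backup-key scheme: issue n single-use codes at
    enrollment; the server stores only their SHA-256 hashes."""
    plain = [secrets.token_hex(8) for _ in range(n)]
    stored = {hashlib.sha256(k.encode()).hexdigest() for k in plain}
    return plain, stored

def redeem_backup_key(stored_hashes, candidate):
    """Accept a backup key in place of the OTP when the hardware token
    is unavailable; remove its hash so it cannot be replayed."""
    digest = hashlib.sha256(candidate.encode()).hexdigest()
    if digest in stored_hashes:
        stored_hashes.remove(digest)  # single use
        return True
    return False

plain, stored = generate_backup_keys()
ok = redeem_backup_key(stored, plain[0])      # first use succeeds
replay = redeem_backup_key(stored, plain[0])  # replay is rejected
```

Storing hashes rather than the keys themselves means a database leak does not reveal usable codes, the same design choice behind hashed passwords.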

  19. Interpretative possibilities and limitations of Saxe/Goldstein hypothesis

    Directory of Open Access Journals (Sweden)

    André Strauss

    2012-08-01

Full Text Available The Saxe/Goldstein Hypothesis was generated within the processual archaeology milieu and was therefore supposed to allow reconstructing the social dimensions of past populations by studying their mortuary practices. In its original form, it stated that the emergence of formal cemeteries would be the result of an increase in competition for vital resources. This would lead to the formation of corporate descent groups whose main objective was to monopolize access to vital resources. Later, a reformulated version of this hypothesis was developed, emphasizing the relationship between the presence of formal cemeteries and the mobility pattern of human groups. In this contribution we present a critical review of the formation of this hypothesis and discuss its limitations. Finally, two examples taken from the Brazilian archaeological record are used to show how the lack of a critical posture in relation to the Saxe/Goldstein Hypothesis may lead to fragile interpretations of the archaeological record.

  20. Time Resolved Shadowgraph Images of Silicon during Laser Ablation: Shockwaves and Particle Generation

    International Nuclear Information System (INIS)

    Liu, C Y; Mao, X L; Greif, R; Russo, R E

    2007-01-01

    Time resolved shadowgraph images were recorded of shockwaves and particle ejection from silicon during laser ablation. Particle ejection and expansion were correlated to an internal shockwave resonating between the shockwave front and the target surface. The number of particles ablated increased with laser energy and was related to the crater volume

  1. Time Resolved Shadowgraph Images of Silicon during Laser Ablation:Shockwaves and Particle Generation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, C.Y.; Mao, X.L.; Greif, R.; Russo, R.E.

    2006-05-06

    Time resolved shadowgraph images were recorded of shockwaves and particle ejection from silicon during laser ablation. Particle ejection and expansion were correlated to an internal shockwave resonating between the shockwave front and the target surface. The number of particles ablated increased with laser energy and was related to the crater volume.

  2. Finite-Difference Time-Domain Modeling of Infrasonic Waves Generated by Supersonic Auroral Arcs

    Science.gov (United States)

    Pasko, V. P.

    2010-12-01

Atmospheric infrasonic waves are acoustic waves with frequencies ranging from ~0.02 to ~10 Hz [e.g., Blanc, Ann. Geophys., 3, 673, 1985]. The importance of infrasound studies has been emphasized in the past ten years from the Comprehensive Nuclear-Test-Ban Treaty verification perspective [e.g., Le Pichon et al., JGR, 114, D08112, 2009]. A proper understanding of infrasound propagation in the atmosphere is required for identification and classification of different infrasonic waves and their sources [Drob et al., JGR, 108, D21, 4680, 2003]. In the present work we employ a FDTD model of infrasound propagation in a realistic atmosphere to provide quantitative interpretation of infrasonic waves produced by auroral arcs moving with supersonic speed. We have recently applied similar modeling approaches for studies of infrasonic waves generated from thunderstorms [e.g., Few, Handbook of Atmospheric Electrodynamics, H. Volland (ed.), Vol. 2, pp.1-31, CRC Press, 1995], quantitative interpretation of infrasonic signatures from pulsating auroras [Wilson et al., GRL, 32, L14810, 2005], and studies of infrasonic waves generated by transient luminous events in the middle atmosphere termed sprites [e.g., Farges, Lightning: Principles, Instruments and Applications, H.D. Betz et al. (eds.), Ch.18, Springer, 2009]. The related results have been reported in [Pasko, JGR, 114, D08205, 2009], [de Larquier et al., GRL, 37, L06804, 2010], and [de Larquier, MS Thesis, Penn State, Aug. 2010], respectively. In the FDTD model, the altitude and frequency dependent attenuation coefficients provided by Sutherland and Bass [J. Acoust. Soc. Am., 115, 1012, 2004] are included in classical equations of acoustics in a gravitationally stratified atmosphere using a decomposition technique recently proposed by de Groot-Hedlin [J. Acoust. Soc. Am., 124, 1430, 2008]. The auroral infrasonic waves (AIW) in the frequency range 0.1-0.01 Hz associated with the supersonic motion of auroral arcs have been

  3. Multiple time-scale optimization scheduling for islanded microgrids including PV, wind turbine, diesel generator and batteries

    DEFF Research Database (Denmark)

    Xiao, Zhao xia; Nan, Jiakai; Guerrero, Josep M.

    2017-01-01

A multiple time-scale optimization scheduling scheme, including day-ahead and short-term stages, for an islanded microgrid is presented. In this paper, the microgrid under study includes photovoltaics (PV), a wind turbine (WT), a diesel generator (DG), batteries, and shiftable loads. The study considers the maximum-efficiency operation area of the diesel engine and the cost of battery charge/discharge cycle losses. The day-ahead generation scheduling takes the minimum operational cost and the maximum load satisfaction as the objective function. Short-term optimal dispatch is based on minimizing the adjustment of the day-ahead scheduling and giving priority to the use of renewable energy. According to the forecast of the critical and noncritical load, the wind speed, and the solar irradiation, a mixed integer linear programming (MILP) optimization method is used to solve the multi-objective optimization problem.
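The structure of such a scheduling problem can be illustrated with a toy dispatch sketch. This is a brute-force stand-in for the paper's MILP, with renewables served first, then battery, then diesel; all capacities, costs, and the three-hour horizon are illustrative assumptions.

```python
from itertools import product

def dispatch(load, renewable, diesel_cap=50.0, diesel_cost=0.3,
             battery_cap=30.0, cycle_cost=0.05):
    """Toy day-ahead scheduling sketch (stand-in for a MILP solver):
    enumerate diesel on/off per hour, serve residual load from the
    battery first, then diesel, and return the cheapest feasible
    schedule as (cost, on/off tuple). All numbers are illustrative."""
    T = len(load)
    best = None
    for on in product([0, 1], repeat=T):
        soc, cost, feasible = battery_cap / 2, 0.0, True
        for t in range(T):
            net = load[t] - renewable[t]            # residual demand
            if net > 0:
                from_batt = min(net, soc)
                soc -= from_batt
                cost += cycle_cost * from_batt      # cycle-loss cost
                net -= from_batt
                if net > 0:
                    if not on[t] or net > diesel_cap:
                        feasible = False            # load unserved
                        break
                    cost += diesel_cost * net
            else:
                soc = min(battery_cap, soc - net)   # charge with surplus
        if feasible and (best is None or cost < best[0]):
            best = (cost, on)
    return best

cost, schedule = dispatch(load=[40, 60, 30], renewable=[20, 10, 35])
```

A real MILP formulation replaces the exponential enumeration with binary commitment variables and linear constraints, which is what makes the multi-objective, multi-time-scale version tractable.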

  4. Empiric model for mean generation time adjustment factor for classic point kinetics equations

    International Nuclear Information System (INIS)

    Goes, David A.B.V. de; Martinez, Aquilino S.; Goncalves, Alessandro da C.

    2017-01-01

    Point reactor kinetics equations are the easiest way to observe the neutron production time behavior in a nuclear reactor. These equations are derived from the neutron transport equation using an approximation called Fick's law leading to a set of first order differential equations. The main objective of this study is to review classic point kinetics equation in order to approximate its results to the case when it is considered the time variation of the neutron currents. The computational modeling used for the calculations is based on the finite difference method. The results obtained with this model are compared with the reference model and then it is determined an empirical adjustment factor that modifies the point reactor kinetics equation to the real scenario. (author)
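The classic point kinetics equations referred to above can be integrated with a simple explicit finite-difference scheme, the same family of method the authors use. This sketch uses one delayed-neutron group, dn/dt = ((ρ − β)/Λ)n + λC and dC/dt = (β/Λ)n − λC, with typical textbook parameter values that are assumptions, not the paper's data.

```python
def point_kinetics(rho, beta=0.0065, lam=0.08, Lambda=1e-4,
                   dt=1e-5, t_end=1.0):
    """Explicit finite-difference integration of the classic point
    kinetics equations with one delayed-neutron group. Returns the
    relative neutron population n(t_end) starting from equilibrium.
    Parameter values are illustrative textbook numbers."""
    n = 1.0
    C = beta / (lam * Lambda)   # precursor level at equilibrium
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda * n + lam * C) * dt
        dC = (beta / Lambda * n - lam * C) * dt
        n, C = n + dn, C + dC
    return n

n_up = point_kinetics(rho=0.001)   # positive reactivity: power rises
n_flat = point_kinetics(rho=0.0)   # critical: power stays at 1
```

For a sub-prompt-critical step (ρ < β) the solution shows the expected prompt jump to roughly β/(β − ρ) followed by a slow rise on the stable period, the behavior any adjustment factor to the classic equations must preserve.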

  5. Empiric model for mean generation time adjustment factor for classic point kinetics equations

    Energy Technology Data Exchange (ETDEWEB)

    Goes, David A.B.V. de; Martinez, Aquilino S.; Goncalves, Alessandro da C., E-mail: david.goes@poli.ufrj.br, E-mail: aquilino@lmp.ufrj.br, E-mail: alessandro@con.ufrj.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Departamento de Engenharia Nuclear

    2017-11-01

    Point reactor kinetics equations are the easiest way to observe the neutron production time behavior in a nuclear reactor. These equations are derived from the neutron transport equation using an approximation called Fick's law leading to a set of first order differential equations. The main objective of this study is to review classic point kinetics equation in order to approximate its results to the case when it is considered the time variation of the neutron currents. The computational modeling used for the calculations is based on the finite difference method. The results obtained with this model are compared with the reference model and then it is determined an empirical adjustment factor that modifies the point reactor kinetics equation to the real scenario. (author)

  6. Time-resolved diagnostics of excimer laser-generated ablation plasmas used for pulsed laser deposition

    International Nuclear Information System (INIS)

    Geohegan, D.B.

    1994-01-01

    Characteristics of laser plasmas used for pulsed laser deposition (PLD) of thin films are examined with four in situ diagnostic techniques: Optical emission spectroscopy, optical absorption spectroscopy, ion probe studies, and gated ICCD (intensified charge-coupled-device array) fast photography. These four techniques are complementary and permit simultaneous views of the transport of ions, excited states, ground state neutrals and ions, and hot particulates following KrF laser ablation of YBCO, BN, graphite and Si in vacuum and background gases. The implementation and advantages of the four techniques are first described in order to introduce the key features of laser plasmas for pulsed laser deposition. Aspects of the interaction of the ablation plume with background gases (i.e., thermalization, attenuation, shock formation) and the collision of the plasma plume with the substrate heater are then summarized. The techniques of fast ICCD photography and gated photon counting are then applied to investigate the temperature, velocity, and spatial distribution of hot particles generated during KrF ablation of YBCO, BN, Si and graphite. Finally, key features of fast imaging of the laser ablation of graphite into high pressure rare gases are presented in order to elucidate internal reflected shocks within the plume, redeposition of material on a surface, and formation of hot nanoparticles within the plume

  7. Time-resolved diagnostics of excimer laser-generated ablation plasmas used for pulsed laser deposition

    Energy Technology Data Exchange (ETDEWEB)

    Geohegan, D.B.

    1994-09-01

    Characteristics of laser plasmas used for pulsed laser deposition (PLD) of thin films are examined with four in situ diagnostic techniques: Optical emission spectroscopy, optical absorption spectroscopy, ion probe studies, and gated ICCD (intensified charge-coupled-device array) fast photography. These four techniques are complementary and permit simultaneous views of the transport of ions, excited states, ground state neutrals and ions, and hot particulates following KrF laser ablation of YBCO, BN, graphite and Si in vacuum and background gases. The implementation and advantages of the four techniques are first described in order to introduce the key features of laser plasmas for pulsed laser deposition. Aspects of the interaction of the ablation plume with background gases (i.e., thermalization, attenuation, shock formation) and the collision of the plasma plume with the substrate heater are then summarized. The techniques of fast ICCD photography and gated photon counting are then applied to investigate the temperature, velocity, and spatial distribution of hot particles generated during KrF ablation of YBCO, BN, Si and graphite. Finally, key features of fast imaging of the laser ablation of graphite into high pressure rare gases are presented in order to elucidate internal reflected shocks within the plume, redeposition of material on a surface, and formation of hot nanoparticles within the plume.

  8. Energy intensities, EROIs (energy returned on invested), and energy payback times of electricity generating power plants

    International Nuclear Information System (INIS)

    Weißbach, D.; Ruprecht, G.; Huke, A.; Czerski, K.; Gottlieb, S.; Hussein, A.

    2013-01-01

    The energy returned on invested, EROI, has been evaluated for typical power plants representing wind energy, photovoltaics, solar thermal, hydro, natural gas, biogas, coal and nuclear power. The strict exergy concept with no “primary energy weighting”, updated material databases, and updated technical procedures make it possible to directly compare the overall efficiency of those power plants on a uniform mathematical and physical basis. Pump storage systems, needed for solar and wind energy, have been included in the EROI so that the efficiency can be compared with an “unbuffered” scenario. The results show that nuclear, hydro, coal, and natural gas power systems (in this order) are one order of magnitude more effective than photovoltaics and wind power. - Highlights: ► Nuclear, “renewable” and fossil energy are comparable on a uniform physical basis. ► Energy storage is considered for the calculation, reducing the EROI remarkably. ► All power systems generate more energy than they consume. ► Photovoltaics, biomass and wind (buffered) are below the economic threshold.
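    The effect of buffering on EROI can be sketched with back-of-envelope arithmetic (the numbers below are invented for illustration and are not the paper's values): adding storage increases the invested energy and loses some output to round-trip inefficiency, so the buffered EROI is necessarily lower than the unbuffered one.

```python
def eroi(energy_returned, energy_invested):
    """EROI = lifetime energy delivered / lifetime energy invested."""
    return energy_returned / energy_invested

# Hypothetical plant: 40 units of lifetime output for 10 units invested.
unbuffered = eroi(40.0, 10.0)
# Adding pump storage costs extra invested energy (say 6 units) and
# loses part of the output to storage round-trip losses (say 10%).
buffered = eroi(40.0 * 0.9, 10.0 + 6.0)
```

With these toy numbers the EROI drops from 4.0 to 2.25, which mirrors the paper's qualitative finding that buffering reduces the EROI remarkably.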

  9. Time profile of harmonics generated by a single atom in a strong electromagnetic field

    International Nuclear Information System (INIS)

    Antoine, P.; Piraux, B.; Maquet, A.

    1995-01-01

    We show that the time profile of the harmonics emitted by a single atom exposed to a strong electromagnetic field may be obtained through a wavelet or a Gabor analysis of the acceleration of the atomic dipole. This analysis is extremely sensitive to the details of the dynamics and sheds some light on the competition between the atomic excitation or ionization processes and photon emission. For illustration we study the interaction of atomic hydrogen with an intense laser pulse.
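    The Gabor analysis mentioned here can be sketched generically (this is not the authors' code): the signal is multiplied by a sliding Gaussian window and projected onto a complex exponential at the harmonic frequency, giving the time profile of the emission at that frequency. The toy signal below, a 50 Hz tone that switches on at mid-record, merely stands in for the dipole acceleration.

```python
import numpy as np

def gabor_profile(signal, t, freq, sigma):
    """|Gabor transform| of `signal` at frequency `freq` for each window center t0:
    G(t0) = | sum_t signal(t) * exp(-(t - t0)^2 / (2 sigma^2)) * exp(-2j pi freq t) |
    """
    profile = np.empty_like(t)
    for i, t0 in enumerate(t):
        window = np.exp(-((t - t0) ** 2) / (2.0 * sigma ** 2))
        profile[i] = np.abs(np.sum(signal * window * np.exp(-2j * np.pi * freq * t)))
    return profile

fs = 1000.0                                  # samples per unit time
t = np.arange(0.0, 1.0, 1.0 / fs)
# Stand-in for the dipole acceleration: a 50 Hz "harmonic" that
# switches on halfway through the record.
s = np.where(t >= 0.5, np.sin(2 * np.pi * 50.0 * t), 0.0)
profile = gabor_profile(s, t, freq=50.0, sigma=0.05)
```

The profile is large only in the second half of the record, i.e. the Gabor amplitude localizes the emission in time, which is exactly the sensitivity to dynamics the abstract exploits.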

  10. Real time hardware implementation of power converters for grid integration of distributed generation and STATCOM systems

    Science.gov (United States)

    Jaithwa, Ishan

    Deployment of smart grid technologies is accelerating. Smart grids enable bidirectional flows of energy and energy-related communications. The future electricity grid will look very different from today's power system. Large variable renewable energy sources will provide a greater portion of electricity, small DERs and energy storage systems will become more common, and utilities will operate many different kinds of energy-efficiency programs. All of these changes will add complexity to the grid and require operators to respond to fast dynamic changes to maintain system stability and security. This thesis investigates advanced control technology for grid integration of renewable energy sources and STATCOM systems by verifying it in real-time hardware experiments on two different platforms: dSPACE and OPAL-RT. Three control schemes (conventional standard vector control, direct vector control, and intelligent neural network control) were first simulated in Matlab to check the stability and safety of the system, and were then implemented in real-time hardware on the dSPACE and OPAL-RT systems. The thesis then shows how the dynamic-programming (DP) methods employed to train the neural networks outperform the other controllers: an optimal control strategy is developed to ensure effective power delivery and to improve system stability. Real-time hardware implementation demonstrates that the neural vector control approach produces the fastest response time, low overshoot, and the best overall performance compared with the conventional standard vector control method and the DCC vector control technique. Finally, the entrepreneurial approach taken to drive the technologies from the lab to market via ORANGE ELECTRIC is discussed in brief.

  11. Use of Participant-Generated Photographs versus Time Use Diaries as a Method of Qualitative Data Collection

    Directory of Open Access Journals (Sweden)

    MaryEllen Thompson PhD, OTR/L

    2013-02-01

    Full Text Available A small qualitative research study was chosen as a time efficient way to allow students to participate in and complete a research project during a 16-week long semester course. In the first year of the research contribution course, student researchers asked participants with diabetes to complete time use diaries as a part of their initial data collection. The time use diaries were found to be an ineffective way to collect data on self-management of diabetes and were not useful as a basis for subsequent interviews with the participants. A review of the literature suggested reasons for this lack of effectiveness; in particular, participants tend not to record frequently done daily activities. Further review of the literature pointed toward the use of participant-generated photography as an alternative. Subsequent participants were asked to take photographs of their daily self-management of their diabetes for initial data collection. These photographs provided a strong basis for subsequent interviews with the participants. A comparison of the data collected and the emergent themes from the two different methods of initial data collection demonstrated the improved ability to answer the original research question when using participant-generated photography as a basis for participant interviews. The student researchers found the use of participant-generated photographs to elicit interviews with participants in the context of a research contribution course to be effective and enjoyable.

  12. Metabolic syndrome--neurotrophic hypothesis.

    Science.gov (United States)

    Hristova, M; Aloe, L

    2006-01-01

    An increasing number of researchers of the metabolic syndrome assume that many mechanisms are involved in its complex pathophysiology, such as increased sympathetic activity, disorders of the hypothalamo-pituitary-adrenal axis, the action of chronic subclinical infections, proinflammatory cytokines, and the effect of adipocytokines or psychoemotional stress. A growing body of scientific research in this field confirms the role of the neurotrophins and mastocytes in the pathogenesis of inflammatory and immune diseases. It has recently been shown that neurotrophins and mastocytes have metabotrophic effects and take part in carbohydrate and lipid metabolism. In the early stage of the metabolic syndrome we established a statistically significant increase in the plasma levels of nerve growth factor. In the generalized stage the plasma levels of the neurotrophins were statistically decreased in comparison to those in the healthy controls. We consider that the neurotrophin deficit is likely to play a significant pathogenic role in the development of the metabolic, anthropometric and vascular manifestations of the generalized stage of MetSyn. We suggest a hypothesis for the etiopathogenesis of the metabolic syndrome based on neuro-immuno-endocrine interactions. The specific pathogenic pathways of MetSyn development include: (1) increased tissue and plasma levels of the proinflammatory cytokines interleukin-1 (IL-1), interleukin-6 (IL-6) and tumor necrosis factor-alpha (TNF-alpha) caused by inflammatory and/or emotional distress; (2) increased plasma levels of the neurotrophin nerve growth factor (NGF) caused by the high IL-1, IL-6 and TNF-alpha levels; (3) high plasma levels of NGF which enhance activation of: the autonomous nerve system--vegetodystonia (imbalance of neurotransmitters); Neuropeptide Y (NPY)--enhanced feeding, obesity and increased leptin plasma levels; the hypothalamo-pituitary-adrenal axis--increased corticotropin-releasing hormone (CRH) and

  13. Operation time extension for power units of the first generation NPP and the liability for potential damage

    International Nuclear Information System (INIS)

    Kovalevich, O.M.

    2000-01-01

    The problem of operating-time extension for the six operating first-generation NPP power units is discussed. Since it is not feasible to upgrade the safety of these power units to the currently acceptable level, a contradiction arises between extending their operating time and the potential damage to the population. As a compensating measure, it is proposed to consider increased civil liability for potential harm and losses in case of an accident. Measures for implementing this civil liability are described [ru]

  14. Comprehensive Cost Minimization in Distribution Networks Using Segmented-time Feeder Reconfiguration and Reactive Power Control of Distributed Generators

    DEFF Research Database (Denmark)

    Chen, Shuheng; Hu, Weihao; Chen, Zhe

    2016-01-01

    In this paper, an efficient methodology is proposed to deal with the segmented-time reconfiguration problem of distribution networks coupled with segmented-time reactive power control of distributed generators. The target is to find the optimal dispatching schedule of all controllable switches...... (FAHPSO) is implemented in the VC++ 6.0 programming language. A modified version of the typical 70-node distribution network and several real distribution networks are used to test the performance of the proposed method. Numerical results show that the proposed methodology is an efficient method for comprehensive...

  15. Design Report for the Synchronized Position, Velocity, and Time Code Generator

    Science.gov (United States)

    2015-08-01

    [Fragment of the report's data packet format table: fields include GPS week (I2) and Fix Type (U1), with fix-type codes 0 = NoFix, 1 = Dead reckoning, 2 = 2D fix, 3 = 3D fix, 4 = GPS + dead reckoning, 5 = Time-only fix.] The checksum uses the 8-bit Fletcher algorithm as specified by RFC 1145 and is calculated over the entire packet: SOP, MID, and payload (Table 1, Data packet format). [...] The precise synchronization is required for both fusion of information and comparison to truth data during algorithm development.
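    The 8-bit Fletcher checksum named in the fragment can be sketched in a few lines (a generic implementation of the algorithm, not code from the report; the byte values in the example are arbitrary):

```python
def fletcher8(data: bytes) -> tuple:
    """8-bit Fletcher checksum (RFC 1145 style): two running sums modulo 256."""
    ck_a = ck_b = 0
    for byte in data:
        ck_a = (ck_a + byte) & 0xFF   # simple sum of the bytes
        ck_b = (ck_b + ck_a) & 0xFF   # sum of the running sums
    return ck_a, ck_b

# Example over an arbitrary two-byte payload:
checksum = fletcher8(bytes([0x01, 0x02]))
```

The second sum makes the checksum position-dependent, so transposed bytes are detected, which a plain modular sum would miss.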

  16. Combined time-varying forecast based on the proper scoring approach for wind power generation

    DEFF Research Database (Denmark)

    Chen, Xingying; Jiang, Yu; Yu, Kun

    2017-01-01

    Compared with traditional point forecasts, combined forecasts have been proposed as an effective method to provide more accurate forecasts than individual models. However, the literature and research focused on wind-power combined forecasts are relatively limited. Here, based on forecasting error...... distribution, a proper scoring approach is applied to combine plausible models to form an overall time-varying model for next-day forecasts, rather than a weights-based combination. To validate the effectiveness of the proposed method, three years of real data were used for testing. Simulation results...... demonstrate that the proposed method improves the accuracy of overall forecasts, even compared with a numerical weather prediction....

  17. A Kalman Filter-Based Method to Generate Continuous Time Series of Medium-Resolution NDVI Images

    Directory of Open Access Journals (Sweden)

    Fernando Sedano

    2014-12-01

    Full Text Available A data assimilation method to produce complete temporal sequences of synthetic medium-resolution images is presented. The method implements a Kalman filter recursive algorithm that integrates medium and moderate resolution imagery. To demonstrate the approach, time series of 30-m spatial resolution NDVI images at 16-day time steps were generated using Landsat NDVI images and MODIS NDVI products at four sites with different ecosystems and land cover-land use dynamics. The results show that the time series of synthetic NDVI images captured seasonal land surface dynamics and maintained the spatial structure of the landscape at higher spatial resolution. The time series of synthetic medium-resolution NDVI images were validated within a Monte Carlo simulation framework. Normalized residuals decreased as the number of available observations increased, ranging from 0.2 to below 0.1. Residuals were also significantly lower for time series of synthetic NDVI images generated at combined recursion (smoothing) than individually at forward and backward recursions (filtering). Conversely, the uncertainties of the synthetic images also decreased when the number of available observations increased and combined recursions were implemented.
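    The recursive filtering idea can be illustrated with a minimal scalar Kalman filter (a toy sketch, not the paper's implementation): a random-walk state model is propagated between dates, and each available observation (missing dates are `None`) pulls the estimate toward the data while shrinking its uncertainty. All noise variances and the NDVI-like values are illustrative assumptions.

```python
def kalman_1d(observations, q=0.01, r=0.1, x0=0.0, p0=1.0):
    """Scalar Kalman filter with a random-walk state model.

    observations: sequence of measurements, with None for missing dates.
    q: process noise variance; r: measurement noise variance.
    Returns the filtered estimates and their variances.
    """
    x, p = x0, p0
    xs, ps = [], []
    for z in observations:
        p = p + q                   # predict: random walk inflates uncertainty
        if z is not None:           # update only when an observation exists
            k = p / (p + r)         # Kalman gain
            x = x + k * (z - x)
            p = (1.0 - k) * p
        xs.append(x)
        ps.append(p)
    return xs, ps

# Sparse "NDVI" series: missing dates interleaved with observations near 0.6.
obs = [0.6, None, 0.58, None, None, 0.62, 0.61, None, 0.6, 0.59]
estimates, variances = kalman_1d(obs)
```

Running the same filter backward in time and merging the two passes gives the smoothing recursion whose residuals the paper finds lower than filtering alone.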

  18. Photonics-based real-time ultra-high-range-resolution radar with broadband signal generation and processing.

    Science.gov (United States)

    Zhang, Fangzheng; Guo, Qingshui; Pan, Shilong

    2017-10-23

    Real-time and high-resolution target detection is highly desirable in modern radar applications. Electronic techniques have encountered grave difficulties in the development of such radars, which strictly rely on a large instantaneous bandwidth. In this article, a photonics-based real-time high-range-resolution radar is proposed with optical generation and processing of broadband linear frequency modulation (LFM) signals. A broadband LFM signal is generated in the transmitter by photonic frequency quadrupling, and the received echo is de-chirped to a low-frequency signal by photonic frequency mixing. The system can operate at a high frequency and a large bandwidth while enabling real-time processing by low-speed analog-to-digital conversion and digital signal processing. A conceptual radar is established. Real-time processing of an 8-GHz LFM signal is achieved with a sampling rate of 500 MSa/s. Accurate distance measurement is implemented with a maximum error of 4 mm within a range of ~3.5 meters. Detection of two targets is demonstrated with a range resolution as fine as 1.875 cm. We believe the proposed radar architecture is a reliable solution to overcome the limitations of current radars on operation bandwidth and processing speed, and we hope it will be used in future radars for real-time and high-resolution target detection and imaging.
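    De-chirping can be sketched numerically (an illustrative simulation with scaled-down parameters, not the system in the paper): mixing the received LFM echo with the transmitted chirp yields a low-frequency beat tone whose frequency f_b = k·τ is proportional to the target range, so a simple FFT of the slow beat signal recovers the range.

```python
import numpy as np

c = 3.0e8            # speed of light, m/s
B = 1.0e9            # chirp bandwidth, Hz (scaled down from the paper's 8 GHz)
T = 10.0e-6          # chirp duration, s
k = B / T            # chirp rate, Hz/s
fs = 50.0e6          # sampling rate of the de-chirped signal, Hz

t = np.arange(0.0, T, 1.0 / fs)
R_true = 3.0                      # target range, m
tau = 2.0 * R_true / c            # round-trip delay, s

tx = np.exp(1j * np.pi * k * t ** 2)            # transmitted LFM chirp
rx = np.exp(1j * np.pi * k * (t - tau) ** 2)    # delayed echo
beat = tx * np.conj(rx)                         # de-chirp: tone at f_b = k * tau

spectrum = np.abs(np.fft.fft(beat))
freqs = np.fft.fftfreq(len(t), 1.0 / fs)
f_beat = abs(freqs[np.argmax(spectrum)])
R_est = c * f_beat / (2.0 * k)                  # range from the beat frequency
```

With these toy parameters the range resolution c/(2B) is 15 cm; the paper's 8-GHz bandwidth gives the quoted 1.875 cm, while the beat signal stays slow enough for a 500 MSa/s converter.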

  19. The oxidative hypothesis of senescence

    Directory of Open Access Journals (Sweden)

    Gilca M

    2007-01-01

    Full Text Available The oxidative hypothesis of senescence, since its origin in 1956, has garnered significant evidence and growing support among scientists for the notion that free radicals play an important role in ageing, either as "damaging" molecules or as signaling molecules. Age-increasing oxidative injuries induced by free radicals, higher susceptibility to oxidative stress in short-lived organisms, genetic manipulations that alter both oxidative resistance and longevity, and the anti-ageing effect of caloric restriction and intermittent fasting are a few examples of accepted scientific facts that support the oxidative theory of senescence. Though not completely understood due to the complex "network" of redox regulatory systems, the implication of oxidative stress in the ageing process is now well documented. Moreover, it is compatible with other current ageing theories (e.g., those implicating the mitochondrial damage/mitochondrial-lysosomal axis, stress-induced premature senescence, biological "garbage" accumulation, etc.). This review is intended to summarize and critically discuss the redox mechanisms involved in the ageing process: sources of oxidant agents in ageing (mitochondrial: electron transport chain and nitric oxide synthase reaction; non-mitochondrial: Fenton reaction, microsomal cytochrome P450 enzymes, peroxisomal β-oxidation and the respiratory burst of phagocytic cells), antioxidant changes in ageing (enzymatic: superoxide dismutase, glutathione reductase, glutathione peroxidase, catalase; non-enzymatic: glutathione, ascorbate, urate, bilirubin, melatonin, tocopherols, carotenoids, ubiquinol), alteration of oxidative-damage repair mechanisms, and the role of free radicals as signaling molecules in ageing.

  20. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers and modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation, and the theory of robust distributed detection is extended to classes of distributions which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...

  1. The venom optimization hypothesis revisited.

    Science.gov (United States)

    Morgenstern, David; King, Glenn F

    2013-03-01

    Animal venoms are complex chemical mixtures that typically contain hundreds of proteins and non-proteinaceous compounds, resulting in a potent weapon for prey immobilization and predator deterrence. However, because venoms are protein-rich, they come with a high metabolic price tag. The metabolic cost of venom is sufficiently high to result in secondary loss of venom whenever its use becomes non-essential to survival of the animal. The high metabolic cost of venom leads to the prediction that venomous animals may have evolved strategies for minimizing venom expenditure. Indeed, various behaviors have been identified that appear consistent with frugality of venom use. This has led to formulation of the "venom optimization hypothesis" (Wigger et al. (2002) Toxicon 40, 749-752), also known as "venom metering", which postulates that venom is metabolically expensive and therefore used frugally through behavioral control. Here, we review the available data concerning economy of venom use by animals with either ancient or more recently evolved venom systems. We conclude that the convergent nature of the evidence in multiple taxa strongly suggests the existence of evolutionary pressures favoring frugal use of venom. However, there remains an unresolved dichotomy between this economy of venom use and the lavish biochemical complexity of venom, which includes a high degree of functional redundancy. We discuss the evidence for biochemical optimization of venom as a means of resolving this conundrum. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Alien abduction: a medical hypothesis.

    Science.gov (United States)

    Forrest, David V

    2008-01-01

    A new psychological study of persons who believe they have been abducted by space aliens found that sleep paralysis, a history of being hypnotized, and preoccupation with the paranormal and extraterrestrial were predisposing experiences. In response, I noted that many of the frequently reported particulars of the abduction experience bear more than a passing resemblance to medical-surgical procedures, and I propose that experience with these may also be contributory. There is the altered state of consciousness, uniformly colored figures with prominent eyes, in a high-tech room under a round bright saucerlike object; there is nakedness, pain and a loss of control while the body's boundaries are being probed; and yet the figures are thought benevolent. No medical-surgical history was apparently taken in the above-mentioned study, but psychological laboratory work evaluated false memory formation. I discuss problems in assessing intraoperative awareness and ways in which the medical hypothesis could be elaborated and tested. If physicians are causing this syndrome in a percentage of patients, we should know about it; and persons who feel they have been abducted should be encouraged to inform their surgeons and anesthesiologists without challenging their beliefs.

  3. Three-Dimensional Time Domain Simulation of Tsunami-Generated Electromagnetic Fields: Application to the 2011 Tohoku Earthquake Tsunami

    Science.gov (United States)

    Minami, Takuto; Toh, Hiroaki; Ichihara, Hiroshi; Kawashima, Issei

    2017-12-01

    We present a new finite element simulation approach in time domain for electromagnetic (EM) fields associated with motional induction by tsunamis. Our simulation method allows us to conduct three-dimensional simulation with realistic smooth bathymetry and to readily obtain broad structures of tsunami-generated EM fields and their time evolution, benefitting from time domain implementation with efficient unstructured mesh. Highly resolved mesh near observation sites enables us to compare simulation results with observed data and to investigate tsunami properties in terms of EM variations. Furthermore, it makes source separations available for EM data during tsunami events. We applied our simulation approach to the 2011 Tohoku tsunami event with seawater velocity from linear-long and linear-Boussinesq approximations. We revealed that inclusion of dispersion effect is necessary to explain magnetic variations at a northwest Pacific seafloor site, 1,500 km away from the epicenter, while linear-long approximation is enough at a seafloor site 200 km east-northeast of the epicenter. Our simulations provided, for the first time, comprehensive views of spatiotemporal structures of tsunami-generated EM fields for the 2011 Tohoku tsunami, including large-scale electric current circuits in the ocean. Finally, subtraction of the simulated magnetic fields from the observed data revealed symmetric magnetic variations on the western and eastern sides of the epicenter for 30 min since the earthquake origin time. These imply a pair of southward and northward electric currents in the ionosphere that exist on the western and eastern sides of the source region, respectively, which was likely to be caused by tsunami-generated atmospheric acoustic/gravity waves reaching the ionosphere.

  4. Time-resolved measurements with streaked diffraction patterns from electrons generated in laser plasma wakefield

    Science.gov (United States)

    He, Zhaohan; Nees, John; Hou, Bixue; Krushelnick, Karl; Thomas, Alec; Beaurepaire, Benoît; Malka, Victor; Faure, Jérôme

    2013-10-01

    Femtosecond bunches of electrons with relativistic to ultra-relativistic energies can be robustly produced in laser plasma wakefield accelerators (LWFA). Scaling the electron energy down to the sub-relativistic, MeV level using a millijoule laser system will make such an electron source a promising candidate for ultrafast electron diffraction (UED) applications, owing to the intrinsically short bunch duration and perfect synchronization with the optical pump. Recent results of electron diffraction from a single-crystal gold foil, using LWFA electrons driven by 8-mJ, 35-fs laser pulses at 500 Hz, will be presented. The accelerated electrons were collimated with a solenoid magnetic lens. By applying a small-angle tilt to the magnetic lens, the diffraction pattern can be streaked such that the temporal evolution is separated spatially on the detector screen after propagation. The observable time window and achievable temporal resolution are studied in pump-probe measurements of photo-induced heating of the gold foil.

  5. Relaxation time diagram for identifying heat generation mechanisms in magnetic fluid hyperthermia

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Enio, E-mail: lima@cab.cnea.gov.ar; De Biasi, Emilio; Zysler, Roberto D.; Vasquez Mansilla, Marcelo; Mojica-Pisciotti, Mary L. [Centro Atómico Bariloche/CONICET (Argentina); Torres, Teobaldo E.; Calatayud, M. Pilar; Marquina, C.; Ricardo Ibarra, M.; Goya, Gerardo F. [Universidad de Zaragoza, Instituto de Nanociencia de Aragón INA (Spain)

    2014-12-15

    We present a versatile diagram to envisage the dominant relaxation mechanism of single-domain magnetic nanoparticles (MNPs) under alternating magnetic fields, such as those used in magnetic fluid hyperthermia (MFH). The diagram allows estimating the heating efficiency, measured by the Specific Power Absorption (SPA), originating in the magnetic and viscous relaxation times of single-domain MNPs for a given frequency of the ac magnetic field (AMF). The diagram has been successfully applied to different colloids, covering a wide variety of MNPs with different magnetic anisotropy and particle size, dispersed in different viscous liquid carriers. From the general diagram, we derived a specific chart based on the Linear Response Theory in order to easily estimate the experimental conditions for the optimal SPA values of most colloids currently used in MFH.
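    The competing relaxation channels behind such a diagram can be sketched with the textbook formulas (illustrative magnetite-like parameters, not the authors' values): the Néel time grows exponentially with the anisotropy energy KV, the Brown time grows linearly with hydrodynamic volume and viscosity, and the effective time is their parallel combination; in the linear-response regime heating peaks when ω·τ is near 1.

```python
import math

kB = 1.380649e-23          # Boltzmann constant, J/K

def neel_time(K, V, T, tau0=1e-9):
    """Neel relaxation: tau_N = tau0 * exp(K*V / (kB*T))."""
    return tau0 * math.exp(K * V / (kB * T))

def brown_time(eta, V_h, T):
    """Brown relaxation: tau_B = 3 * eta * V_h / (kB * T)."""
    return 3.0 * eta * V_h / (kB * T)

def effective_time(tau_n, tau_b):
    """Both channels act in parallel: 1/tau = 1/tau_N + 1/tau_B."""
    return tau_n * tau_b / (tau_n + tau_b)

# Illustrative magnetite-like particle in water (assumed values):
T = 300.0                                # temperature, K
K = 1.1e4                                # anisotropy constant, J/m^3
d = 15e-9                                # magnetic core diameter, m
d_h = 20e-9                              # hydrodynamic diameter, m
eta = 1e-3                               # water viscosity, Pa*s
V = math.pi * d ** 3 / 6.0
V_h = math.pi * d_h ** 3 / 6.0

tau_n = neel_time(K, V, T)
tau_b = brown_time(eta, V_h, T)
tau = effective_time(tau_n, tau_b)
f_opt = 1.0 / (2.0 * math.pi * tau)      # AMF frequency where omega * tau = 1
```

Whichever time is shorter dominates the effective relaxation, which is precisely the distinction the diagram is built to visualize.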

  6. Generation of a 3-Years Time Serie of Daily Actual Evapotranspiration over the Tibetan Plateau

    Science.gov (United States)

    Faivre, R.; Colin, J.; Menenti, M.

    2016-08-01

    The estimation of turbulent fluxes is of primary interest for hydrological and climatological studies. The use of optical remote sensing data in the VNIR and TIR domains has already proved suitable for parameterizing the surface energy balance, leading to many algorithms. Their use over arid, high-elevation areas requires detailed characterisation of key surface physical properties and of the atmospheric state at a reference level. Satellite products acquired over the Tibetan Plateau and simulation results delivered in the frame of the CEOP-AEGIS project provide incentives for a regular analysis at medium scale. This work aims at improving the use of spaceborne optical remote sensing (VNIR and TIR) for land surface evapotranspiration (ET) mapping. This led to the development of a processing chain based on the SEBI algorithm for the production of a daily actual ET time series over the whole Tibetan Plateau during the period 2008-2010 (Faivre, 2014).

  7. Early hominins in Europe: The Galerian migration hypothesis

    Science.gov (United States)

    Muttoni, Giovanni; Scardia, Giancarlo; Kent, Dennis V.

    2018-01-01

    Our updated review of sites bearing hominin remains and/or tools from Europe, including new findings from the Balkans, still indicates that the only compelling evidence of a main hominin presence in these regions dates from ∼0.9 million years ago (Ma), bracketed by the end of the Jaramillo geomagnetic polarity subchron (0.99 Ma) and the Brunhes-Matuyama polarity chron boundary (0.78 Ma). This time window straddled the late Early Pleistocene climate transition (EPT) at the onset of enhanced glacial/interglacial activity that reverberated worldwide. Europe may have become initially populated during the EPT when, possibly for the first time in the Pleistocene, vast and exploitable ecosystems were generated along the eustatically emergent Po-Danube terrestrial conduit. These newly formed settings, characterized by stable terrestrial lowlands with open grasslands and reduced woody cover especially during glacial/interglacial transitions, are regarded as optimal ecosystems for several large Galerian immigrant mammals such as African and Asian megaherbivores, possibly linked with hominins in a common food web, to expand into en route to Europe. The question of when hominins first arrived in Europe thus places the issue in the context of changes in climate, paleogeography and faunal associations as potential environmental drivers and controlling agents in a specific time frame, a key feature of the Galerian migration hypothesis.

  8. Dynamic Agricultural Land Unit Profile Database Generation using Landsat Time Series Images

    Science.gov (United States)

    Torres-Rua, A. F.; McKee, M.

    2012-12-01

    Agriculture requires continuous supply of inputs to production, while providing final or intermediate outputs or products (food, forage, industrial uses, etc.). Government and other economic agents are interested in the continuity of this process and make decisions based on the available information about current conditions within the agriculture area. From a government point of view, it is important that the input-output chain in agriculture for a given area be enhanced in time, while any possible abrupt disruption be minimized or be constrained within the variation tolerance of the input-output chain. The stability of the exchange of inputs and outputs becomes of even more important in disaster-affected zones, where government programs will look for restoring the area to equal or enhanced social and economical conditions before the occurrence of the disaster. From an economical perspective, potential and existing input providers require up-to-date, precise information of the agriculture area to determine present and future inputs and stock amounts. From another side, agriculture output acquirers might want to apply their own criteria to sort out present and future providers (farmers or irrigators) based on the management done during the irrigation season. In the last 20 years geospatial information has become available for large areas in the globe, providing accurate, unbiased historical records of actual agriculture conditions at individual land units for small and large agricultural areas. This data, adequately processed and stored in any database format, can provide invaluable information for government and economic interests. Despite the availability of the geospatial imagery records, limited or no geospatial-based information about past and current farming conditions at the level of individual land units exists for many agricultural areas in the world. The absence of this information challenges the work of policy makers to evaluate previous or current

  9. Spin force and the generation of sustained spin current in time-dependent Rashba and Dresselhaus systems

    International Nuclear Information System (INIS)

    Ho, Cong Son; Tan, Seng Ghee; Jalil, Mansoor B. A.

    2014-01-01

    The generation of spin current and spin polarization in a two-dimensional electron gas structure is studied in the presence of Dresselhaus and Rashba spin-orbit couplings (SOC), the strength of the latter being modulated in time by an ac gate voltage. By means of the non-Abelian gauge field approach, we established the relation between the Lorentz spin force and the spin current in the SOC system, and showed that the longitudinal component of the spin force induces a transverse spin current. For a constant (time-invariant) Rashba system, we recover the universal spin Hall conductivity of e/(8π), derived previously via the Berry phase and semi-classical methods. In the case of a time-dependent SOC system, the spin current is sustained even under strong impurity scattering. We evaluated the ac spin current generated by a time-modulated Rashba SOC in the absence of any dc electric field. The magnitude of the spin current reaches a maximum when the modulation frequency matches the Larmor frequency of the electrons.

  10. Infection of Gymnodinium sanguineum by the dinoflagellate Amoebophrya sp.: effect of nutrient environment on parasite generation time, reproduction, and infectivity.

    Science.gov (United States)

    Yih, W; Coats, D W

    2000-01-01

    Preliminary attempts to culture Amoebophrya sp., a parasite of Gymnodinium sanguineum from Chesapeake Bay, indicated that success may be influenced by water quality. To explore that possibility, we determined development time, reproductive output, and infectivity of progeny (i.e. dinospores) for Amoebophrya sp. maintained on G. sanguineum grown in four different culture media. The duration of the parasite's intracellular growth phase showed no significant difference among treatments; however, the time required for completion of multiple parasite generations did, with elapsed time to the middle of the third generation being shorter in nutrient-replete media. Parasites of hosts grown in nutrient-replete medium also produced three to four times more dinospores than those infecting hosts under low-nutrient conditions, with mean values of 380 and 130 dinospores/host, respectively. Dinospore production relative to host biovolume also differed, with peak values of 7.4 per 1,000 microm3 host for nutrient-replete medium and 4.8 per 1,000 microm3 host for nutrient-limited medium. Furthermore, dinospores produced by "high-nutrient" parasites had a higher success rate than those formed by "low-nutrient" parasites. Results suggest that Amoebophrya sp. is well adapted to exploit G. sanguineum populations in nutrient-enriched environments.

  11. PREDICT: A next generation platform for near real-time prediction of cholera

    Science.gov (United States)

    Jutla, A.; Aziz, S.; Akanda, A. S.; Alam, M.; Ahsan, G. U.; Huq, A.; Colwell, R. R.

    2017-12-01

    Data on disease prevalence and infectious pathogens are only sparsely collected or available in regions where climatic variability and extreme natural events intersect with population vulnerability (such as lack of access to water and sanitation infrastructure). The traditional time-series approach of calibrating and validating a model is therefore inadequate, and prediction of diarrheal infections (such as cholera and shigellosis) remains a challenge even though the disease-causing pathogens are strongly associated with modalities of the regional climate and weather system. Here we present an algorithm that integrates satellite-derived data on several hydroclimatic and ecological processes into a framework that can determine high-resolution cholera risk on global scales. Cholera outbreaks can be classified into three forms: epidemic (sudden or seasonal outbreaks), endemic (recurrence and persistence of the disease for several consecutive years) and mixed-mode endemic (a combination of certain epidemic and endemic conditions), with significant spatial and temporal heterogeneity. Using data from multiple satellites (AVHRR, TRMM, GPM, MODIS, VIIRS, GRACE), we will show examples from Haiti, Yemen, Nepal and several other regions where our algorithm has been successful in capturing the risk of outbreak of infection in human populations. A spatial model validation algorithm will also be presented that has the capability to self-calibrate as new hydroclimatic and disease data become available.

  12. Hilbert phase dynamometry (HPD) for real-time measurement of cell generated forces (Conference Presentation)

    Science.gov (United States)

    Sridharan, Shamira; Li, Yanfen; Bhaduri, Basanta; Majeed, Hassaan; Dupenloup, Paul; Levine, Alex; Kilian, Kristopher A.; Popescu, Gabriel

    2016-03-01

    Traction force microscopy is the most widely used technique for studying the forces exerted by cells on deformable substrates. However, the method is computationally intense and cells have to be detached from the substrate prior to measuring the displacement map. We have developed a new method, referred to as Hilbert phase dynamometry (HPD), which yields real-time force fields and, simultaneously, cell dry mass and growth information. HPD operates by imaging cells on a deformable substrate that is patterned with a grid of fluorescent proteins. A Hilbert transform is used to extract the phase map associated with the grid deformation, which provides the displacement field. By combining this information with substrate stiffness, an elasticity model was developed to measure forces exerted by cells with high spatial resolution. In our study, we prepared 10 kPa gels and patterned them with a 2-D grid of a FITC-conjugated fibrinogen/fibronectin mixture, extracellular matrix proteins to which cells adhere. We cultured undifferentiated mesenchymal stem cells (MSCs), and MSCs that were in the process of undergoing adipogenesis and osteogenesis. The cells were measured over the course of 24 hours using Spatial Light Interference Microscopy (SLIM) and wide-field epi-fluorescence microscopy, allowing us to simultaneously measure cell growth and the forces exerted by the cells on the substrate.
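The key signal-processing step in HPD, recovering displacement from the phase of a deformed fringe pattern, can be illustrated in one dimension. The sketch below is an illustrative reconstruction, not the authors' code: it builds the analytic signal with an FFT (so only NumPy is needed) and recovers a uniform grid displacement as a constant phase offset.

```python
import numpy as np

def fringe_phase(signal):
    """Recover the unwrapped phase of a (near-)sinusoidal fringe pattern
    via the analytic signal, the 1-D analogue of the Hilbert-transform
    step in HPD: a displacement of the grid shows up as a phase shift.
    Built directly on the FFT so only NumPy is required."""
    n = len(signal)
    spectrum = np.fft.fft(signal - signal.mean())
    weights = np.zeros(n)
    weights[0] = 1.0
    weights[1:n // 2] = 2.0    # keep positive frequencies, doubled
    if n % 2 == 0:
        weights[n // 2] = 1.0  # Nyquist bin
    analytic = np.fft.ifft(spectrum * weights)
    return np.unwrap(np.angle(analytic))

# A fringe displaced by a uniform amount: the phase offset recovers it.
x = np.arange(2000) * (10 * 2 * np.pi / 2000)  # exactly 10 fringe periods
phase = fringe_phase(np.cos(x + 0.7))
```

In 2-D, the same idea is applied along each grid direction to obtain the full displacement field before feeding it to the elasticity model.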

  13. Generation of the Human Biped Stance by a Neural Controller Able to Compensate Neurological Time Delay

    Science.gov (United States)

    Jiang, Ping; Chiba, Ryosuke; Takakusaki, Kaoru; Ota, Jun

    2016-01-01

    The development of a physiologically plausible computational model of a neural controller that can realize a human-like biped stance is important for a large number of potential applications, such as assisting device development and designing robotic control systems. In this paper, we develop a computational model of a neural controller that can maintain a musculoskeletal model in a standing position, while incorporating a 120-ms neurological time delay. Unlike previous studies that have used an inverted pendulum model, a musculoskeletal model with seven joints and 70 muscular-tendon actuators is adopted to represent the human anatomy. Our proposed neural controller is composed of both feed-forward and feedback controls. The feed-forward control corresponds to the constant activation input necessary for the musculoskeletal model to maintain a standing posture. This compensates for gravity and regulates stiffness. The developed neural controller model can replicate two salient features of the human biped stance: (1) physiologically plausible muscle activations for quiet standing; and (2) selection of a low active stiffness for low energy consumption. PMID:27655271

  14. HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text describing the graphics, and an acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotation of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.

  15. The Debt Overhang Hypothesis: Evidence from Pakistan

    Directory of Open Access Journals (Sweden)

    Shah Muhammad Imran

    2016-04-01

    This study investigates the debt overhang hypothesis for Pakistan in the period 1960-2007. The study examines empirically the dynamic behaviour of GDP, debt services, the employed labour force and investment using the time series concepts of unit roots, cointegration, error correction and causality. Our findings suggest that debt-servicing has a negative impact on the productivity of both labour and capital, and that this in turn has adversely affected economic growth. By severely constraining the country's ability to service its debt, this lends support to the debt-overhang hypothesis in Pakistan. The long-run relation between debt services and economic growth implies that future increases in output will drain away in the form of high debt service payments to lender countries, as external debt acts like a tax on output. More specifically, foreign creditors will benefit more from the rise in productivity than will domestic producers and labour. This suggests that domestic labour and capital are the ultimate losers from this heavy debt burden.

  16. Automatic CT-based finite element model generation for temperature-based death time estimation: feasibility study and sensitivity analysis.

    Science.gov (United States)

    Schenkl, Sebastian; Muggenthaler, Holger; Hubig, Michael; Erdmann, Bodo; Weiser, Martin; Zachow, Stefan; Heinrich, Andreas; Güttler, Felix Victor; Teichgräber, Ulf; Mall, Gita

    2017-05-01

    Temperature-based death time estimation is based either on simple phenomenological models of corpse cooling or on detailed physical heat transfer models. The latter are much more complex but allow a higher accuracy of death time estimation, as in principle all relevant cooling mechanisms can be taken into account. Here, a complete workflow for finite element-based cooling simulation is presented. The following steps are demonstrated on a CT phantom: (1) computed tomography (CT) scan; (2) segmentation of the CT images for thermodynamically relevant features of individual geometries and compilation in a geometric computer-aided design (CAD) model; (3) conversion of the segmentation result into a finite element (FE) simulation model; (4) computation of the model cooling curve (MOD); and (5) calculation of the cooling time (CTE). For the first time in FE-based cooling time estimation, the steps from the CT image over segmentation to FE model generation are performed semi-automatically. The cooling time calculation results are compared to cooling measurements performed on the phantoms under controlled conditions; in this context, the method is validated using a CT phantom. Some of the phantoms' thermodynamic material parameters had to be determined via independent experiments. Moreover, the impact of geometry and material parameter uncertainties on the estimated cooling time is investigated by a sensitivity analysis.
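For contrast with the finite-element approach, the simplest phenomenological model inverts Newton's law of cooling in closed form. The sketch below is a toy illustration, not the paper's method; the initial body temperature and the cooling constant are assumed placeholder values, not calibrated forensic constants.

```python
import math

def time_since_death(T_measured, T_ambient, T_initial=37.2, k=0.06):
    """Invert the single-exponential (Newtonian) cooling law
    T(t) = T_a + (T_0 - T_a) * exp(-k * t) for the elapsed time t in
    hours. T_initial and k are illustrative placeholders here."""
    ratio = (T_measured - T_ambient) / (T_initial - T_ambient)
    if not 0.0 < ratio <= 1.0:
        raise ValueError("measured temperature outside the model's range")
    return -math.log(ratio) / k

# Body at 30.0 degC found in an 18.0 degC room under these assumptions:
t_hours = time_since_death(30.0, 18.0)
```

The FE workflow in the record replaces this single exponential with a full heat-transfer simulation on the segmented geometry, which is exactly why its accuracy is higher.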

  17. The nitric oxide hypothesis of aging.

    Science.gov (United States)

    McCann, S M; Licinio, J; Wong, M L; Yu, W H; Karanth, S; Rettorri, V

    1998-01-01

    Nitric oxide (NO), generated by endothelial (e) NO synthase (NOS) and neuronal (n) NOS, plays a ubiquitous role in the body in controlling the function of almost every, if not every, organ system. Bacterial and viral products, such as bacterial lipopolysaccharide (LPS), induce inducible (i) NOS synthesis that produces massive amounts of NO toxic to the invading viruses and bacteria, but also host cells by inactivation of enzymes leading to cell death. The actions of all forms of NOS are mediated not only by the free radical oxidant properties of this soluble gas, but also by its activation of guanylate cyclase (GC), leading to the production of cyclic guanosine monophosphate (cGMP) that mediates many of its physiological actions. In addition, NO activates cyclooxygenase and lipoxygenase, leading to the production of physiologically relevant quantities of prostaglandin E2 (PGE2) and leukotrienes. In the case of iNOS, the massive release of NO, PGE2, and leukotrienes produces toxic effects. Systemic injection of LPS causes induction of interleukin (IL)-1 beta mRNA followed by IL-beta synthesis that induces iNOS mRNA with a latency of two and four hours, respectively, in the anterior pituitary and pineal glands, meninges, and choroid plexus, regions outside the blood-brain barrier, and shortly thereafter, in hypothalamic regions, such as the temperature-regulating centers, paraventricular nucleus containing releasing and inhibiting hormone neurons, and the arcuate nucleus, a region containing these neurons and axons bound for the median eminence. We are currently determining if LPS similarly activates cytokine and iNOS production in the cardiovascular system and the gonads. Our hypothesis is that recurrent infections over the life span play a significant role in producing aging changes in all systems outside the blood-brain barrier via release of toxic quantities of NO. NO may be a major factor in the development of coronary heart disease (CHD). 
Considerable evidence

  18. Explicit symplectic algorithms based on generating functions for relativistic charged particle dynamics in time-dependent electromagnetic field

    Science.gov (United States)

    Zhang, Ruili; Wang, Yulei; He, Yang; Xiao, Jianyuan; Liu, Jian; Qin, Hong; Tang, Yifa

    2018-02-01

    Relativistic dynamics of a charged particle in time-dependent electromagnetic fields has theoretical significance and a wide range of applications. The numerical simulation of relativistic dynamics is often multi-scale and requires accurate long-term numerical simulations. Therefore, explicit symplectic algorithms are much more preferable than non-symplectic methods and implicit symplectic algorithms. In this paper, we employ the proper time and express the Hamiltonian as the sum of exactly solvable terms and product-separable terms in space-time coordinates. Then, we give the explicit symplectic algorithms based on the generating functions of orders 2 and 3 for relativistic dynamics of a charged particle. The methodology itself is not new, having been applied previously to non-relativistic dynamics of charged particles, but the algorithm for relativistic dynamics has much significance in practical simulations, such as the secular simulation of runaway electrons in tokamaks.
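The practical appeal of symplectic integrators, bounded energy error over very long runs, can be seen even in the simplest first-order scheme. The sketch below is a generic illustration (a semi-implicit Euler step for a harmonic oscillator), not the paper's order-2 and order-3 generating-function algorithms for relativistic dynamics.

```python
def symplectic_euler(q, p, force, dt, steps):
    """First-order symplectic (semi-implicit) Euler for a separable
    Hamiltonian H = p**2/2 + V(q). Illustrates why symplectic schemes
    suit long-term simulation: the energy error stays bounded rather
    than drifting secularly."""
    traj = [(q, p)]
    for _ in range(steps):
        p = p + dt * force(q)  # kick: update momentum with the old position
        q = q + dt * p         # drift: update position with the new momentum
        traj.append((q, p))
    return traj

# Harmonic oscillator (force = -q), integrated over roughly 16 periods.
traj = symplectic_euler(q=1.0, p=0.0, force=lambda q: -q, dt=0.1, steps=1000)
energies = [0.5 * (p * p + q * q) for q, p in traj]
```

Over the whole run the computed energy oscillates within a narrow band around its initial value, whereas a non-symplectic explicit Euler step of the same order would gain energy monotonically.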

  19. Coherent creation of photon pairs and generation of time-bin entangled photons from a quantum dot

    International Nuclear Information System (INIS)

    Harishankar, J.

    2013-01-01

    Semiconductor quantum dots are proven sources of single photons and entangled photon pairs. They are compact sources with the potential to find applications in quantum information processing. In the present work photon pairs were coherently created through resonant two-photon excitation of a biexciton in a single self-assembled semiconductor quantum dot. Emitted photons were collected in single mode fibers and correlation measurements were performed to determine the photon statistics. Measurements showed that the generated photons were anti-bunched with complete suppression of multi-photon emission. This excitation process was used to generate time-bin entangled photons from a single quantum dot. The existence of the entanglement was confirmed through two-photon interferometry based quantum state tomography. (author)

  20. Life cycle assessment and evaluation of energy payback time on high-concentration photovoltaic power generation system

    International Nuclear Information System (INIS)

    Nishimura, A.; Hayashi, Y.; Tanaka, K.; Hirota, M.; Kato, S.; Ito, M.; Araki, K.; Hu, E.J.

    2010-01-01

    In this study, the environmental load of photovoltaic power generation system (PV) during its life cycle and energy payback time (EPT) are evaluated by LCA scheme. Two hypothetical case studies in Toyohashi, Japan and the Gobi desert in China have been carried out to investigate the influence of installation location and PV type on environmental load and EPT. The environmental load and EPT of a high-concentration photovoltaic power generation system (hcpV) and a multi-crystalline silicon photovoltaic power generation system (mc-Si PV) are studied. The study shows for a PV of 100 MW size, the total impacts of the hcpV installed in Toyohashi is larger than that of the hcpV installed in Gobi desert by 5% without consideration of recycling stage. The EPT of the hcpV assumed to be installed in Gobi desert is shorter than EPT of the hcpV assumed to be installed in Toyohashi by 0.64 year. From these results, the superiority of installing PV in the Gobi desert is confirmed. Comparing hcpV with mc-Si PV, the ratio of the total impacts of mc-Si PV to that of hcpV is 0.34 without consideration of recycling stage. The EPT of hcpV is longer than EPT of mc-Si PV by 0.27 year. The amount of global solar radiation contributing to the amount of power generation of mc-Si PV is larger than the amount of direct solar radiation contributing to the amount of power generation of hcpV by about 188 kW h/(m² year) in Gobi desert. Consequently, it appears that using mc-Si PV in Gobi desert is the best option.
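The energy payback time figures above follow from a simple ratio: total life-cycle energy input divided by annual energy generation. A minimal sketch, with made-up numbers rather than the study's data:

```python
def energy_payback_time(lifecycle_energy_kwh, annual_generation_kwh):
    """EPT (years) = primary energy consumed over the whole life cycle
    (production, transport, installation, disposal) divided by the
    electricity the system generates per year."""
    return lifecycle_energy_kwh / annual_generation_kwh

# Hypothetical figures for illustration only (not from the study):
# 3.0e8 kWh embodied energy, 1.5e8 kWh generated per year.
ept_years = energy_payback_time(3.0e8, 1.5e8)
```

Installing the same system where solar radiation is higher raises the denominator, which is why the Gobi desert cases show shorter EPTs.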

  1. Dynamics in next-generation solar cells: time-resolved surface photovoltage measurements of quantum dots chemically linked to ZnO (101̄0).

    Science.gov (United States)

    Spencer, Ben F; Cliffe, Matthew J; Graham, Darren M; Hardman, Samantha J O; Seddon, Elaine A; Syres, Karen L; Thomas, Andrew G; Sirotti, Fausto; Silly, Mathieu G; Akhtar, Javeed; O'Brien, Paul; Fairclough, Simon M; Smith, Jason M; Chattopadhyay, Swapan; Flavell, Wendy R

    2014-01-01

    The charge dynamics at the surface of the transparent conducting oxide and photoanode material ZnO are investigated in the presence and absence of light-harvesting colloidal quantum dots (QDs). The time-resolved change in surface potential upon photoexcitation has been measured in the m-plane ZnO (101̄0) using a laser pump-synchrotron X-ray probe methodology. By varying the oxygen annealing conditions, and hence the oxygen vacancy concentration of the sample, we find that dark carrier lifetimes at the ZnO surface vary from hundreds of μs to ms timescales, i.e. a persistent photoconductivity (PPC) is observed. The highly-controlled nature of our experiments under ultra-high vacuum (UHV), and the use of band-gap and sub-band-gap photoexcitation, allow us to demonstrate that defect states ca. 340 meV above the valence band edge are directly associated with the PPC, and that the PPC mediated by these defects dominates over the oxygen photodesorption mechanism. These observations are consistent with the hypothesis that ionized oxygen vacancy states are responsible for the PPC in ZnO. The effect of chemically linking two colloidal QD systems (type I PbS and type II CdS-ZnSe) to the surface has also been investigated. Upon deposition of the QDs onto the surface, the dark carrier lifetime and the surface photovoltage are reduced, suggesting a direct injection of charge carriers into the ZnO conduction band. The results are discussed in the context of the development of next-generation solar cells.

  2. PMJ panel discussion overview on mask complexities, cost, and cycle time in 32-nm system LSI generation: conflict or concurrent?

    Science.gov (United States)

    Hosono, Kunihiro; Kato, Kokoro

    2008-10-01

    This is a report on a panel discussion organized at Photomask Japan 2008, where the challenges of "Mask Complexities, Cost, and Cycle Time in 32-nm System LSI Generation" were addressed to survey possible solutions from the standpoints of the chipmaker, commercial mask shop, DA tool vendor and equipment makers. The wrap-up is as follows: Mask complexities justify the mask cost, while the acceptable rate of increase in 32-nm mask cost differs significantly between mask suppliers and users. Efficiency gains from new tools and DFM have driven cycle-time reductions. Mask complexity and cost will be crucial issues prior to cycle time, and there seems to be a linear correlation between them. Controlling complexity and cycle time requires developing a mix of advanced technologies; for cost reduction in particular, shot prices in writers and processing rates in inspection tools have been improved remarkably by tool makers. In addition, the activities of a consortium in Japan (Mask D2I) are expected to enhance the total optimization of mask design, writing and inspection. Cycle-time reduction potentially drives the lowering of mask cost; on the other hand, pattern complexity and tighter mask specifications stand in the way of the 32-nm generation, as do nano-economics and market challenges. There are still many difficult problems in mask manufacturing, but we expect to overcome the 32-nm hurdle through advances in technology and through collaboration not only in technology but also in finance.

  3. Comparison of different strategies for using fossil calibrations to generate the time prior in Bayesian molecular clock dating.

    Science.gov (United States)

    Barba-Montoya, Jose; Dos Reis, Mario; Yang, Ziheng

    2017-09-01

    Fossil calibrations are the ultimate source of information for resolving the distances between molecular sequences into estimates of absolute times and absolute rates in molecular clock dating analysis. The quality of calibrations is thus expected to have a major impact on divergence time estimates even if a huge amount of molecular data is available. In Bayesian molecular clock dating, fossil calibration information is incorporated in the analysis through the prior on divergence times (the time prior). Here, we evaluate three strategies for converting fossil calibrations (in the form of minimum- and maximum-age bounds) into the prior on times, which differ according to whether they borrow information from the maximum age of ancestral nodes and minimum age of descendent nodes to form constraints for any given node on the phylogeny. We study a simple example that is analytically tractable, and analyze two real datasets (one of 10 primate species and another of 48 seed plant species) using three Bayesian dating programs: MCMCTree, MrBayes and BEAST2. We examine how different calibration strategies, the birth-death process, and automatic truncation (to enforce the constraint that ancestral nodes are older than descendent nodes) interact to determine the time prior. In general, truncation has a great impact on calibrations so that the effective priors on the calibration node ages after the truncation can be very different from the user-specified calibration densities. The different strategies for generating the effective prior also had considerable impact, leading to very different marginal effective priors. Arbitrary parameters used to implement minimum-bound calibrations were found to have a strong impact upon the prior and posterior of the divergence times. Our results highlight the importance of inspecting the joint time prior used by the dating program before any Bayesian dating analysis. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
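The truncation effect described above, where enforcing "ancestor older than descendant" reshapes the user-specified calibration densities, is easy to demonstrate by simulation. The sketch below uses made-up uniform calibrations, not the paper's datasets or any particular dating program:

```python
import random

def truncated_time_prior(n, seed=1):
    """Toy two-node time prior: ancestor age ~ U(10, 20), descendant age
    ~ U(5, 15), truncated so that the ancestor is strictly older. The
    bounds are invented; the point is that truncation shifts the
    effective marginal prior away from the specified uniform density."""
    rng = random.Random(seed)
    kept = []
    while len(kept) < n:
        ancestor = rng.uniform(10.0, 20.0)
        descendant = rng.uniform(5.0, 15.0)
        if ancestor > descendant:  # enforce node-age ordering
            kept.append((ancestor, descendant))
    return kept

pairs = truncated_time_prior(20000)
# The effective marginal mean of the descendant age falls below the
# untruncated mean of 10, because large descendant ages get rejected.
mean_descendant = sum(d for _, d in pairs) / len(pairs)
```

This is the same phenomenon the authors recommend checking for by inspecting the joint time prior actually used by the dating program.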

  4. The Younger Dryas impact hypothesis: A requiem

    Science.gov (United States)

    Pinter, Nicholas; Scott, Andrew C.; Daulton, Tyrone L.; Podoll, Andrew; Koeberl, Christian; Anderson, R. Scott; Ishman, Scott E.

    2011-06-01

    The Younger Dryas (YD) impact hypothesis is a recent theory that suggests that a cometary or meteoritic body or bodies hit and/or exploded over North America 12,900 years ago, causing the YD climate episode, extinction of Pleistocene megafauna, demise of the Clovis archeological culture, and a range of other effects. Since gaining widespread attention in 2007, substantial research has focused on testing the 12 main signatures presented as evidence of a catastrophic extraterrestrial event 12,900 years ago. Here we present a review of the impact hypothesis, including its evolution and current variants, and of efforts to test and corroborate the hypothesis. The physical evidence interpreted as signatures of an impact event can be separated into two groups. The first group consists of evidence that has been largely rejected by the scientific community and is no longer in widespread discussion, including: particle tracks in archeological chert; magnetic nodules in Pleistocene bones; impact origin of the Carolina Bays; and elevated concentrations of radioactivity, iridium, and fullerenes enriched in 3He. The second group consists of evidence that has been active in recent research and discussions: carbon spheres and elongates, magnetic grains and magnetic spherules, byproducts of catastrophic wildfire, and nanodiamonds. Over time, however, these signatures have also seen contrary evidence rather than support. Recent studies have shown that carbon spheres and elongates do not represent extraterrestrial carbon nor impact-induced megafires, but are indistinguishable from fungal sclerotia and arthropod fecal material that are a small but common component of many terrestrial deposits. Magnetic grains and spherules are heterogeneously distributed in sediments, but reported measurements of unique peaks in concentrations at the YD onset have yet to be reproduced. The magnetic grains are certainly just iron-rich detrital grains, whereas reported YD magnetic spherules are

  5. Graphic tests of Easterlin's hypothesis: science or art?

    Science.gov (United States)

    Rutten, A; Higgs, R

    1984-01-01

    Richard Easterlin believes that the postwar fertility cycle is uniquely consistent with the hypothesis of his relative income model of fertility, yet a closer examination of his evidence shows that the case for the relative income explanation is much weaker than it initially appears. Easterlin finds the postwar baby boom a transparent event. Couples who entered the labor market in the postwar period had very low material aspirations. Having grown up during the Great Depression and World War II, they were content with a modest level of living. Their labor market experience was very good. Tight restrictions on immigration kept aliens from coming in to fill the gap. Thus the members of this generation occupied an unprecedented position. They could easily meet and even exceed their expectations. This high level of relative income meant that they could have more of everything they wanted, including children. For the children born during the baby boom, all this was reversed, and hence the seeds of the baby bust were sown. To test this hypothesis, Easterlin compared the movements of relative income and fertility over the postwar years using a graph. 4 published versions of the graph are presented. The graph shows that relative income and fertility did move together over the cycle, apparently very closely. Easterlin's measure of fertility is the total fertility rate (TFR). There is no such direct measure of relative income. Easterlin develops 2 proxies based on changing economic conditions believed to shape the level of material aspirations. His preferred measure, labeled R or income in his graph, relates the income experience of young couples in the years previous to marriage to that of their parents in the years before the young people left home. Because the available data limit construction of this index to the years after 1956, another measure, labeled Re or employment in Easterlin's graphs, is constructed for the pre-1956 period. This measure relates the average of

  6. Timing of first union among second-generation Turks in Europe: The role of parents, peers and institutional context

    Directory of Open Access Journals (Sweden)

    Doreen Huschek

    2010-03-01

    This study examines the influence of parents and peers on first union timing among the Turkish second generation in Europe using pooled data from the TIES survey. Cross-national differences in union formation are assessed by comparing countries with different integration policies and welfare regimes. Analyses show that both parents and peers are relevant predictors of entry into union: More modern parental characteristics and contact with non-coethnic peers result in postponement of union entry. Furthermore, parental and peer influences are found to be rather similar in all seven countries despite a variety of integration policies. Actual timing differences between countries may be caused by welfare state provisions directed at young adults.

  7. RPD-based Hypothesis Reasoning for Cyber Situation Awareness

    Science.gov (United States)

    Yen, John; McNeese, Michael; Mullen, Tracy; Hall, David; Fan, Xiaocong; Liu, Peng

    Intelligence workers such as analysts, commanders, and soldiers often need a hypothesis reasoning framework to gain improved situation awareness of the highly dynamic cyber space. The development of such a framework requires the integration of interdisciplinary techniques, including supports for distributed cognition (human-in-the-loop hypothesis generation), supports for team collaboration (identification of information for hypothesis evaluation), and supports for resource-constrained information collection (hypotheses competing for information collection resources). We here describe a cognitively-inspired framework that is built upon Klein’s recognition-primed decision model and integrates the three components of Endsley’s situation awareness model. The framework naturally connects the logic world of tools for cyber situation awareness with the mental world of human analysts, enabling the perception, comprehension, and prediction of cyber situations for better prevention, survival, and response to cyber attacks by adapting missions at the operational, tactical, and strategic levels.

  8. The Variability Hypothesis: The History of a Biological Model of Sex Differences in Intelligence.

    Science.gov (United States)

    Shields, Stephanie A.

    1982-01-01

    Describes the origin and development of the variability hypothesis as applied to the study of social and psychological sex differences. Explores changes in the hypothesis over time, social and scientific factors that fostered its acceptance, and possible parallels between the variability hypothesis and contemporary theories of sex differences.…

  9. Time value of emission and technology discounting rate for off-grid electricity generation in India using intermediate pyrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Patel, Amit, E-mail: amitrp@iitrpr.ac.in [Indian Institute of Technology Ropar, Nangal Road, Rupnagar 140001, Punjab (India); Faculty of Technology and Engineering, The Maharaja Sayajirao University of Baroda, Vadodara 390001, Gujarat (India); Sarkar, Prabir; Tyagi, Himanshu; Singh, Harpreet [Indian Institute of Technology Ropar, Nangal Road, Rupnagar 140001, Punjab (India)

    2016-07-15

    The environmental impact assessment of a process over its entire operational lifespan is an important issue. Estimation of life-cycle emission helps in predicting the contribution of a given process to abating (or polluting) the environmental emission scenario. Considering the diminishing and time-dependent effect of emission, assessment of the overall effect of emissions is very complex. The paper presents a generalized methodology for arriving at a single emission discounting number for a process option, using the concept of the time value of carbon emission flow. This number incorporates the effect of the emissions resulting from the process over its entire operational lifespan. The advantage of this method is its quantitative aspect as well as its flexible nature: it can be applied to any process. The method is demonstrated with the help of an Intermediate Pyrolysis process used to generate off-grid electricity, opting for the biochar route for disposing of straw residue. Scenarios ranging from very high net emission to very high net carbon sequestration are generated by careful selection of process parameters. For these different scenarios, the process discounting rate was determined and its outcome is discussed. The paper also proposes a process-specific eco-label that mentions the discounting rates. - Highlights: • A methodology to obtain the emission discounting rate for a process is proposed. • The method includes all components of life-cycle emission and converts them into a time-dependent discounting number. • A case study of Intermediate Pyrolysis is used to obtain such numbers for a range of processes. • The method is useful to determine whether the operation of a process will lead to net absorption or net accumulation of emissions in the environment.
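The "time value of emission" idea parallels net present value in finance: each year's net emission is discounted back to the present and summed into one number. A minimal sketch with invented figures (the paper derives a process-specific discounting rate; the stream and rate below are illustrative only):

```python
def discounted_net_emission(annual_emissions, rate):
    """Collapse a stream of annual net emissions (positive = release,
    negative = sequestration) into a single present-value figure,
    using the same arithmetic as financial NPV."""
    return sum(e / (1.0 + rate) ** year
               for year, e in enumerate(annual_emissions))

# Invented stream: biochar sequestration up front, small releases later.
net = discounted_net_emission([-100.0, 30.0, 30.0, 30.0], rate=0.05)
# net < 0 here: the process counts as a net absorber at this rate.
```

Because discounting weights early years more heavily, the same physical emission stream can switch between net absorber and net polluter as the rate changes, which is what the single discounting number captures.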

  10. Time value of emission and technology discounting rate for off-grid electricity generation in India using intermediate pyrolysis

    International Nuclear Information System (INIS)

    Patel, Amit; Sarkar, Prabir; Tyagi, Himanshu; Singh, Harpreet

    2016-01-01

    The environmental impact assessment of a process over its entire operational lifespan is an important issue. Estimation of life cycle emission helps in predicting the contribution of a given process to abate (or to pollute) the environmental emission scenario. Considering diminishing and time-dependent effect of emission, assessment of the overall effect of emissions is very complex. The paper presents a generalized methodology for arriving at a single emission discounting number for a process option, using the concept of time value of carbon emission flow. This number incorporates the effect of the emission resulting from the process over the entire operational lifespan. The advantage of this method is its quantitative aspect as well as its flexible nature. It can be applied to any process. The method is demonstrated with the help of an Intermediate Pyrolysis process when used to generate off-grid electricity and opting biochar route for disposing straw residue. The scenarios of very high net emission to very high net carbon sequestration is generated using process by careful selection of process parameters for different scenarios. For these different scenarios, the process discounting rate was determined and its outcome is discussed. The paper also proposes a process specific eco-label that mentions the discounting rates. - Highlight: • Methodology to obtain emission discounting rate for a process is proposed. • The method includes all components of life cycle emission converts into a time dependent discounting number. • A case study of Intermediate Pyrolysis is used to obtain such number for a range of processes. • The method is useful to determine if the effect from the operation of a process will lead to a net absorption of emission or net accumulation of emission in the environment.

  11. Ethnic background and television viewing time among 4-year-old preschool children: the generation R study.

    Science.gov (United States)

    Wijtzes, Anne I; Jansen, Wilma; Jaddoe, Vincent W V; Moll, Henriëtte A; Tiemeier, Henning; Verhulst, Frank C; Hofman, Albert; Mackenbach, Johan P; Raat, Hein

    2013-02-01

    Children's television viewing has been associated with an increased risk of overweight and obesity. This study aims to assess the associations of ethnic background and acculturation characteristics with television viewing time in 4-year-old preschool children. The authors analyzed data from 3452 preschool children and their parents enrolled in the Generation R Study, a large, multiethnic, prospective birth cohort study in Rotterdam, the Netherlands. Multivariable logistic regression models were used to estimate odds ratios of watching television ≥2 hours/day and ≥1 hour/day for Turkish, Moroccan, and Surinamese children (reference group: native Dutch children), adjusted for family socioeconomic position. Effect modification by family socioeconomic position was also assessed. After adjustment for family socioeconomic position, Turkish children (adjusted odds ratio [aOR], 2.27; 95% confidence interval [CI], 1.56-3.30), Moroccan children (aOR, 1.68; 95% CI, 1.03-2.76), and Surinamese children (aOR, 3.12; 95% CI, 2.16-4.50) were significantly more likely to watch television ≥2 hours/day compared with native Dutch children. Stratified analyses showed greater disparity between ethnic minority groups and native Dutch children at higher educational levels. There were no significant associations between acculturation characteristics (i.e., generational status, age at immigration, and Dutch language skills) and children's television viewing time. Children from ethnic minority groups are at an increased risk for high levels of television viewing compared with native Dutch children, independent of family socioeconomic position. Interventions aimed to reduce television viewing time should target all children from ethnic minority groups.
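The adjusted odds ratios in this abstract come from multivariable logistic regression, which requires the full individual-level data. As a minimal illustration of how an unadjusted odds ratio and its 95% Wald confidence interval are computed from a 2x2 table, consider the sketch below; the counts are invented, not the Generation R data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.959964):
    """Unadjusted odds ratio and 95% Wald confidence interval from a
    2x2 table: a/b = exposed with/without outcome, c/d = unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 60 of 150 minority-group children vs 400 of 2000
# reference-group children watching TV >= 2 h/day.
or_, lo, hi = odds_ratio_ci(60, 90, 400, 1600)  # OR ~ 2.67
```

An interval excluding 1.0, as here, corresponds to a statistically significant association at the 5% level.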

  12. Effect of Long Time Oxygen Exposure on Power Generation of Microbial Fuel Cell with Enriched Mixed Culture

    International Nuclear Information System (INIS)

Mimi Hani Abu Bakar; Pasco, N.F.; Gooneratne, R.; Hong, K.B.

    2016-01-01

In this study, we are interested in the effect of long-term exposure of microbial fuel cells (MFCs) to air on their electrochemical performance. The MFCs were enriched using effluent from an MFC operated for about eight months. After 30 days, the condition of these systems was reversed from aerobic to anaerobic and vice versa, and the effects were observed for 11 days. The results show that for anaerobic MFCs, power generation was reduced when the anodes were exposed to dissolved oxygen of 7.5 ppm. Long exposure of the anodic biofilm to air led to poor electrochemical performance. Power generation recovered fully when air stopped entering the anode compartment, with a reduction of internal resistance of up to 53 %. The study showed that the mixed facultative microorganisms were able to survive the aerobic condition for about a month at 7.5 ppm oxygen or less. The anaerobic condition turned these microbes into exoelectrogens, producing considerable power relative to their aerobic state. (author)

  13. Generating Multispectral VIIRS Imagery in Near Real-Time for Use by the National Weather Service in Alaska

    Science.gov (United States)

    Broderson, D.; Dierking, C.; Stevens, E.; Heinrichs, T. A.; Cherry, J. E.

    2016-12-01

The Geographic Information Network of Alaska (GINA) at the University of Alaska Fairbanks (UAF) uses two direct broadcast antennas to receive data from a number of polar-orbiting weather satellites, including the Suomi National Polar Partnership (S-NPP) satellite. GINA uses data from S-NPP's Visible Infrared Imaging Radiometer Suite (VIIRS) to generate a variety of multispectral imagery products developed with the needs of the National Weather Service operational meteorologist in mind. Multispectral products have two primary advantages over single-channel products. First, they can more clearly highlight some terrain and meteorological features which are less evident in the component single channels. Second, multispectral products present the information from several bands in a single image, thereby sparing the meteorologist unnecessary time interrogating the component single bands individually. With 22 channels available from the VIIRS instrument, the number of possible multispectral products is theoretically huge. A small number of products will be emphasized in this presentation, with the products chosen based on their proven utility in the forecasting environment. Multispectral products can be generated upstream of the end user or by the end user at their own workstation. The advantages and disadvantages of both approaches will be outlined. Lastly, the technique of improving the appearance of multispectral imagery by correcting for atmospheric reflectance at the shorter wavelengths will be described.

  14. Generation of digital time database from paper ECG records and Fourier transform-based analysis for disease identification.

    Science.gov (United States)

    Mitra, Sucharita; Mitra, M; Chaudhuri, B B

    2004-10-01

ECG signals recorded on paper are transferred to a digital time database with the help of an automated data extraction system developed here. A flatbed scanner is used to form an image database of each 12-lead ECG signal. Those images are then fed into a Pentium PC, where pixel-to-pixel coordinate information is extracted to form a raw database with the help of some image processing techniques. These raw data are then ported to the regeneration domain of the system to check the captured pattern against the original wave shape. The sampling period of each ECG signal is computed after detection of the QRS complex. Finally, the discrete Fourier transform of the generated database is performed to observe the frequency response properties of every ECG signal. Some interesting amplitude properties of monopolar chest leads V4 and V6 are noted and discussed.
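Once a paper trace has been reduced to uniformly sampled amplitudes, the frequency analysis step is a standard discrete Fourier transform. A minimal sketch, using a synthetic sine wave as a stand-in for a digitized lead (the sampling rate and signal are assumptions, not real ECG data):

```python
import cmath
import math

def dft_magnitudes(samples):
    """Plain discrete Fourier transform magnitude spectrum of a real
    signal (O(N^2), adequate for a few hundred digitized samples)."""
    n = len(samples)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(samples)))
            for k in range(n)]

# Synthetic stand-in for one second of a digitized lead: a 1 Hz tone
# sampled at 250 Hz.
fs = 250
samples = [math.sin(2 * math.pi * 1.0 * t / fs) for t in range(fs)]
mags = dft_magnitudes(samples)
peak_bin = max(range(1, fs // 2), key=lambda k: mags[k])
peak_hz = peak_bin * fs / len(samples)  # dominant frequency in Hz
```

For real digitized records one would use an FFT library instead, but the spectrum returned is the same.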

  15. GPS positioning accuracy using the current generation of IGU near real-time observed and predicted orbits from the IGS

    Science.gov (United States)

    Weston, N. D.; Ray, J. R.

    2009-04-01

We investigate the performance of the latest generation of IGU near real-time observed and predicted orbits provided by the International GNSS Service (IGS) by comparing positioning results using them with those from their rapid and final counterparts. The ultra-rapid near real-time observed orbits (first half of each two-day SP3 file) have an initial latency of about three hours and are updated four times a day, while the predicted half orbits (second day of each SP3 file) are available for true real-time applications. The early IGS ultra-rapid orbits of both types, which began in late 2000, had an estimated accuracy of 5 to 10 cm. The accuracy of the ultra-rapid orbits has improved significantly: they now have mean weighted RMS residuals relative to the IGS rapid orbits of about 2.5 cm (after fitting and removing a daily Helmert transformation), with mean median residuals of less than 2 cm. Rotational offsets of the GPS constellation due to EOP prediction errors, especially for UT1, are usually larger than random orbit errors, reaching up to about 3.5 mm RMS around the Z axis (equatorial at GPS altitude). The rapid and final orbits, which we use for reference here, are more accurate but have latencies of approximately 17 hours and 13 days, respectively. To evaluate the positioning performance of the IGS near real-time and real-time orbits, GPS reference station data collected for a sample of days during the second half of 2008 from about 72 CORS sites throughout the United States have been processed using each of those orbit types as well as the IGS rapid and final products. A time series of daily position estimates for each GPS station has been determined for each of the four orbit types and the respective repeatabilities computed. The initial results from this study are promising and show that the accuracy in computing reference station coordinates using the ultra-rapid orbits has improved significantly. There are, however, slightly larger variations from the mean
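The "repeatabilities" mentioned above are typically computed as the RMS scatter of the daily coordinate estimates about their mean, per station and component. A minimal sketch of that computation (the daily values are invented, not from this study):

```python
import math

def repeatability(series):
    """Sample RMS scatter of daily coordinate estimates about their mean,
    a standard summary of positioning consistency per component."""
    n = len(series)
    mean = sum(series) / n
    return math.sqrt(sum((x - mean) ** 2 for x in series) / (n - 1))

# Invented daily north-component estimates (metres) for one station.
north = [0.0021, 0.0018, 0.0025, 0.0019, 0.0022, 0.0020]
rep = repeatability(north)  # a few tenths of a millimetre
```

Comparing these per-orbit-type repeatabilities is what quantifies how much the ultra-rapid products have closed the gap on the rapid and final products.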

  16. Teaching hypothesis testing: a necessary challenge

    NARCIS (Netherlands)

    Post, Wendy J.; van Duijn, Marijtje A.J.; Makar, Katie; de Sousa, Bruno; Gould, Robert

In recent decades, a debate has been going on about the use of hypothesis testing. This has led some teachers to think that confidence intervals and effect sizes should be taught instead of formal hypothesis testing with p-values. Although we see shortcomings of the use of p-values in statistical

  17. Hypothesis elimination on a quantum computer

    OpenAIRE

    Soklakov, Andrei N.; Schack, Ruediger

    2004-01-01

    Hypothesis elimination is a special case of Bayesian updating, where each piece of new data rules out a set of prior hypotheses. We describe how to use Grover's algorithm to perform hypothesis elimination for a class of probability distributions encoded on a register of qubits, and establish a lower bound on the required computational resources.

  18. Hypothesis Testing in the Real World

    Science.gov (United States)

    Miller, Jeff

    2017-01-01

    Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…

  19. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  20. Predictions from high scale mixing unification hypothesis

    Indian Academy of Sciences (India)

    2016-01-09

Starting with the 'high scale mixing unification' hypothesis, we investigate the renormalization group evolution of mixing parameters and masses for both Dirac and Majorana-type neutrinos. Following this hypothesis, the PMNS mixing parameters are taken to be identical to the CKM ones at a unifying high ...

  1. Evidence for the spotting hypothesis in gymnasts.

    Science.gov (United States)

    Heinen, Thomas

    2011-04-01

    The goal of this study was to investigate the visual spotting hypothesis in 10 experts and 10 apprentices as they perform back aerial somersaults from a standing position with no preparatory jumps (short flight duration condition) and after some preparatory jumps with a flight time of 1s (long flight duration condition). Differences in gaze behavior and kinematics were expected between experts and apprentices and between experimental conditions. Gaze behavior was measured using a portable and wireless eye-tracking system in combination with a movement-analysis system. Experts exhibited a smaller landing deviation from the middle of the trampoline bed than apprentices. Experts showed higher fixation ratios during the take-off and flight phase. Experts exhibited no blinks in any of the somersaults in both conditions, whereas apprentices showed significant blink ratios in both experimental conditions. The findings suggest that gymnasts can use visual spotting during the back aerial somersault, even when the time of flight is delimited. We conclude that knowledge about gaze-movement relationships may help coaches develop specific training programs in the learning process of the back aerial somersault.

  2. Unit cell hypothesis for Streptococcus faecalis.

    Science.gov (United States)

    Edelstein, E M; Rosenzweig, M S; Daneo-Moore, L; Higgins, M L

    1980-07-01

The mass doubling times of exponential-phase cultures of Streptococcus faecalis were varied from 30 to 110 min by omitting glutamine from a defined growth medium and providing different concentrations of glutamate (ranging from 300 to 14 µg/ml). After Formalin fixation, cells were dried by the critical point method, and carbon-platinum replicas were prepared. The surface area and volume of cell poles seen in these replicas were estimated by a computer-assisted, three-dimensional reconstruction technique. It was found that the amount of surface area and volume of poles seen in these replicas were independent of the growth rate of culture from which the samples were taken. These observations were consistent with the unit cell model hypothesis of Donachie and Begg, in which a small number of surface sites would produce a constant amount of new cell surface regardless of the mass doubling time of the culture. However, measurements of the thickness of the cell wall taken from thin sections of the same cells showed that the cell wall increased in thickness as a function of the increase in cellular peptidoglycan content which occurs when the growth rate of this organism is slowed down by a decrease in glutamate concentration. Thus, it would seem that although the size of polar shells made by S. faecalis is invariant with growth rate, the amount of wall precursors used to construct these shells is not.

  3. Hypothesis testing in hydrology: Theory and practice

    Science.gov (United States)

    Kirchner, James; Pfister, Laurent

    2017-04-01

    Well-posed hypothesis tests have spurred major advances in hydrological theory. However, a random sample of recent research papers suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias - the tendency to value and trust confirmations more than refutations - among both researchers and reviewers. Hypothesis testing is not the only recipe for scientific progress, however: exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.

  4. Mobility and generation of mosaic non-autonomous transposons by Tn3-derived inverted-repeat miniature elements (TIMEs).

    Science.gov (United States)

    Szuplewska, Magdalena; Ludwiczak, Marta; Lyzwa, Katarzyna; Czarnecki, Jakub; Bartosik, Dariusz

    2014-01-01

    Functional transposable elements (TEs) of several Pseudomonas spp. strains isolated from black shale ore of Lubin mine and from post-flotation tailings of Zelazny Most in Poland, were identified using a positive selection trap plasmid strategy. This approach led to the capture and characterization of (i) 13 insertion sequences from 5 IS families (IS3, IS5, ISL3, IS30 and IS1380), (ii) isoforms of two Tn3-family transposons--Tn5563a and Tn4662a (the latter contains a toxin-antitoxin system), as well as (iii) non-autonomous TEs of diverse structure, ranging in size from 262 to 3892 bp. The non-autonomous elements transposed into AT-rich DNA regions and generated 5- or 6-bp sequence duplications at the target site of transposition. Although these TEs lack a transposase gene, they contain homologous 38-bp-long terminal inverted repeat sequences (IRs), highly conserved in Tn5563a and many other Tn3-family transposons. The simplest elements of this type, designated TIMEs (Tn3 family-derived Inverted-repeat Miniature Elements) (262 bp), were identified within two natural plasmids (pZM1P1 and pLM8P2) of Pseudomonas spp. It was demonstrated that TIMEs are able to mobilize segments of plasmid DNA for transposition, which results in the generation of more complex non-autonomous elements, resembling IS-driven composite transposons in structure. Such transposon-like elements may contain different functional genetic modules in their core regions, including plasmid replication systems. Another non-autonomous element "captured" with a trap plasmid was a TIME derivative containing a predicted resolvase gene and a res site typical for many Tn3-family transposons. The identification of a portable site-specific recombination system is another intriguing example confirming the important role of non-autonomous TEs of the TIME family in shuffling genetic information in bacterial genomes. 
Transposition of such mosaic elements may have a significant impact on diversity and evolution, not

  6. Does the temporal mismatch hypothesis match in boreal populations?

    Science.gov (United States)

    Vatka, Emma; Rytkönen, Seppo; Orell, Markku

    2014-10-01

The temporal mismatch hypothesis suggests that fitness is related to the degree of temporal synchrony between the energetic needs of the offspring and their food supply. The hypothesis has been a basis for studying the influence of climate warming on nature. This study enhances knowledge of the prevalence of temporal mismatches and their consequences in boreal populations, and questions the role of the temporal mismatch hypothesis as the principal explanation for the evolution of timing of breeding. To test this, we examined whether synchrony with caterpillar prey or timing of breeding per se better explains reproductive output in North European parid populations. We compared responses of temperate-origin species, the great tit (Parus major) and the blue tit (Cyanistes caeruleus), and a boreal species, the willow tit (Poecile montanus). We found that phenologies of caterpillars and great tits, but not of blue tits, have advanced during the past decades. Phenologies correlated with spring temperatures, which may function as cues about the timing of the food peak for great and blue tits. The breeding of great and blue tits and their caterpillar food remained synchronous. Synchrony explained breeding success better than timing of breeding alone. However, the synchrony effect arose only in certain conditions, such as with high caterpillar abundances or high breeding densities. Breeding before good synchrony seems advantageous at high latitudes, especially in the willow tit. Thus, the temporal mismatch hypothesis appears insufficient to explain the evolution of timing of breeding.

  7. Application of Passivity-Based Control and Time-Frequency Representation in a Doubly Fed Induction Generator System

    Directory of Open Access Journals (Sweden)

    Yingpei Liu

    2015-01-01

Full Text Available In order to improve the performance of a doubly fed induction generator (DFIG) system, we put forward a high-performance nonlinear passivity-based control (PBC) method for the DFIG. First, we build a PBC mathematical model of the DFIG and design the passive controller for the inner loop of the control system based on passivity theory. We then calculate the rotor control voltages, which are modulated into pulses to control the rotor-side converter. Maximal wind energy capture is effectively realized; the rotor speed and the DFIG currents rapidly track their expected values, and independent regulation of the stator active and reactive power is achieved. Finally, we perform simulations to verify the effectiveness of the proposed method. Furthermore, we employ the Wigner-Ville distribution (WVD) and the continuous wavelet transform (CWT) as two time-frequency representation methods to show that the proposed method performs well from the perspective of energy distribution in the time and frequency domains.

  8. Considerations and Optimization of Time-Resolved PIV Measurements near Complex Wind-Generated Air-Water Wave Interface

    Science.gov (United States)

    Stegmeir, Matthew; Markfort, Corey

    2017-11-01

Time-resolved PIV measurements are applied on both sides of the air-water interface in order to study the coupling between air and fluid motion. The multi-scale and three-dimensional nature of the wave structure poses several unique challenges to generating optimal-quality data very near the fluid interface. High resolution and dynamic range in space and time are required to resolve the relevant flow scales along a complex and ever-changing interface, and characterizing the two-way coupling across the air-water interface provides unique challenges for optical measurement techniques. Approaches to obtaining near-boundary measurements on both sides of the interface are discussed, including optimal flow seeding procedures, illumination, data analysis, and interface tracking. The techniques are applied to the IIHR Boundary-Layer Wind-Wave Tunnel and example results are presented for both sides of the interface. The facility combines a 30 m long recirculating water channel with an open-return boundary layer wind tunnel, allowing for the study of boundary layer turbulence interacting with a wind-driven wave field.

  9. Knowledge dimensions in hypothesis test problems

    Science.gov (United States)

    Krishnan, Saras; Idris, Noraini

    2012-05-01

The reformation in statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on formulas and calculation procedures, whereas conceptual understanding emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework to describe learning objectives, comprising the six revised cognition levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from more connected understanding. This study identifies the factual, procedural and conceptual knowledge dimensions in hypothesis test problems. Hypothesis testing, being an important tool for making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty in understanding the underlying concepts of hypothesis testing. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale for executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of the underlying inferential concepts, such as the sampling distribution and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems in this study, suitable instructional and assessment strategies can be developed in future to enhance students' learning of hypothesis testing as a valuable inferential tool.

  10. Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment

    Science.gov (United States)

    Frane, Andrew V.

    2015-01-01

    Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…
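A common adjustment of the kind described is Holm's step-down procedure, which controls the family-wise error rate and is uniformly more powerful than plain Bonferroni. A minimal sketch (the p-values are invented):

```python
def holm_bonferroni(pvalues, alpha=0.05):
    """Holm's step-down procedure: compare the smallest p-value against
    alpha/m, the next against alpha/(m-1), and so on; stop at the first
    failure. Controls the family-wise error rate."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if pvalues[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # all remaining (larger) p-values fail as well
    return reject

pvals = [0.010, 0.040, 0.030, 0.005]
decisions = holm_bonferroni(pvals)  # rejects the 1st and 4th hypotheses
```

Note that plain Bonferroni (comparing every p-value against alpha/m = 0.0125) would reject only the fourth hypothesis here, illustrating Holm's extra power.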

  11. Interactive comparison of hypothesis tests for statistical model checking

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; Reijsbergen, D.P.; Scheinhardt, Willem R.W.

    2015-01-01

    We present a web-based interactive comparison of hypothesis tests as are used in statistical model checking, providing users and tool developers with more insight into their characteristics. Parameters can be modified easily and their influence is visualized in real time; an integrated simulation

  12. Picture-Perfect Is Not Perfect for Metamemory: Testing the Perceptual Fluency Hypothesis with Degraded Images

    Science.gov (United States)

    Besken, Miri

    2016-01-01

    The perceptual fluency hypothesis claims that items that are easy to perceive at encoding induce an illusion that they will be easier to remember, despite the finding that perception does not generally affect recall. The current set of studies tested the predictions of the perceptual fluency hypothesis with a picture generation manipulation.…

  13. The Matter-Gravity Entanglement Hypothesis

    Science.gov (United States)

    Kay, Bernard S.

    2018-03-01

I outline some of my work and results (some dating back to 1998, some more recent) on my matter-gravity entanglement hypothesis, according to which the entropy of a closed quantum gravitational system is equal to the system's matter-gravity entanglement entropy. The main arguments presented are: (1) that this hypothesis is capable of resolving what I call the second-law puzzle, i.e. the puzzle as to how the entropy increase of a closed system can be reconciled with the assumption of unitary time-evolution; (2) that the black hole information loss puzzle may be regarded as a special case of this second law puzzle and that therefore the same resolution applies to it; (3) that the black hole thermal atmosphere puzzle (which I recall) can be resolved by adopting a radically different-from-usual description of quantum black hole equilibrium states, according to which they are total pure states, entangled between matter and gravity in such a way that the partial states of matter and gravity are each approximately thermal equilibrium states (at the Hawking temperature); (4) that the Susskind-Horowitz-Polchinski string-theoretic understanding of black hole entropy as the logarithm of the degeneracy of a long string (which is the weak string coupling limit of a black hole) cannot be quite correct but should be replaced by a modified understanding according to which it is the entanglement entropy between a long string and its stringy atmosphere, when in a total pure equilibrium state in a suitable box, which (in line with (3)) goes over, at strong-coupling, to a black hole in equilibrium with its thermal atmosphere. The modified understanding in (4) is based on a general result, which I also describe, which concerns the likely state of a quantum system when it is weakly coupled to an energy-bath and the total state is a random pure state with a given energy. 
This result generalizes Goldstein et al.'s `canonical typicality' result to systems which are not necessarily small.

  14. Plant traits correlated with generation time directly affect inbreeding depression and mating system and indirectly genetic structure

    Directory of Open Access Journals (Sweden)

    Hardy Olivier J

    2009-07-01

    differences in stature, as proposed earlier, but rather to differences in generation time. Conclusion Plant traits correlated with generation time affect both inbreeding depression and mating system. These in turn modify genetic drift and gene flow and ultimately genetic structure.

  15. Unicorns do exist: a tutorial on "proving" the null hypothesis.

    Science.gov (United States)

    Streiner, David L

    2003-12-01

    Introductory statistics classes teach us that we can never prove the null hypothesis; all we can do is reject or fail to reject it. However, there are times when it is necessary to try to prove the nonexistence of a difference between groups. This most often happens within the context of comparing a new treatment against an established one and showing that the new intervention is not inferior to the standard. This article first outlines the logic of "noninferiority" testing by differentiating between the null hypothesis (that which we are trying to nullify) and the "nil" hypothesis (there is no difference), reversing the role of the null and alternate hypotheses, and defining an interval within which groups are said to be equivalent. We then work through an example and show how to calculate sample sizes for noninferiority studies.
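    The noninferiority logic described above translates into a standard normal-approximation sample-size calculation. The helper below is an illustrative sketch (function name and defaults are my own, not from the article):

```python
import math
from statistics import NormalDist

def noninferiority_n(sigma, margin, alpha=0.025, power=0.80, true_diff=0.0):
    """Per-group sample size for a noninferiority comparison of two means
    (normal approximation; illustrative helper, not from the article).

    Null: the new treatment is worse than the standard by `margin` or more.
    We size the study to reject that null with the requested power when the
    true difference is `true_diff` (usually 0), at one-sided `alpha`.
    """
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(power)
    n = 2 * sigma**2 * (z_a + z_b) ** 2 / (margin - true_diff) ** 2
    return math.ceil(n)

# Example: outcome SD of 10, noninferiority margin of 5 points.
print(noninferiority_n(sigma=10, margin=5))  # → 63 per group
```

Note how the roles reverse: rejecting this null "proves", within the pre-specified margin, that the new treatment is not inferior, which is as close to proving the nil hypothesis as frequentist testing gets.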

  16. Sexual differentiation and the neuroendocrine hypothesis of autism.

    Science.gov (United States)

    Aiello, Timothy P; Whitaker-Azmitia, Patricia M

    2011-10-01

    The phenotypic expression of autism spectrum disorders varies widely in severity and characteristics and it is, therefore, likely that a number of etiological factors are involved. However, one finding which has been found consistently is that there is a greater incidence of autism in boys than girls. Recently, attention has been given to the extreme male hypothesis: that autism behaviors are an extreme form of typical male behaviors, including lack of empathy and language deficits but an increase in so-called systemizing behaviors, such as attention to detail and collecting. This points to the possibility that an alteration during sexual differentiation of the brain may occur in autism. During sexual differentiation of the brain, two brain regions are highly sexually dimorphic: the amygdala and the hypothalamus. Both of these regions are also implicated in the neuroendocrine hypothesis of autism, wherein a balance between oxytocin and cortisol may contribute to the disorder. We are thus proposing that the extreme male hypothesis and the neuroendocrine hypothesis are in fact compatible, in that sexual differentiation of the brain towards an extreme male phenotype would result in the neuroendocrine changes proposed in autism. We have preliminary data, from treating developing rat pups with the differentiating hormone 17-β estradiol during a critical period and observing changes in social behaviors and oxytocin, to support this hypothesis. Further studies should be undertaken to confirm the role of extremes of normal sexual differentiation in producing the neuroendocrine changes associated with autism. Copyright © 2011 Wiley-Liss, Inc.

  17. Implications of the Bohm-Aharonov hypothesis

    International Nuclear Information System (INIS)

    Ghirardi, G.C.; Rimini, A.; Weber, T.

    1976-01-01

    It is proved that the Bohm-Aharonov hypothesis concerning largely separated subsystems of composite quantum systems implies that it is impossible to express the dynamical evolution in terms of the density operator.

  18. Backcasting long-term climate data: evaluation of hypothesis

    Science.gov (United States)

    Saghafian, Bahram; Aghbalaghi, Sara Ghasemi; Nasseri, Mohsen

    2018-05-01

    More often than not, incomplete datasets or short-term recorded data in vast regions impede reliable climate and water studies. Various methods, such as simple correlation with stations having long-term time series, are practiced to infill or extend the period of observation at stations with missing or short-term data. In the current paper and for the first time, the hypothesis on the feasibility of extending the downscaling concept to backcast local observation records using large-scale atmospheric predictors is examined. Backcasting is coined here to contrast forecasting/projection; the former refers to reconstructing the past, while the latter represents projection into the future. To assess our hypothesis, daily and monthly statistical downscaling models were employed to reconstruct past precipitation data and lengthen the data period. Urmia and Tabriz synoptic stations, located in northwestern Iran, constituted the two case study stations. The SDSM and data-mining downscaling model (DMDM) daily downscaling models, as well as the group method of data handling (GMDH) and model tree (Mp5) monthly downscaling models, were trained with National Center for Environmental Prediction (NCEP) data. After training, the reconstructed precipitation data of the past were validated against observed data. Then, the data were fully extended to the 1948 to 2009 period corresponding to the available NCEP data period. The results showed that DMDM performed best in generating monthly average precipitation compared with the SDSM, Mp5, and GMDH models, although none of the models could preserve the monthly variance. This overall confirms the practical value of the proposed approach in extending past historic data, particularly for long-term climatological and water budget studies.
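    The core backcasting idea, fit a statistical relation from large-scale predictors to the local record on the observed period, then apply it to the earlier predictor years, can be sketched on synthetic data (illustrative only; the paper's SDSM, DMDM, GMDH and Mp5 models are far more elaborate):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for large-scale predictors (e.g. NCEP fields),
# available 1948-2009; the local rain gauge is "observed" only from 1980 on.
years = np.arange(1948, 2010)
X = rng.normal(size=(years.size, 3))           # 3 atmospheric predictors
true_w = np.array([2.0, -1.0, 0.5])
precip = X @ true_w + 5.0 + rng.normal(scale=0.3, size=years.size)

obs = years >= 1980                            # observed (training) period

# Fit the downscaling relation on the observed period only.
A = np.column_stack([X[obs], np.ones(obs.sum())])
w, *_ = np.linalg.lstsq(A, precip[obs], rcond=None)

# "Backcast": apply the fitted relation to the pre-1980 predictors.
A_past = np.column_stack([X[~obs], np.ones((~obs).sum())])
backcast = A_past @ w

rmse = np.sqrt(np.mean((backcast - precip[~obs]) ** 2))
print(rmse)  # small here, since the synthetic relation is nearly linear
```

In the paper the same validation logic is applied with held-out observed years instead of synthetic truth, since the real pre-record precipitation is unknown.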

  19. Backcasting long-term climate data: evaluation of hypothesis

    Science.gov (United States)

    Saghafian, Bahram; Aghbalaghi, Sara Ghasemi; Nasseri, Mohsen

    2017-04-01

    More often than not, incomplete datasets or short-term recorded data in vast regions impede reliable climate and water studies. Various methods, such as simple correlation with stations having long-term time series, are practiced to infill or extend the period of observation at stations with missing or short-term data. In the current paper and for the first time, the hypothesis on the feasibility of extending the downscaling concept to backcast local observation records using large-scale atmospheric predictors is examined. Backcasting is coined here to contrast forecasting/projection; the former refers to reconstructing the past, while the latter represents projection into the future. To assess our hypothesis, daily and monthly statistical downscaling models were employed to reconstruct past precipitation data and lengthen the data period. Urmia and Tabriz synoptic stations, located in northwestern Iran, constituted the two case study stations. The SDSM and data-mining downscaling model (DMDM) daily downscaling models, as well as the group method of data handling (GMDH) and model tree (Mp5) monthly downscaling models, were trained with National Center for Environmental Prediction (NCEP) data. After training, the reconstructed precipitation data of the past were validated against observed data. Then, the data were fully extended to the 1948 to 2009 period corresponding to the available NCEP data period. The results showed that DMDM performed best in generating monthly average precipitation compared with the SDSM, Mp5, and GMDH models, although none of the models could preserve the monthly variance. This overall confirms the practical value of the proposed approach in extending past historic data, particularly for long-term climatological and water budget studies.

  20. Generation time and the stability of sex-determining alleles in oyster populations as deduced using a gene-based population dynamics model.

    Science.gov (United States)

    Powell, Eric N; Klinck, John M; Hofmann, Eileen E

    2011-02-21

    Crassostrea oysters are protandrous hermaphrodites. Sex is thought to be determined by a single gene with a dominant male allele M and a recessive protandrous allele F, such that FF animals are protandrous and MF animals are permanent males. We investigate the possibility that a reduction in generation time, brought about for example by disease, might jeopardize retention of the M allele. Simulations show that MF males have a significantly lessened lifetime fecundity when generation time declines. The allele frequency of the M allele declines and eventually the M allele is lost. The probability of loss is modulated by population abundance. As abundance increases, the probability of M allele loss declines. Simulations suggest that stabilization of the female-to-male ratio when generation time is long is the dominant function of the M allele. As generation time shortens, the raison d'être for the M allele also fades as mortality usurps the stabilizing role. Disease and exploitation have shortened oyster generation time: one consequence may be to jeopardize retention of the M allele. Two alternative genetic bases for protandry also provide stable sex ratios when generation time is long; an F-dominant protandric allele and protandry restricted to the MF heterozygote. In both cases, simulations show that FF individuals become rare in the population at high abundance and/or long generation time. Protandry restricted to the MF heterozygote maintains sex ratio stability over a wider range of generation times and abundances than the alternatives, suggesting that sex determination based on a male-dominant allele (MM/MF) may not be the optimal solution to the genetic basis for protandry in Crassostrea. Copyright © 2010 Elsevier Ltd. All rights reserved.
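    A drastically simplified toy model (mine, not the authors' gene-based population dynamics model) illustrates the qualitative claim: if gametes carrying the M allele realize lower relative lifetime fecundity when generation time shortens, selection plus drift carries the M allele toward loss:

```python
import numpy as np

def m_allele_trajectory(p0=0.2, rel_fecundity=0.8, N=500, gens=200, seed=2):
    """Toy Wright-Fisher sketch (not the authors' model): M-bearing
    gametes contribute with relative lifetime fecundity < 1, standing in
    for the reduced fecundity of MF males under short generation times;
    binomial sampling in a population of N diploids adds drift."""
    rng = np.random.default_rng(seed)
    p = p0
    traj = [p]
    for _ in range(gens):
        # selection against M-bearing gametes, then binomial drift
        p_sel = p * rel_fecundity / (p * rel_fecundity + (1 - p))
        p = rng.binomial(2 * N, p_sel) / (2 * N)
        traj.append(p)
    return traj

traj = m_allele_trajectory()
print(traj[0], traj[-1])  # the frequency of M declines toward loss
```

The paper's point about abundance also shows up here: raising N weakens drift, so loss takes longer, while a rel_fecundity of 1 (long generation time) leaves M to wander neutrally instead of being purged.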

  1. Sleep memory processing: the sequential hypothesis

    OpenAIRE

    Giuditta, Antonio

    2014-01-01

    According to the sequential hypothesis (SH) memories acquired during wakefulness are processed during sleep in two serial steps respectively occurring during slow wave sleep (SWS) and rapid eye movement (REM) sleep. During SWS memories to be retained are distinguished from irrelevant or competing traces that undergo downgrading or elimination. Processed memories are stored again during REM sleep which integrates them with preexisting memories. The hypothesis received support from a wealth of ...

  2. Studies of a Next-Generation Silicon-Photomultiplier-Based Time-of-Flight PET/CT System.

    Science.gov (United States)

    Hsu, David F C; Ilan, Ezgi; Peterson, William T; Uribe, Jorge; Lubberink, Mark; Levin, Craig S

    2017-09-01

    This article presents system performance studies for the Discovery MI PET/CT system, a new time-of-flight system based on silicon photomultipliers. System performance and clinical imaging were compared between this next-generation system and other commercially available PET/CT and PET/MR systems, as well as between different reconstruction algorithms. Methods: Spatial resolution, sensitivity, noise-equivalent counting rate, scatter fraction, counting rate accuracy, and image quality were characterized with the National Electrical Manufacturers Association NU-2 2012 standards. Energy resolution and coincidence time resolution were measured. Tests were conducted independently on two Discovery MI scanners installed at Stanford University and Uppsala University, and the results were averaged. Back-to-back patient scans were also performed between the Discovery MI, Discovery 690 PET/CT, and SIGNA PET/MR systems. Clinical images were reconstructed using both ordered-subset expectation maximization and Q.Clear (block-sequential regularized expectation maximization with point-spread function modeling) and were examined qualitatively. Results: The averaged full widths at half maximum (FWHMs) of the radial/tangential/axial spatial resolution reconstructed with filtered backprojection at 1, 10, and 20 cm from the system center were, respectively, 4.10/4.19/4.48 mm, 5.47/4.49/6.01 mm, and 7.53/4.90/6.10 mm. The averaged sensitivity was 13.7 cps/kBq at the center of the field of view. The averaged peak noise-equivalent counting rate was 193.4 kcps at 21.9 kBq/mL, with a scatter fraction of 40.6%. The averaged contrast recovery coefficients for the image-quality phantom were 53.7, 64.0, 73.1, 82.7, 86.8, and 90.7 for the 10-, 13-, 17-, 22-, 28-, and 37-mm-diameter spheres, respectively. The average photopeak energy resolution was 9.40% FWHM, and the average coincidence time resolution was 375.4 ps FWHM. 
Clinical image comparisons between the PET/CT systems demonstrated the high

  3. Real-time and encryption efficiency improvements of simultaneous fusion, compression and encryption method based on chaotic generators

    Science.gov (United States)

    Jridi, Maher; Alfalou, Ayman

    2018-03-01

    In this paper, enhancement of an existing optical simultaneous fusion, compression and encryption (SFCE) scheme in terms of real-time requirements, bandwidth occupation and encryption robustness is proposed. We have used an approximate form of the DCT to decrease the computational resources. Then, a novel chaos-based encryption algorithm is introduced in order to achieve the confusion and diffusion effects. In the confusion phase, the Henon map is used for row and column permutations, where the initial condition is related to the original image. Furthermore, the Skew Tent map is employed to generate another random matrix in order to carry out pixel scrambling. Finally, an adaptation of a classical diffusion process scheme is employed to strengthen the security of the cryptosystem against statistical, differential, and chosen plaintext attacks. Analyses of key space, histogram, adjacent pixel correlation, sensitivity, and encryption speed of the encryption scheme are provided, and favorably compared to those of the existing crypto-compression system. The proposed method has been found to be digital/optical implementation-friendly, which facilitates the integration of the crypto-compression system in a very broad range of scenarios.
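    The confusion step, turning chaotic sequences into row and column permutations, can be sketched as follows. This is a generic way of using Henon-map output for permutation, not necessarily the paper's exact keying scheme (there the initial condition is derived from the image itself):

```python
import numpy as np

def henon_sequence(n, x0=0.1, y0=0.3, a=1.4, b=0.3):
    """Iterates of the classic Henon map; (x0, y0) acts as the secret key."""
    xs = np.empty(n)
    x, y = x0, y0
    for i in range(n):
        x, y = 1 - a * x * x + y, b * x
        xs[i] = x
    return xs

def permute(img, key=(0.1, 0.3)):
    """Confusion step: sort chaotic values into row/column permutations."""
    h, w = img.shape
    seq = henon_sequence(h + w, *key)
    rows, cols = np.argsort(seq[:h]), np.argsort(seq[h:])
    return img[rows][:, cols]

def unpermute(img, key=(0.1, 0.3)):
    """Invert the permutations (requires the same key)."""
    h, w = img.shape
    seq = henon_sequence(h + w, *key)
    rows, cols = np.argsort(seq[:h]), np.argsort(seq[h:])
    return img[np.argsort(rows)][:, np.argsort(cols)]

img = np.arange(64).reshape(8, 8)
print(np.array_equal(unpermute(permute(img)), img))  # → True
```

Permutation alone only relocates pixel values; that is why the scheme follows it with Skew Tent scrambling and a diffusion pass, which change the values themselves.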

  4. Rapid generation of volatile fatty acids (VFA) through anaerobic acidification of livestock organic waste at low hydraulic residence time (HRT).

    Science.gov (United States)

    Kuruti, Kranti; Nakkasunchi, Shalini; Begum, Sameena; Juntupally, Sudharshan; Arelli, Vijayalakshmi; Anupoju, Gangagni Rao

    2017-08-01

    The purpose of this study was to investigate the effect of pre-treatment and F/M (Food to Microorganism) ratios for the rapid generation of volatile fatty acids (VFA) from livestock organic wastes (cattle manure (CM) and poultry litter (PL)) through an anaerobic acidification process at a pH range of 4.5-5.5. Experiments were organized using CM and PL in batch reactors (1 L and 25 L) with and without pre-treatment of the substrate at F/M ratios of 0.4, 0.6, 0.8 and 1. Among the various existing pre-treatment methods, thermal-acidic (120 °C; 1% H2SO4) pre-treatment was found effective. The results revealed that 0.31 and 0.47 kg VFA/(kg VS reduced) could be obtained from CM and PL respectively with no pre-treatment, whereas these improved to 0.43 and 0.67 kg VFA/(kg VS reduced) correspondingly due to pre-treatment. The best VFA yield was obtained at an F/M ratio of 1.0, pH 5.5 and a hydraulic residence time of 4 days. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. DInSAR time series generation within a cloud computing environment: from ERS to Sentinel-1 scenario

    Science.gov (United States)

    Casu, Francesco; Elefante, Stefano; Imperatore, Pasquale; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana; Mathot, Emmanuel; Brito, Fabrice; Farres, Jordi; Lengert, Wolfgang

    2013-04-01

    One of the techniques that will strongly benefit from the advent of the Sentinel-1 system is Differential SAR Interferometry (DInSAR), which has successfully demonstrated to be an effective tool to detect and monitor ground displacements with centimetre accuracy. The geoscience communities (volcanology, seismicity, …), as well as those related to hazard monitoring and risk mitigation, make extensive use of the DInSAR technique and will take advantage of the huge amount of SAR data acquired by Sentinel-1. Indeed, such information will permit the generation of Earth's surface displacement maps and time series both over large areas and over long time spans. However, the issue of managing, processing and analysing the large Sentinel data stream is envisaged by the scientific community to be a major bottleneck, particularly during crisis phases. The emerging need to create a common ecosystem in which data, results and processing tools are shared is envisaged to be a successful way to address this problem and to contribute to the spreading of information and knowledge. The Supersites initiative as well as the ESA SuperSites Exploitation Platform (SSEP) and the ESA Cloud Computing Operational Pilot (CIOP) projects provide effective answers to this need and are pushing towards the development of such an ecosystem. It is clear that all the current tools for querying, processing and analysing SAR data need not only to be updated to manage the large data stream of the Sentinel-1 satellite, but also to be reorganized to reply quickly to simultaneous and highly demanding user requests, mainly during emergency situations. This translates into the automatic and unsupervised processing of large amounts of data as well as the availability of scalable, widely accessible and high performance computing capabilities. The cloud computing environment permits the achievement of all of these objectives, particularly in case of spike and peak

  6. A Novel Generation Method for the PV Power Time Series Combining the Decomposition Technique and Markov Chain Theory

    DEFF Research Database (Denmark)

    Xu, Shenzhi; Ai, Xiaomeng; Fang, Jiakun

    2017-01-01

    Photovoltaic (PV) power generation has made considerable developments in recent years, but the intermittency and volatility of its output have seriously affected the secure operation of the power system. In order to better understand PV generation and provide sufficient data support for analysis, a novel generation method for the PV power time series, combining the decomposition technique and Markov chain theory, is proposed, which can simulate the basic statistical, distribution and fluctuation characteristics of the measured series.
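    The Markov-chain stage of such a generator can be sketched as below; the paper additionally applies a decomposition technique before fitting, which this illustration omits:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "measured" PV output, quantized into K discrete power states.
K = 5
measured = rng.integers(0, K, size=2000)

# Estimate the first-order Markov transition matrix from the data.
T = np.zeros((K, K))
for a, b in zip(measured[:-1], measured[1:]):
    T[a, b] += 1
T /= T.sum(axis=1, keepdims=True)  # each row is a conditional distribution

# Generate a synthetic series by sampling the fitted chain.
synth = np.empty(1000, dtype=int)
synth[0] = measured[0]
for t in range(1, synth.size):
    synth[t] = rng.choice(K, p=T[synth[t - 1]])
```

Because the chain is fitted to observed transitions, the synthetic series reproduces the state distribution and step-to-step fluctuation statistics of the measured series, which is exactly the property the paper evaluates.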

  7. A Unifying Hypothesis for Familial and Sporadic Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Carole J. Proctor

    2012-01-01

    Alzheimer's disease (AD) is characterised by the aggregation of two quite different proteins, namely, amyloid-beta (Aβ), which forms extracellular plaques, and tau, the main component of cytoplasmic neurofibrillary tangles. The amyloid hypothesis proposes that Aβ plaques precede tangle formation, but there is still much controversy concerning the order of events, and the linkage between Aβ and tau alterations is still unknown. Mathematical modelling has become an essential tool for generating and evaluating hypotheses involving complex systems. We have therefore used this approach to discover the most probable pathway linking Aβ and tau. The model supports a complex pathway linking Aβ and tau via GSK3β, p53, and oxidative stress. Importantly, the pathway contains a cycle with multiple points of entry. It is this property of the pathway which enables the model to be consistent with both the amyloid hypothesis for familial AD and a more complex pathway for sporadic forms.

  8. A test of the domain-specific acculturation strategy hypothesis.

    Science.gov (United States)

    Miller, Matthew J; Yang, Minji; Lim, Robert H; Hui, Kayi; Choi, Na-Yeun; Fan, Xiaoyan; Lin, Li-Ling; Grome, Rebekah E; Farrell, Jerome A; Blackmon, Sha'kema

    2013-01-01

    Acculturation literature has evolved over the past several decades and has highlighted the dynamic ways in which individuals negotiate experiences in multiple cultural contexts. The present study extends this literature by testing M. J. Miller and R. H. Lim's (2010) domain-specific acculturation strategy hypothesis, namely, that individuals might use different acculturation strategies (i.e., assimilated, bicultural, separated, and marginalized strategies; J. W. Berry, 2003) across behavioral and values domains, in 3 independent cluster analyses with Asian American participants. Present findings supported the domain-specific acculturation strategy hypothesis, as 67% to 72% of participants from the 3 independent samples used different strategies across behavioral and values domains. Consistent with theory, a number of acculturation strategy cluster group differences emerged across generational status, acculturative stress, mental health symptoms, and attitudes toward seeking professional psychological help. Study limitations and future directions for research are discussed.

  9. In vivo time-lapse imaging of skin burn wound healing using second-harmonic generation microscopy

    Science.gov (United States)

    Yasui, Takeshi; Tanaka, Ryosuke; Hase, Eiji; Fukushima, Shu-ichiro; Araki, Tsutomu

    2014-02-01

    Wound healing is a process that repairs tissue damaged by a thermal burn, an incised wound, or a stab wound. Although wound healing has many aspects, the dynamics of collagen fibers, such as decomposition, production, and growth, are commonly closely related to it. If the healing process can be visualized as time-lapse images of the collagen fibers in the same subject, one may obtain new findings regarding the biological repair mechanisms at work. In this article, to investigate the temporal modification of dermal collagen fibers during burn wound healing, we used second-harmonic-generation (SHG) microscopy, which offers high selectivity and good image contrast for collagen molecules as well as high spatial resolution, optical three-dimensional sectioning, minimal invasiveness, deep penetration, the absence of interference from background light, and in vivo measurement without additional staining. Since SHG light arises from the non-centrosymmetric triple helix of three polypeptide chains in the collagen molecule, SHG intensity sensitively reflects the structural maturity of the collagen molecule and its aggregates. A series of time-lapse SHG images over the 2-week wound healing process clearly showed the condensation and melting of dermal collagen fibers caused by the deep dermal burn, the decomposition of the damaged collagen fibers in the inflammation phase, the production of new collagen fibers in the proliferation phase, and the growth of the new collagen fibers in the remodeling phase. These results show the high potential of SHG microscopy for optical assessment of the wound healing process in vivo.

  10. Technical note: A new day- and night-time Meteosat Second Generation Cirrus Detection Algorithm MeCiDA

    Directory of Open Access Journals (Sweden)

    W. Krebs

    2007-12-01

    A new cirrus detection algorithm for the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) aboard the geostationary Meteosat Second Generation (MSG), MeCiDA, is presented. The algorithm uses the seven infrared channels of SEVIRI and thus provides a consistent scheme for cirrus detection at day and night. MeCiDA combines morphological and multi-spectral threshold tests and detects optically thick and thin ice clouds. The thresholds were determined by a comprehensive theoretical study using radiative transfer simulations for various atmospheric situations as well as by manually evaluating actual satellite observations. The cirrus detection has been optimized for mid- and high latitudes but it could be adapted to other regions as well. The retrieved cirrus masks have been validated by comparison with the Moderate Resolution Imaging Spectroradiometer (MODIS) Cirrus Reflection Flag. To study possible seasonal variations in the performance of the algorithm, one scene per month of the year 2004 was randomly selected and compared with the MODIS flag. 81% of the pixels were classified identically by both algorithms. In a comparison of monthly mean values for Europe and the North-Atlantic MeCiDA detected 29.3% cirrus coverage, while the MODIS SWIR cirrus coverage was 38.1%. A lower detection efficiency is to be expected for MeCiDA, as the spatial resolution of MODIS is considerably better and as we used only the thermal infrared channels in contrast to the MODIS algorithm which uses infrared and visible radiances. The advantage of MeCiDA compared to retrievals for polar orbiting instruments or previous geostationary satellites is that it permits the derivation of quantitative data every 15 min, 24 h a day. This high temporal resolution allows the study of diurnal variations and life cycle aspects. MeCiDA is fast enough for near real-time applications.

  11. Technical note: A new day- and night-time Meteosat Second Generation Cirrus Detection Algorithm MeCiDA

    Science.gov (United States)

    Krebs, W.; Mannstein, H.; Bugliaro, L.; Mayer, B.

    2007-12-01

    A new cirrus detection algorithm for the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) aboard the geostationary Meteosat Second Generation (MSG), MeCiDA, is presented. The algorithm uses the seven infrared channels of SEVIRI and thus provides a consistent scheme for cirrus detection at day and night. MeCiDA combines morphological and multi-spectral threshold tests and detects optically thick and thin ice clouds. The thresholds were determined by a comprehensive theoretical study using radiative transfer simulations for various atmospheric situations as well as by manually evaluating actual satellite observations. The cirrus detection has been optimized for mid- and high latitudes but it could be adapted to other regions as well. The retrieved cirrus masks have been validated by comparison with the Moderate Resolution Imaging Spectroradiometer (MODIS) Cirrus Reflection Flag. To study possible seasonal variations in the performance of the algorithm, one scene per month of the year 2004 was randomly selected and compared with the MODIS flag. 81% of the pixels were classified identically by both algorithms. In a comparison of monthly mean values for Europe and the North-Atlantic MeCiDA detected 29.3% cirrus coverage, while the MODIS SWIR cirrus coverage was 38.1%. A lower detection efficiency is to be expected for MeCiDA, as the spatial resolution of MODIS is considerably better and as we used only the thermal infrared channels in contrast to the MODIS algorithm which uses infrared and visible radiances. The advantage of MeCiDA compared to retrievals for polar orbiting instruments or previous geostationary satellites is that it permits the derivation of quantitative data every 15 min, 24 h a day. This high temporal resolution allows the study of diurnal variations and life cycle aspects. MeCiDA is fast enough for near real-time applications.
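    A multi-spectral threshold test of this kind can be sketched with synthetic brightness temperatures. The channel differences below follow the usual split-window reasoning for thin cirrus, but the threshold values are illustrative placeholders, not MeCiDA's calibrated values:

```python
import numpy as np

def cirrus_mask(bt087, bt108, bt120):
    """Multi-spectral threshold sketch in the spirit of MeCiDA
    (illustrative thresholds, not the algorithm's calibrated values).
    Thin cirrus raises the 10.8-12.0 um split-window difference and
    lowers the 10.8 um brightness temperature."""
    test1 = (bt108 - bt120) > 2.0   # split-window difference [K]
    test2 = (bt087 - bt108) > -1.0  # ice-phase indicator
    test3 = bt108 < 260.0           # cold cloud top [K]
    return test1 & test2 & test3

# Synthetic brightness temperatures: one clear pixel, one cirrus-like pixel.
bt087 = np.array([278.0, 252.0])
bt108 = np.array([280.0, 250.0])
bt120 = np.array([279.5, 246.5])
print(cirrus_mask(bt087, bt108, bt120))  # → [False  True]
```

MeCiDA additionally applies morphological tests to the resulting binary mask, which this per-pixel sketch omits.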

  12. Aminoglycoside antibiotics and autism: a speculative hypothesis

    Directory of Open Access Journals (Sweden)

    Manev Hari

    2001-10-01

    Background: Recently, it has been suspected that there is a relationship between therapy with some antibiotics and the onset of autism; but even more curious, some children benefited transiently from a subsequent treatment with a different antibiotic. Here, we speculate how aminoglycoside antibiotics might be associated with autism. Presentation: We hypothesize that aminoglycoside antibiotics could (a) trigger the autism syndrome in susceptible infants by causing stop codon readthrough, i.e., a misreading of the genetic code of a hypothetical critical gene, and/or (b) improve autism symptoms by correcting a premature stop codon mutation in a hypothetical polymorphic gene linked to autism. Testing: Investigate, retrospectively, whether a link exists between aminoglycoside use (which is not extensive in children) and the onset of autism symptoms (hypothesis "a"), or between aminoglycoside use and improvement of these symptoms (hypothesis "b"). Whereas a prospective study to test hypothesis "a" is not ethically justifiable, a study could be designed to test hypothesis "b". Implications: It should be stressed that at this stage no direct evidence supports our speculative hypothesis and that its main purpose is to initiate development of new ideas that, eventually, would improve our understanding of the pathobiology of autism.

  13. Interacting effects of genetic variation for seed dormancy and flowering time on phenology, life history, and fitness of experimental Arabidopsis thaliana populations over multiple generations in the field.

    Science.gov (United States)

    Taylor, Mark A; Cooper, Martha D; Sellamuthu, Reena; Braun, Peter; Migneault, Andrew; Browning, Alyssa; Perry, Emily; Schmitt, Johanna

    2017-10-01

    Major alleles for seed dormancy and flowering time are well studied, and can interact to influence seasonal timing and fitness within generations. However, little is known about how this interaction controls phenology, life history, and population fitness across multiple generations in natural seasonal environments. To examine how seed dormancy and flowering time shape annual plant life cycles over multiple generations, we established naturally dispersing populations of recombinant inbred lines of Arabidopsis thaliana segregating early and late alleles for seed dormancy and flowering time in a field experiment. We recorded seasonal phenology and fitness of each genotype over 2 yr and several generations. Strong seed dormancy suppressed mid-summer germination in both early- and late-flowering genetic backgrounds. Strong dormancy and late-flowering genotypes were both necessary to confer a winter annual life history; other genotypes were rapid-cycling. Strong dormancy increased within-season fecundity in an early-flowering background, but decreased it in a late-flowering background. However, there were no detectable differences among genotypes in population growth rates. Seasonal phenology, life history, and cohort fitness over multiple generations depend strongly upon interacting genetic variation for dormancy and flowering. However, similar population growth rates across generations suggest that different life cycle genotypes can coexist in natural populations. © 2017 The Authors. New Phytologist © 2017 New Phytologist Trust.

  14. Reassessing the Trade-off Hypothesis

    DEFF Research Database (Denmark)

    Rosas, Guillermo; Manzetti, Luigi

    2015-01-01

    Do economic conditions drive voters to punish politicians that tolerate corruption? Previous scholarly work contends that citizens in young democracies support corrupt governments that are capable of promoting good economic outcomes, the so-called trade-off hypothesis. We test this hypothesis based on mass surveys in eighteen Latin American countries throughout 2004–2012. We find that citizens that report bribe attempts from bureaucrats are always more likely to report presidential disapproval than citizens that report no such attempts; that is, Latin American victims of corruption are not duped by good economic performance. However, we find some evidence for a weaker form of the trade-off hypothesis: presidential disapproval among corruption victims might be more pronounced in contexts of high inflation and high unemployment.

  15. Multi-hypothesis distributed stereo video coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Zamarin, Marco; Forchhammer, Søren

    2013-01-01

    for stereo sequences, exploiting an interpolated intra-view SI and two inter-view SIs. The quality of the SI has a major impact on the DVC Rate-Distortion (RD) performance. As the inter-view SIs individually present lower RD performance compared with the intra-view SI, we propose multi-hypothesis decoding for robust fusion and improved performance. Compared with a state-of-the-art single side information solution, the proposed DVC decoder improves the RD performance for all the chosen test sequences by up to 0.8 dB. The proposed multi-hypothesis decoder showed higher robustness compared with other fusion...

  16. Ready for Retirement: The Gateway Drug Hypothesis.

    Science.gov (United States)

    Kleinig, John

    2015-01-01

    The psycho-social observation that the use of some psychoactive substances ("drugs") is often followed by the use of other and more problematic drugs has given rise to a cluster of so-called "gateway drug hypotheses," and such hypotheses have often played an important role in developing drug use policy. The current essay suggests that drug use policies that have drawn on versions of the hypothesis have involved an unjustified oversimplification of the dynamics of drug use, reflecting the interests of certain stakeholders rather than wise social policy. The hypothesis should be retired.

  17. Alertness and Cognitive Control: Testing the Early Onset Hypothesis.

    Science.gov (United States)

    Schneider, Darryl W

    2017-11-20

    Previous research has revealed a peculiar interaction between alertness and cognitive control in selective-attention tasks: Congruency effects are larger on alert trials (on which an alerting cue is presented briefly in advance of the imperative stimulus) than on no-alert trials, despite shorter response times (RTs) on alert trials. One explanation for this finding is the early onset hypothesis, which is based on the assumptions that increased alertness shortens stimulus-encoding time and that cognitive control involves gradually focusing attention during a trial. The author tested the hypothesis in 3 experiments by manipulating alertness and stimulus quality (which were intended to shorten and lengthen stimulus-encoding time, respectively) in an arrow-based flanker task involving congruent and incongruent stimuli. Replicating past findings, the alerting manipulation led to shorter RTs but larger congruency effects on alert trials than on no-alert trials. The stimulus-quality manipulation led to longer RTs and larger congruency effects for degraded stimuli than for intact stimuli. These results provide mixed support for the early onset hypothesis, but the author discusses how data and theory might be reconciled if stimulus quality affects stimulus-encoding time and the rate of evidence accumulation in the decision process. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. Using MathWorks' Simulink® and Real-Time Workshop® Code Generator to Produce Attitude Control Test and Flight Code

    OpenAIRE

    Salada, Mark; Dellinger, Wayne

    1998-01-01

    This paper describes the use of a commercial product, MathWorks' Real-Time Workshop® (RTW), to generate actual flight code for NASA's Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics (TIMED) mission. The Johns Hopkins University Applied Physics Laboratory is handling the design and construction of this satellite for NASA. As TIMED is scheduled to launch in May of the year 2000, software development for both ground and flight systems is well under way. However, based on experience …

  19. Thrombin generation, ProC(®)Global, prothrombin time and activated partial thromboplastin time in thawed plasma stored for seven days and after methylene blue/light pathogen inactivation.

    Science.gov (United States)

    Thiele, Thomas; Hron, Gregor; Kellner, Sarah; Wasner, Christina; Westphal, Antje; Warkentin, Theodore E; Greinacher, Andreas; Selleng, Kathleen

    2016-01-01

    Methylene blue pathogen inactivation and storage of thawed plasma both lead to changes in the activity of several clotting factors. We investigated how this translates into a global loss of thrombin generation potential and alterations in the protein C pathway. Fifty apheresis plasma samples were thawed and each divided into three subunits. One subunit was stored for 7 days at 4 °C, one was stored for 7 days at 22 °C and one was stored at 4 °C after methylene blue/light treatment. Thrombin generation parameters, ProC(®)Global-NR, prothrombin time and activated partial thromboplastin time were assessed on days 0 and 7. The velocity of thrombin generation increased significantly after methylene blue treatment (increased thrombin generation rate; time to peak decreased) and decreased after storage (decreased thrombin generation rate and peak thrombin; increased lag time and time to peak). The endogenous thrombin generation potential remained stable after methylene blue treatment and storage at 4 °C. Methylene blue treatment and 7 days of storage at 4 °C activated the protein C pathway, whereas storage at room temperature and storage after methylene blue treatment decreased the functional capacity of the protein C pathway. Prothrombin time and activated partial thromboplastin time showed only modest alterations. The global clotting capacity of thawed plasma is maintained at 4 °C for 7 days and directly after methylene blue treatment of thawed plasma. Thrombin generation and ProC(®)Global are useful tools for investigating the impact of pathogen inactivation and storage on the clotting capacity of therapeutic plasma preparations.

  20. Generating Li–Yorke chaos in a stable continuous-time T–S fuzzy model via time-delay feedback control

    International Nuclear Information System (INIS)

    Qiu-Ye, Sun; Hua-Guang, Zhang; Yan, Zhao

    2010-01-01

    This paper investigates the chaotification problem of a stable continuous-time T–S fuzzy system. A simple nonlinear state time-delay feedback controller is designed by the parallel distributed compensation technique. Then, the asymptotically approximate relationship between the controlled continuous-time T–S fuzzy system with time delay and a discrete-time T–S fuzzy system is established. Based on this discrete-time T–S fuzzy system, we prove via the revised Marotto theorem that, for appropriately chosen controller parameters, the chaos in the discrete-time T–S fuzzy system satisfies the Li–Yorke definition. Finally, the effectiveness of the proposed chaotic anticontrol method is verified by a practical example. (general)

  1. The Twin Deficits Hypothesis: An Empirical Analysis for Tanzania

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-09-01

    Full Text Available. This paper examines the relationship between the current account and government budget deficits in Tanzania. The paper tests the validity of the twin deficits hypothesis, using annual time series data for the 1966-2015 period. The paper is thought to be significant because the concept of the twin deficit hypothesis is fraught with controversy. Some researchers support the hypothesis that there is a positive relationship between current account deficits and fiscal deficits in the economy while others do not. In this paper, the empirical tests fail to reject the twin deficits hypothesis, indicating that rising budget deficits put more strain on the current account deficits in Tanzania. Specifically, the Vector Error Correction Model results support the conventional theory of a positive relationship between fiscal and external balances, with a relatively high speed of adjustment toward the equilibrium position. This evidence is consistent with a small open economy. To address the problem that may result from this kind of relationship, appropriate policy measures for reducing budget deficits, such as reduction in non-development expenditure, enhancement of domestic revenue collection, and active measures against corruption and tax evasion, should be adopted. The government should also target export-oriented firms and encourage an import substitution industry by creating favorable business environments.
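
    The Engle–Granger two-step logic behind a Vector Error Correction Model can be sketched on synthetic data (the series below are simulated, not Tanzania's data): regress the current account on the budget deficit to estimate the long-run relation, then regress the differenced current account on the lagged residual. A negative coefficient on that error-correction term corresponds to the "speed of adjustment" the abstract refers to.

```python
import random

def ols(x, y):
    """Bivariate ordinary least squares; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# Synthetic series: the budget deficit follows a random walk and the
# current account is cointegrated with it (true long-run slope 0.8).
random.seed(2)
bd = [0.0]
for _ in range(299):
    bd.append(bd[-1] + random.gauss(0.0, 1.0))
ca = [0.8 * b + random.gauss(0.0, 0.5) for b in bd]

# Step 1: long-run relation ca ~ a + beta*bd, keep the residuals.
a, beta_hat = ols(bd, ca)
resid = [c - (a + beta_hat * b) for b, c in zip(bd, ca)]

# Step 2: regress the differenced current account on the lagged
# residual; a negative slope is the error-correction (adjustment) term.
d_ca = [ca[t] - ca[t - 1] for t in range(1, len(ca))]
_, alpha = ols(resid[:-1], d_ca)
```

    On this simulated data the long-run slope estimate lands near the true 0.8 and the adjustment coefficient comes out negative, i.e., deviations from the long-run relation are corrected in subsequent periods.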

  2. How organisms do the right thing: The attractor hypothesis

    Science.gov (United States)

    Emlen, J.M.; Freeman, D.C.; Mills, A.; Graham, J.H.

    1998-01-01

    Neo-Darwinian theory is highly successful at explaining the emergence of adaptive traits over successive generations. However, there are reasons to doubt its efficacy in explaining the observed, impressively detailed adaptive responses of organisms to day-to-day changes in their surroundings. Also, the theory lacks a clear mechanism to account for both plasticity and canalization. In effect, there is a growing sentiment that the neo-Darwinian paradigm is incomplete, that something more than genetic structure, mutation, genetic drift, and the action of natural selection is required to explain organismal behavior. In this paper we extend the view of organisms as complex self-organizing entities by arguing that basic physical laws, coupled with the acquisitive nature of organisms, make adaptation all but tautological. That is, much adaptation is an unavoidable emergent property of organisms' complexity and, to a significant degree, occurs quite independently of genomic changes wrought by natural selection. For reasons that will become obvious, we refer to this assertion as the attractor hypothesis. The arguments also clarify the concept of "adaptation." Adaptation across generations, by natural selection, equates to the (game theoretic) maximization of fitness (the success with which one individual produces more individuals), while self-organization-based adaptation, within generations, equates to energetic efficiency and the matching of intake and biosynthesis to need. Finally, we discuss implications of the attractor hypothesis for a wide variety of genetic and physiological phenomena, including genetic architecture, directed mutation, genetic imprinting, paramutation, hormesis, plasticity, optimality theory, genotype-phenotype linkage and punctuated equilibrium, and present suggestions for tests of the hypothesis. © 1998 American Institute of Physics.

  3. Prospective detection of large prediction errors: a hypothesis testing approach

    International Nuclear Information System (INIS)

    Ruan, Dan

    2010-01-01

    Real-time motion management is important in radiotherapy. In addition to effective monitoring schemes, prediction is required to compensate for system latency, so that treatment can be synchronized with tumor motion. However, it is difficult to predict tumor motion at all times, and it is critical to determine when large prediction errors may occur. Such information can be used to pause the treatment beam or adjust monitoring/prediction schemes. In this study, we propose a hypothesis testing approach for detecting instants corresponding to potentially large prediction errors in real time. We treat the future tumor location as a random variable, and obtain its empirical probability distribution with the kernel density estimation-based method. Under the null hypothesis, the model probability is assumed to be a concentrated Gaussian centered at the prediction output. Under the alternative hypothesis, the model distribution is assumed to be non-informative uniform, which reflects the situation that the future position cannot be inferred reliably. We derive the likelihood ratio test (LRT) for this hypothesis testing problem and show that with the method of moments for estimating the null hypothesis Gaussian parameters, the LRT reduces to a simple test on the empirical variance of the predictive random variable. This conforms to the intuition to expect a (potentially) large prediction error when the estimate is associated with high uncertainty, and to expect an accurate prediction when the uncertainty level is low. We tested the proposed method on patient-derived respiratory traces. The 'ground-truth' prediction error was evaluated by comparing the prediction values with retrospective observations, and the large prediction regions were subsequently delineated by thresholding the prediction errors. The receiver operating characteristic curve was used to describe the performance of the proposed hypothesis testing method. Clinical implication was represented by miss
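
    As the abstract notes, under these modeling assumptions the likelihood ratio test reduces to thresholding the empirical variance of the predictive distribution. A minimal sketch of that reduced test (the function name, threshold, and sample data are illustrative; in practice the threshold would be tuned, e.g., from the ROC analysis described above):

```python
import random
import statistics

def flag_large_error_risk(predictive_samples, var_threshold):
    """Reduced form of the LRT: flag an instant as high-risk when the
    empirical variance of the predictive distribution exceeds a
    threshold.  High variance ~ non-informative (uniform-like) model;
    low variance ~ concentrated Gaussian around the prediction."""
    v = statistics.pvariance(predictive_samples)
    return v > var_threshold, v

random.seed(0)
# Confident prediction: samples tightly clustered around the estimate.
confident = [1.0 + random.gauss(0.0, 0.05) for _ in range(200)]
# Uncertain prediction: widely spread, close to non-informative.
uncertain = [1.0 + random.gauss(0.0, 1.0) for _ in range(200)]

risky_low, _ = flag_large_error_risk(confident, var_threshold=0.1)
risky_high, _ = flag_large_error_risk(uncertain, var_threshold=0.1)
```

    This matches the intuition stated in the abstract: a large prediction error is expected exactly when the predictive estimate carries high uncertainty.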

  4. The discovered preference hypothesis - an empirical test

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Ladenburg, Jacob; Olsen, Søren Bøye

    Using stated preference methods for valuation of non-market goods is known to be vulnerable to a range of biases. Some authors claim that these so-called anomalies in effect render the methods useless for the purpose. However, the Discovered Preference Hypothesis, as put forth by Plott [31], offers …

  5. The Hypothesis of Incommensurability and Multicultural Education

    Science.gov (United States)

    McDonough, Tim

    2009-01-01

    This article describes the logical and rhetorical grounds for a multicultural pedagogy that teaches students the knowledge and skills needed to interact creatively in the public realm betwixt and between cultures. I begin by discussing the notion of incommensurability. I contend that this hypothesis was intended to perform a particular rhetorical…

  6. Sleep memory processing: the sequential hypothesis.

    Science.gov (United States)

    Giuditta, Antonio

    2014-01-01

    According to the sequential hypothesis (SH) memories acquired during wakefulness are processed during sleep in two serial steps respectively occurring during slow wave sleep (SWS) and rapid eye movement (REM) sleep. During SWS memories to be retained are distinguished from irrelevant or competing traces that undergo downgrading or elimination. Processed memories are stored again during REM sleep which integrates them with preexisting memories. The hypothesis received support from a wealth of EEG, behavioral, and biochemical analyses of trained rats. Further evidence was provided by independent studies of human subjects. SH basic premises, data, and interpretations have been compared with corresponding viewpoints of the synaptic homeostatic hypothesis (SHY). Their similarities and differences are presented and discussed within the framework of sleep processing operations. SHY's emphasis on synaptic renormalization during SWS is acknowledged to underline a key sleep effect, but this cannot marginalize sleep's main role in selecting memories to be retained from downgrading traces, and in their integration with preexisting memories. In addition, SHY's synaptic renormalization raises an unsolved dilemma that clashes with the accepted memory storage mechanism exclusively based on modifications of synaptic strength. This difficulty may be bypassed by the assumption that SWS-processed memories are stored again by REM sleep in brain subnuclear quantum particles. Storing of memories in quantum particles may also occur in other vigilance states. Hints are provided on ways to subject the quantum hypothesis to experimental tests.

  7. [Resonance hypothesis of heart rate variability origin].

    Science.gov (United States)

    Sheĭkh-Zade, Iu R; Mukhambetaliev, G Kh; Cherednik, I L

    2009-09-01

    A hypothesis is advanced that heart rate variability reflects beat-to-beat regulation of cardiac cycle duration, which ensures resonance interaction between respiratory fluctuations and the intrinsic volume fluctuations of the arterial system, thereby minimizing the energy expenditure of the cardiorespiratory system. Myogenic, parasympathetic and sympathetic mechanisms of heart rate variability are described.

  8. Hypothesis on the nature of atmospheric UFOs

    Science.gov (United States)

    Mukharev, L. A.

    1991-08-01

    A hypothesis is developed according to which the atmospheric UFO phenomenon has an electromagnetic nature. It is suggested that an atmospheric UFO is an agglomeration of charged atmospheric dust within which there exists a slowly damped electromagnetic field. This field is considered to be the source of the observed optical effects and the motive force of the UFO.

  9. Multiple hypothesis clustering in radar plot extraction

    NARCIS (Netherlands)

    Huizing, A.G.; Theil, A.; Dorp, Ph. van; Ligthart, L.P.

    1995-01-01

    False plots and plots with inaccurate range and Doppler estimates may severely degrade the performance of tracking algorithms in radar systems. This paper describes how a multiple hypothesis clustering technique can be applied to mitigate the problems involved in plot extraction. The measures of …

  10. The (not so) immortal strand hypothesis

    Directory of Open Access Journals (Sweden)

    Cristian Tomasetti

    2015-03-01

    Significance: Utilizing an approach that is fundamentally different from previous efforts to confirm or refute the immortal strand hypothesis, we provide evidence against non-random segregation of DNA during stem cell replication. Our results strongly suggest that parental DNA is passed randomly to stem cell daughters and provides new insight into the mechanism of DNA replication in stem cells.

  11. Forty Years Later: Updating the Fossilization Hypothesis

    Science.gov (United States)

    Han, ZhaoHong

    2013-01-01

    A founding concept in second language acquisition (SLA) research, fossilization has been fundamental to understanding second language (L2) development. The Fossilization Hypothesis, introduced in Selinker's seminal text (1972), has thus been one of the most influential theories, guiding a significant bulk of SLA research for four decades; 2012…

  12. Remarks about the hypothesis of limiting fragmentation

    International Nuclear Information System (INIS)

    Chou, T.T.; Yang, C.N.

    1987-01-01

    Remarks are made about the hypothesis of limiting fragmentation. In particular, the concept of favored and disfavored fragment distribution is introduced. Also, a sum rule is proved leading to a useful quantity called energy-fragmentation fraction. (author). 11 refs, 1 fig., 2 tabs

  13. Sleep memory processing: the sequential hypothesis

    Directory of Open Access Journals (Sweden)

    Antonio Giuditta

    2014-12-01

    Full Text Available. According to the sequential hypothesis (SH), memories acquired during wakefulness are processed during sleep in two serial steps respectively occurring during slow wave sleep (SWS) and REM sleep. During SWS, memories to be retained are distinguished from irrelevant or competing traces that undergo downgrading or elimination. Processed memories are stored again during REM sleep, which integrates them with preexisting memories. The hypothesis received support from a wealth of EEG, behavioral, and biochemical analyses of trained rats. Further evidence was provided by independent studies of human subjects. SH basic premises, data, and interpretations have been compared with corresponding viewpoints of the synaptic homeostatic hypothesis (SHY). Their similarities and differences are presented and discussed within the framework of sleep processing operations. SHY's emphasis on synaptic renormalization during SWS is acknowledged to underline a key sleep effect, but this cannot marginalize sleep's main role in selecting memories to be retained from downgrading traces, and in their integration with preexisting memories. In addition, SHY's synaptic renormalization raises an unsolved dilemma that clashes with the accepted memory storage mechanism exclusively based on modifications of synaptic strength. This difficulty may be bypassed by the assumption that SWS-processed memories are stored again by REM sleep in brain subnuclear quantum particles. Storing of memories in quantum particles may also occur in other vigilance states. Hints are provided on ways to subject the quantum hypothesis to experimental tests.

  14. Television Exposure Measures and the Cultivation Hypothesis.

    Science.gov (United States)

    Potter, W. James; Chang, Ik Chin

    1990-01-01

    Describes study of students in grades 8 through 12 that was conducted to determine the degree to which television messages influence a person's construction of reality (the cultivation hypothesis). Research methodology that tests the effects of television exposure is examined with emphasis on the importance of demographic control variables. (38…

  15. Commentary: Human papillomavirus and tar hypothesis for ...

    Indian Academy of Sciences (India)

    2010-08-09

    Aug 9, 2010. Commentary: Human papillomavirus and tar hypothesis for squamous cell cervical cancer. Christina Bennett, Allen E. Kuhn, Harry W. Haverkos. Volume 35, Issue 3, September 2010. Keywords: cervical cancer; co-factors; human papillomavirus; tar-based vaginal douche; tobacco smoke; wood smoke.

  16. Morbidity and Infant Development: A Hypothesis.

    Science.gov (United States)

    Pollitt, Ernesto

    1983-01-01

    Results of a study conducted in 14 villages of Sui Lin Township, Taiwan, suggest the hypothesis that, under conditions of extreme economic impoverishment and among children within populations where energy protein malnutrition is endemic, there is an inverse relationship between incidence of morbidity in infancy and measures of motor and mental…

  17. Low-jitter wide-range integrated time interval/delay generator based on combination of period counting and capacitor charging.

    Science.gov (United States)

    Klepacki, K; Pawłowski, M; Szplet, R

    2015-02-01

    We present the design, operation, and test results of a new time interval/delay generator that provides a resolution of 0.3 ps, jitter below 10 ps (rms), and a wide delay range of 10 s. The wide range has been achieved by counting periods of a reference clock, while the high resolution and low jitter have been obtained by using internal interpolation twice. This interpolation, based on charging a single capacitor, provides both precise external trigger synchronization and accurate generation of the residual time interval. The combination of both processes virtually eliminates triggering indeterminacy. The jitter between the trigger and output is below 1 ps, which ensures high-performance delay generation. The generator is integrated in a single application-specific integrated circuit chip using a standard cost-effective 0.35 μm CMOS process.
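
    The coarse/fine split described above can be modeled with simple arithmetic: the requested delay decomposes into an integer number of reference-clock periods (counted digitally) plus a sub-period residual left to the capacitor-charging interpolator. A sketch under that reading (the clock rate and delay values are illustrative, not from the paper):

```python
def split_delay(delay_s, t_clk_s):
    """Decompose a requested delay into an integer number of reference
    clock periods (handled by the period counter) plus a sub-period
    residual (handled by the capacitor-charging interpolator)."""
    n_periods = int(delay_s // t_clk_s)
    residual_s = delay_s - n_periods * t_clk_s
    return n_periods, residual_s

# Hypothetical numbers: a 100 MHz reference clock (10 ns period) and a
# requested delay of 25.3 ns -> 2 full periods plus a 5.3 ns residual.
n, r = split_delay(25.3e-9, 10e-9)
```

    The counter covers the arbitrarily wide range while the interpolator only ever has to generate delays shorter than one clock period, which is what makes picosecond-level resolution compatible with a 10 s range.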

  18. Riemann hypothesis for period polynomials of modular forms.

    Science.gov (United States)

    Jin, Seokho; Ma, Wenjun; Ono, Ken; Soundararajan, Kannan

    2016-03-08

    The period polynomial r_f(z) for an even weight k ≥ 4 newform f ∈ S_k(Γ_0(N)) is the generating function for the critical values of L(f, s). It has a functional equation relating r_f(z) to r_f(−1/(Nz)). We prove the Riemann hypothesis for these polynomials: that the zeros of r_f(z) lie on the circle |z| = 1/√N. We prove that these zeros are equidistributed when either k or N is large.
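
    The circle |z| = 1/√N is the natural "critical line" here because the map z ↦ −1/(Nz) appearing in the functional equation fixes exactly that circle: |−1/(Nz)| = 1/(N|z|) = 1/√N whenever |z| = 1/√N. A quick numerical sanity check of this symmetry (illustrative only, and not the paper's proof):

```python
import cmath
import math
import random

# The functional equation relates r_f(z) to r_f(-1/(N z)).  The map
# z -> -1/(N z) sends the circle |z| = 1/sqrt(N) onto itself, which is
# why that circle plays the role of a critical line for these zeros.
N = 23
radius = 1.0 / math.sqrt(N)

random.seed(1)
# Sample points on the circle |z| = 1/sqrt(N) and apply the map.
points = [radius * cmath.exp(2j * math.pi * random.random()) for _ in range(50)]
images = [-1.0 / (N * z) for z in points]
# Every image should again have modulus 1/sqrt(N), up to rounding.
```
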

  19. The Income Inequality Hypothesis Revisited : Assessing the Hypothesis Using Four Methodological Approaches

    NARCIS (Netherlands)

    Kragten, N.; Rözer, J.

    The income inequality hypothesis states that income inequality has a negative effect on individuals' health, partially because it reduces social trust. This article aims to critically assess the income inequality hypothesis by comparing several analytical strategies, namely OLS regression, …

  20. Cortical Neural Computation by Discrete Results Hypothesis.

    Science.gov (United States)

    Castejon, Carlos; Nuñez, Angel

    2016-01-01

    One of the most challenging problems we face in neuroscience is to understand how the cortex performs computations. There is increasing evidence that the power of cortical processing is produced by populations of neurons forming dynamic neuronal ensembles. Theoretical proposals and multineuronal experimental studies have revealed that ensembles of neurons can form emergent functional units. However, how these ensembles are implicated in cortical computations is still a mystery. Although cell ensembles have been associated with brain rhythms, the functional interaction remains largely unclear. It is still unknown how spatially distributed neuronal activity can be temporally integrated to contribute to cortical computations. A theoretical explanation integrating spatial and temporal aspects of cortical processing is still lacking. In this Hypothesis and Theory article, we propose a new functional theoretical framework to explain the computational roles of these ensembles in cortical processing. We suggest that complex neural computations underlying cortical processing could be temporally discrete and that sensory information would need to be quantized to be computed by the cerebral cortex. Accordingly, we propose that cortical processing is produced by the computation of discrete spatio-temporal functional units that we have called "Discrete Results" (Discrete Results Hypothesis). This hypothesis represents a novel functional mechanism by which information is processed in the cortex. Furthermore, we propose that precise dynamic sequences of "Discrete Results" are the mechanism used by the cortex to extract, code, memorize and transmit neural information. The novel "Discrete Results" concept has the ability to match the spatial and temporal aspects of cortical processing. We discuss the possible neural underpinnings of these functional computational units and describe the empirical evidence supporting our hypothesis. We propose that fast-spiking (FS) …

  1. Einstein's Revolutionary Light-Quantum Hypothesis

    Science.gov (United States)

    Stuewer, Roger H.

    2005-05-01

    The paper in which Albert Einstein proposed his light-quantum hypothesis was the only one of his great papers of 1905 that he himself termed ``revolutionary.'' Contrary to widespread belief, Einstein did not propose his light-quantum hypothesis ``to explain the photoelectric effect.'' Instead, he based his argument for light quanta on the statistical interpretation of the second law of thermodynamics, with the photoelectric effect being only one of three phenomena that he offered as possible experimental support for it. I will discuss Einstein's light-quantum hypothesis of 1905 and his introduction of the wave-particle duality in 1909 and then turn to the reception of his work on light quanta by his contemporaries. We will examine the reasons that prominent physicists advanced to reject Einstein's light-quantum hypothesis in succeeding years. Those physicists included Robert A. Millikan, even though he provided convincing experimental proof of the validity of Einstein's equation of the photoelectric effect in 1915. The turning point came after Arthur Holly Compton discovered the Compton effect in late 1922, but even then Compton's discovery was contested both on experimental and on theoretical grounds. Niels Bohr, in particular, had never accepted the reality of light quanta and now, in 1924, proposed a theory, the Bohr-Kramers-Slater theory, which assumed that energy and momentum were conserved only statistically in microscopic interactions. Only after that theory was disproved experimentally in 1925 was Einstein's revolutionary light-quantum hypothesis generally accepted by physicists---a full two decades after Einstein had proposed it.

  2. Effect of backchannel utterances on facilitating idea-generation in Japanese think-aloud tasks.

    Science.gov (United States)

    Sannomiya, Machiko; Kawaguchi, Atsuo; Yamakawa, Ikue; Morita, Yusuke

    2003-08-01

    The relation between backchannel utterance and idea-generation has hardly been studied. Based on preliminary investigations, we formulated a hypothesis that a listener's backchannel utterances facilitate a speaker's idea-generation. This study experimentally manipulated the frequency of backchannel utterances by listeners during speakers' idea-generation for think-aloud tasks. 16 Japanese female undergraduates participated. Analysis indicated that frequent backchannel utterances increased not only the number of ideas generated but also the speaking time for the tasks.

  3. Changing times, changing stories: Generational differences in climate change perspectives from four remote indigenous communities in Subarctic Alaska

    Science.gov (United States)

    Herman-Mercer, Nicole M.; Matkin, Elli; Laituri, Melinda J.; Toohey, Ryan C; Massey, Maggie; Elder, Kelly; Schuster, Paul F.; Mutter, Edda A.

    2016-01-01

    Indigenous Arctic and Subarctic communities currently are facing a myriad of social and environmental changes. In response to these changes, studies concerning indigenous knowledge (IK) and climate change vulnerability, resiliency, and adaptation have increased dramatically in recent years. Risks to lives and livelihoods are often the focus of adaptation research; however, the cultural dimensions of climate change are equally important because cultural dimensions inform perceptions of risk. Furthermore, many Arctic and Subarctic IK climate change studies document observations of change and knowledge of the elders and older generations in a community, but few include the perspectives of the younger population. These observations by elders and older generations form a historical baseline record of weather and climate observations in these regions. However, many indigenous Arctic and Subarctic communities are composed of primarily younger residents. We focused on the differences in the cultural dimensions of climate change found between young adults and elders. We outlined the findings from interviews conducted in four indigenous communities in Subarctic Alaska. The findings revealed that (1) intergenerational observations of change were common among interview participants in all four communities, (2) older generations observed more overall change than younger generations interviewed by us, and (3) how change was perceived varied between generations. We defined “observations” as the specific examples of environmental and weather change that were described, whereas “perceptions” referred to the manner in which these observations of change were understood and contextualized by the interview participants. Understanding the differences in generational observations and perceptions of change are key issues in the development of climate change adaptation strategies.

  4. First Demonstration of Real-Time End-to-End 40 Gb/s PAM-4 System using 10-G Transmitter for Next Generation Access Applications

    DEFF Research Database (Denmark)

    Wei, Jinlong; Eiselt, Nicklas; Griesser, Helmut

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next generation access applications using 10G class transmitters only. Up to 25-dB upstream link budget for 20 km SMF is achieved.

  5. Simultaneity modeling analysis of the environmental Kuznets curve hypothesis

    International Nuclear Information System (INIS)

    Ben Youssef, Adel; Hammoudeh, Shawkat; Omri, Anis

    2016-01-01

    The environmental Kuznets curve (EKC) hypothesis has been recognized in the environmental economics literature since the 1990s. Various statistical tests have been used on time series, cross-section and panel data related to single countries and groups of countries to validate this hypothesis. In the literature, the validation has always been conducted by using a single equation. However, since both the environment and income variables are endogenous, the estimation of a single-equation model when simultaneity exists produces inconsistent and biased estimates. Therefore, we formulate simultaneous two-equation models to investigate the EKC hypothesis for fifty-six countries, using annual panel data from 1990 to 2012, with the end year determined by data availability for the panel. To make the panel data analysis more homogeneous, we investigate this issue for three income-based panels (namely, high-, middle-, and low-income panels) given several explanatory variables. Our results indicate that there exists a bidirectional causality between economic growth and pollution emissions in the overall panels. We also find that the relationship is nonlinear and has an inverted U-shape for all the considered panels. Policy implications are provided. - Highlights: • We give a new look at the validity of the EKC hypothesis. • We formulate two simultaneous-equation models to validate this hypothesis for fifty-six countries. • We find a bidirectional causality between economic growth and pollution emissions. • We also discover an inverted U-shape between environmental degradation and economic growth. • This relationship varies at different stages of economic development.
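
    For reference, in the standard quadratic EKC specification e = a + b1·y + b2·y², an inverted U requires b2 < 0, and the implied emissions peak occurs at income y* = −b1/(2·b2). A small helper illustrating that arithmetic (the coefficients below are made up, not the paper's estimates):

```python
def ekc_turning_point(b1, b2):
    """Peak-income level for the quadratic EKC specification
    e = a + b1*y + b2*y**2; an inverted U requires b2 < 0."""
    if b2 >= 0:
        raise ValueError("no inverted U: the quadratic term must be negative")
    # Setting de/dy = b1 + 2*b2*y to zero gives the turning point.
    return -b1 / (2.0 * b2)

# Made-up coefficients: emissions rise with income until y = 10,
# then decline, tracing the inverted U described in the abstract.
peak_income = ekc_turning_point(2.0, -0.1)
```
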

  6. Real time incorporation of random events in the reasoning of an on-line expert system. Application to the acoustic surveillance of vapor generators

    International Nuclear Information System (INIS)

    Launay, T.

    1989-03-01

    A study for improving an expert system applied to diagnostic assistance is presented. The results will be implemented in the vapor generator surveillance system. The aim of the work is to improve performance by reducing the time spent on reasoning and to strengthen the vigilance system. The investigation consists of four parts. In the first part, the state of the art of the different logics used in artificial intelligence techniques is discussed, and the TMS and ATMS systems are presented. The second part of this thesis deals with the problem formulation. Each point of the problem is studied and answered by applying the basic concepts used in the generation of on-line expert systems. In the third part, the on-line expert system generator ACTE is described. The ACTE aspects concerning the user, the inner structure and the functionality are considered. In the fourth part, an application to the surveillance of vapor generators and concluding remarks are presented [fr]

  7. Impact of timely switching from imatinib to a second-generation tyrosine kinase inhibitor after 12-month complete cytogenetic response failure: a chart review analysis.

    Science.gov (United States)

    DeAngelo, Daniel J; Chen, Lei; Guerin, Annie; Styles, Amy; Giguere-Duval, Philippe; Wu, Eric Q

    2014-06-01

    In this study, cytogenetic response rate after timely switching to a second-generation tyrosine kinase inhibitor (TKI) was evaluated among patients with CML-CP, after failure to achieve CCyR 1 year after imatinib initiation. An online physician-administered medical chart review was used to retrospectively collect information from 108 US-based hematologists and oncologists on CML-CP patients who initiated imatinib as first-line therapy and failed to achieve CCyR at 12 months after imatinib initiation. Patients who switched to a second-generation TKI within 3 months after the CCyR failure were defined as timely switchers, and those who continued taking imatinib for at least 3 months after the CCyR failure were defined as nonswitchers. CCyR achievement was compared between timely switchers and nonswitchers using multivariate Cox proportional hazard models. Physicians provided information on 593 patients, with 306 defined as timely switchers and 287 defined as nonswitchers. Among the nonswitchers, 78 switched to a second-generation TKI at a later date. After adjusting for potential confounding factors, timely switchers had statistically significantly greater likelihood of achieving CCyR (hazard ratio, 1.80; P = .002) compared with nonswitchers. Timely switching from imatinib to a second-generation TKI after CCyR failure 1 year after imatinib initiation was associated with a greater likelihood of achieving CCyR compared with delaying the switch or not switching to a second-generation TKI. Copyright © 2014 Elsevier Inc. All rights reserved.
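    The hazard-ratio comparison reported above can be illustrated with a deliberately simplified sketch: under an exponential survival model the hazard is events divided by person-time, and the ratio of two such hazards is a crude stand-in for the Cox estimate (which additionally adjusts for covariates). All counts below are hypothetical, not the study's data.

```python
import math

# Hypothetical event counts and person-months (NOT the study's data):
# hazard = events / person-time under an exponential survival model.
events_switch, time_switch = 150, 4000.0        # CCyR events, person-months
events_nonswitch, time_nonswitch = 90, 4300.0

hr = (events_switch / time_switch) / (events_nonswitch / time_nonswitch)

# Approximate 95% CI on the log scale: se(log HR) ~ sqrt(1/d1 + 1/d2).
se = math.sqrt(1.0 / events_switch + 1.0 / events_nonswitch)
lo = math.exp(math.log(hr) - 1.96 * se)
hi = math.exp(math.log(hr) + 1.96 * se)
print(round(hr, 2), round(lo, 2), round(hi, 2))
```

A hazard ratio above 1 with a confidence interval excluding 1, as in this toy example, is the pattern behind the study's reported HR of 1.80 favoring timely switchers.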

  8. The Method of Hypothesis in Plato's Philosophy

    Directory of Open Access Journals (Sweden)

    Malihe Aboie Mehrizi

    2016-09-01

    Full Text Available The article deals with the examination of the method of hypothesis in Plato's philosophy. This method will be examined in the three dialogues Meno, Phaedon and Republic, in which it is explicitly indicated. The article shows how Plato's attitude towards the position and use of the method of hypothesis changed within his philosophy. In Meno, drawing on geometry, Plato attempts to introduce a method that can be used in the realm of philosophy. But ultimately, in Republic, Plato's special attention to the method and its importance in philosophical investigations leads him to revise it. Here Plato finally introduces the particular method of philosophy, i.e., the dialectic.

  9. Debates—Hypothesis testing in hydrology: Introduction

    Science.gov (United States)

    Blöschl, Günter

    2017-03-01

    This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.

  10. Imaging of Ventricular Fibrillation and Defibrillation: The Virtual Electrode Hypothesis.

    Science.gov (United States)

    Boukens, Bastiaan J; Gutbrod, Sarah R; Efimov, Igor R

    2015-01-01

    Ventricular fibrillation is the major underlying cause of sudden cardiac death. Understanding the complex activation patterns that give rise to ventricular fibrillation requires high resolution mapping of localized activation. The use of multi-electrode mapping unraveled re-entrant activation patterns that underlie ventricular fibrillation. However, optical mapping contributed critically to understanding the mechanism of defibrillation, where multi-electrode recordings could not measure activation patterns during and immediately after a shock. In addition, optical mapping visualizes the virtual electrodes that are generated during stimulation and defibrillation pulses, which contributed to the formulation of the virtual electrode hypothesis. The generation of virtual electrode induced phase singularities during defibrillation is arrhythmogenic and may lead to the induction of fibrillation subsequent to defibrillation. Defibrillating with low energy may circumvent this problem. Therefore, the current challenge is to use the knowledge provided by optical mapping to develop a low energy approach of defibrillation, which may lead to more successful defibrillation.

  11. Sea otter health: Challenging a pet hypothesis

    OpenAIRE

    Lafferty, Kevin D.

    2015-01-01

    A recent series of studies on tagged sea otters (Enhydra lutris nereis) challenges the hypothesis that sea otters are sentinels of a dirty ocean, in particular, that pet cats are the main source of exposure to Toxoplasma gondii in central California. Counter to expectations, sea otters from unpopulated stretches of coastline are less healthy and more exposed to parasites than city-associated otters. Ironically, now it seems that spillover from wildlife, not pets, dominates spatial patterns of...

  12. Sea otter health: Challenging a pet hypothesis

    Directory of Open Access Journals (Sweden)

    Kevin D. Lafferty

    2015-12-01

    Full Text Available A recent series of studies on tagged sea otters (Enhydra lutris nereis) challenges the hypothesis that sea otters are sentinels of a dirty ocean, in particular, that pet cats are the main source of exposure to Toxoplasma gondii in central California. Counter to expectations, sea otters from unpopulated stretches of coastline are less healthy and more exposed to parasites than city-associated otters. Ironically, now it seems that spillover from wildlife, not pets, dominates spatial patterns of disease transmission.

  13. Sea otter health: Challenging a pet hypothesis.

    Science.gov (United States)

    Lafferty, Kevin D

    2015-12-01

    A recent series of studies on tagged sea otters (Enhydra lutris nereis) challenges the hypothesis that sea otters are sentinels of a dirty ocean, in particular, that pet cats are the main source of exposure to Toxoplasma gondii in central California. Counter to expectations, sea otters from unpopulated stretches of coastline are less healthy and more exposed to parasites than city-associated otters. Ironically, now it seems that spillover from wildlife, not pets, dominates spatial patterns of disease transmission.

  14. Sea otter health: challenging a pet hypothesis

    Science.gov (United States)

    Lafferty, Kevin D.

    2015-01-01

    A recent series of studies on tagged sea otters (Enhydra lutris nereis) challenges the hypothesis that sea otters are sentinels of a dirty ocean, in particular, that pet cats are the main source of exposure to Toxoplasma gondii in central California. Counter to expectations, sea otters from unpopulated stretches of coastline are less healthy and more exposed to parasites than city-associated otters. Ironically, now it seems that spillover from wildlife, not pets, dominates spatial patterns of disease transmission.

  15. Application of Sivasubramanian Kalimuthu Hypothesis to Triangles

    OpenAIRE

    M. Sivasubramanian

    2009-01-01

    Problem statement: The interior angle sums of a number of Euclidean triangles were transformed into quadratic equations. The analysis of those quadratic equations yielded the following proposition: There exists a Euclidean triangle whose interior angle sum is a straight angle. Approach: In this study, the researchers introduced a new hypothesis for quadratic equations and derived an entirely new result. Results: The result of the study was controversial, but mathematically consistent. Conclusion...

  16. Kelvin on an old, celebrated hypothesis

    Science.gov (United States)

    Harrison, Edward

    1986-07-01

    Lord Kelvin in 1901 tested an ``old and celebrated hypothesis'' that if we could see far enough into space the whole sky would be occupied with stellar disks all of perhaps the same brightness as the Sun. Kelvin was the first to solve quantitatively and correctly the riddle of a dark night sky, a riddle that had been previously solved qualitatively by Edgar Allan Poe, and is now known as Olbers' paradox.

  17. Testing the hypothesis of the natural suicide rates: Further evidence from OECD data

    DEFF Research Database (Denmark)

    Andres, Antonio Rodriguez; Halicioglu, Ferda

    2011-01-01

    This paper provides further evidence on the hypothesis of the natural rate of suicide using the time series data for 15 OECD countries over the period 1970–2004. This hypothesis suggests that the suicide rate of a society could never be zero even if both the economic and the social conditions wer...

  18. Color Helmet Mounted Display System with Real Time Computer Generated and Video Imagery for In-Flight Simulation

    Science.gov (United States)

    Sawyer, Kevin; Jacobsen, Robert; Aiken, Edwin W. (Technical Monitor)

    1995-01-01

    NASA Ames Research Center and the US Army are developing the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) using a Sikorsky UH-60 helicopter for the purpose of flight systems research. A primary use of the RASCAL is in-flight simulation, for which the visual scene will use computer-generated imagery and synthetic vision. This research is made possible in part by a full-color, wide-field-of-view Helmet Mounted Display (HMD) system that provides high-performance color imagery suitable for daytime operations in a flight-rated package. This paper describes the design and performance characteristics of the HMD system. Emphasis is placed on the design specifications, testing, and integration into the aircraft of Kaiser Electronics' RASCAL HMD system that was designed and built under contract for NASA. The optical performance and design of the helmet-mounted display unit will be discussed, as well as the unique capabilities provided by the system's Programmable Display Generator (PDG).

  19. Testing the egocentric mirror-rotation hypothesis.

    Science.gov (United States)

    Muelenz, Cornelius; Hecht, Heiko; Gamer, Matthias

    2010-01-01

    Although observers know about the law of reflection, their intuitive understanding of spatial locations in mirrors is often erroneous. Hecht et al. (2005) proposed a two-stage mirror-rotation hypothesis to explain these misconceptions. The hypothesis involves an egocentric bias to the effect that observers behave as if the mirror surface were rotated by about 2 degrees to be more orthogonal than is the case. We test four variants of the hypothesis, which differ depending on whether the virtual world, the mirror, or both are taken to be rotated. We devised an experimental setup that allowed us to distinguish between these variants. Our results confirm that the virtual world--and only the virtual world--is being rotated. Observers had to perform a localization task, using a mirror that was either fronto-parallel or rotated opposite the direction of the predicted effect. We were thus able to compensate for the effect. The positions of objects in mirrors were perceived in accordance with the erroneous conception that the virtual world behind the mirror is slightly rotated and that the reconstruction is based on the non-rotated fronto-parallel mirror. A covert rotation of the mirror by about 2 degrees against the predicted effect was able to compensate for the placement error.
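    The mirror-rotation hypothesis is geometric at heart, so a small sketch helps: reflect an object across a fronto-parallel mirror, then rotate the resulting virtual image about a point on the mirror surface by 2 degrees and measure how far the perceived position moves. The scene dimensions below are illustrative assumptions, not the paper's setup.

```python
import math

def mirror_image(p):
    # Virtual image of point p = (x, y) behind a fronto-parallel mirror
    # lying in the x = 0 plane (law of reflection for a plane mirror).
    x, y = p
    return (-x, y)

def rotate(p, deg):
    # Rotate p about the origin (a point on the mirror surface) by deg degrees.
    a = math.radians(deg)
    x, y = p
    return (x * math.cos(a) - y * math.sin(a), x * math.sin(a) + y * math.cos(a))

obj = (1.0, 0.5)                    # object 1 m in front of the mirror, 0.5 m off-axis
true_image = mirror_image(obj)
perceived = rotate(true_image, 2.0)  # virtual world rotated by ~2 degrees

error = math.hypot(perceived[0] - true_image[0], perceived[1] - true_image[1])
print(round(error * 100, 1), "cm placement error")
```

Even a 2-degree rotation displaces the virtual image by a few centimetres at arm's-length distances, which is why the placement errors described above are measurable in a localization task.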

  20. Consumer health information seeking as hypothesis testing.

    Science.gov (United States)

    Keselman, Alla; Browne, Allen C; Kaufman, David R

    2008-01-01

    Despite the proliferation of consumer health sites, lay individuals often experience difficulty finding health information online. The present study attempts to understand users' information seeking difficulties by drawing on a hypothesis testing explanatory framework. It also addresses the role of user competencies and their interaction with internet resources. Twenty participants were interviewed about their understanding of a hypothetical scenario about a family member suffering from stable angina and then searched MedlinePlus consumer health information portal for information on the problem presented in the scenario. Participants' understanding of heart disease was analyzed via semantic analysis. Thematic coding was used to describe information seeking trajectories in terms of three key strategies: verification of the primary hypothesis, narrowing search within the general hypothesis area and bottom-up search. Compared to an expert model, participants' understanding of heart disease involved different key concepts, which were also differently grouped and defined. This understanding provided the framework for search-guiding hypotheses and results interpretation. Incorrect or imprecise domain knowledge led individuals to search for information on irrelevant sites, often seeking out data to confirm their incorrect initial hypotheses. Online search skills enhanced search efficiency, but did not eliminate these difficulties. Regardless of their web experience and general search skills, lay individuals may experience difficulty with health information searches. These difficulties may be related to formulating and evaluating hypotheses that are rooted in their domain knowledge. Informatics can provide support at the levels of health information portals, individual websites, and consumer education tools.

  1. Ab initio calculation of harmonic generation spectra of helium using a time-dependent non-Hermitian formalism

    Czech Academy of Sciences Publication Activity Database

    Gilary, I.; Kaprálová, Petra; Moiseyev, N.

    2006-01-01

    Vol. 74 (2006), 052505-1 ISSN 1050-2947 R&D Projects: GA AV ČR(CZ) KJB100550501; GA MŠk(CZ) LC512 Grant - others: Israel Science Foundation(IL) 1152/04 Institutional research plan: CEZ:AV0Z40550506 Keywords: high-order harmonic generation * symmetry selection rules * even harmonics * complex scaling * F-product Subject RIV: CF - Physical; Theoretical Chemistry Impact factor: 3.047, year: 2006

  2. The hormonal sensitivity hypothesis: A review and new findings.

    Science.gov (United States)

    Pope, Carley J; Oinonen, Kirsten; Mazmanian, Dwight; Stone, Suzanne

    2017-05-01

    Previous women's health practitioners and researchers have postulated that some women are particularly sensitive to hormonal changes occurring during reproductive events. We hypothesize that some women are particularly sensitive to hormonal changes occurring across their reproductive lifespan. To evaluate this hypothesis, we reviewed findings from the existing literature and findings from our own lab. Taken together, the evidence we present shows a recurring pattern of hormonal sensitivity at predictable but different times across the lifespan of some women (i.e., menarche, the premenstrual phase, hormonal contraceptive use, pregnancy, the postpartum period, and menopause). These findings provide support for the hypothesis that there is a subgroup of women who are more susceptible to physical, psychological, and sexual symptoms related to hormonal shifts or abrupt hormonal fluctuations that occur throughout the reproductive lifespan. We propose that this pattern reflects a Hormonal Sensitivity Syndrome. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Nonlinear internal waves and plumes generated in response to sea-loch outflow, AUV, and time-lapse photography observations

    Science.gov (United States)

    Toberman, Matthew; Inall, Mark; Boyd, Tim; Dumount, Estelle; Griffiths, Colin

    2017-07-01

    The tidally modulated outflow of brackish water from a sea loch forms a thin surface layer that propagates into the coastal ocean as a buoyant gravity current, transporting nutrients and sediments, as well as fresh water, heat and momentum. The fresh intrusion both propagates into and generates a strongly stratified environment which supports trains of nonlinear internal waves (NLIWs). NLIWs are shown to propagate ahead of this buoyancy input in response to propagation of the outflow water into the stratified environment generated by the previous release, as well as in the opposing direction after reflection from steep bathymetry. Oblique aerial photographs were taken and photogrammetric rectification led to the identification of the buoyant intrusion and the subsequent generation of NLIWs. An autonomous underwater vehicle (AUV) was deployed on repeated reciprocal transects in order to make simultaneous CTD, ADCP, and microstructure shear measurements of the evolution of these phenomena in conjunction with conventional mooring measurements. AUV-based temperature and salinity signals of NLIWs of depression were observed together with increased turbulent kinetic energy dissipation rates of over 2 orders of magnitude within and in the wake of the NLIWs. Repeated measurements provide a unique opportunity to investigate the horizontal structure of these phenomena. Simple metric scaling demonstrates that these processes are likely to be a feature of many fjordic systems located on the west coast of Scotland but may also play a key role in the assimilation of the outflow from many tidally dominated fjordic systems throughout the world.

  4. The time structure of Cherenkov images generated by TeV gamma-rays and by cosmic rays

    OpenAIRE

    HEGRA Collaboration; Hess, M., et al.

    1998-01-01

    The time profiles of Cherenkov images of cosmic-ray showers and of gamma-ray showers are investigated, using data gathered with the HEGRA system of imaging atmospheric Cherenkov telescopes during the 1997 outbursts of Mrk 501. Photon arrival times are shown to vary across the shower images. The dominant feature is a time gradient along the major axis of the images. The gradient varies with the distance between the telescope and the shower core, and is maximal for large distances. The time pro...

  5. A method for generating an illusion of backwards time travel using immersive virtual reality-an exploratory study.

    Science.gov (United States)

    Friedman, Doron; Pizarro, Rodrigo; Or-Berkers, Keren; Neyret, Solène; Pan, Xueni; Slater, Mel

    2014-01-01

    We introduce a new method, based on immersive virtual reality (IVR), to give people the illusion of having traveled backwards through time to relive a sequence of events in which they can intervene and change history. The participant had played an important part in events with a tragic outcome-deaths of strangers-by having to choose between saving 5 people or 1. We consider whether the ability to go back through time, and intervene, to possibly avoid all deaths, has an impact on how the participant views such moral dilemmas, and also whether this experience leads to a re-evaluation of past unfortunate events in their own lives. We carried out an exploratory study where in the "Time Travel" condition 16 participants relived these events three times, seeing incarnations of their past selves carrying out the actions that they had previously carried out. In a "Repetition" condition another 16 participants replayed the same situation three times, without any notion of time travel. Our results suggest that those in the Time Travel condition did achieve an illusion of "time travel" provided that they also experienced an illusion of presence in the virtual environment, body ownership, and agency over the virtual body that substituted their own. Time travel produced an increase in guilt feelings about the events that had occurred, and an increase in support of utilitarian behavior as the solution to the moral dilemma. Time travel also produced an increase in implicit morality as judged by an implicit association test. The time travel illusion was associated with a reduction of regret associated with bad decisions in their own lives. The results show that when participants have a third action that they can take to solve the moral dilemma (that does not immediately involve choosing between the 1 and the 5) then they tend to take this option, even though it is useless in solving the dilemma, and actually results in the deaths of a greater number.

  6. Generation of new Ag_mTe_n clusters via laser ablation synthesis using Ag-Te nano-composite as precursor. Quadrupole ion trap time-of-flight mass spectrometry.

    Science.gov (United States)

    Mawale, Ravi Madhukar; Amato, Filippo; Alberti, Milan; Havel, Josef

    2014-12-30

    Silver tellurides find applications in the development of infrared detection, imaging, magnetics, sensors, memory devices, and optic materials. However, only a limited number of silver tellurides have been described to date. Laser ablation synthesis (LAS) was selected to generate new Ag-Te clusters. Isothermal adsorption was used to study the formation of silver nano-particles-tellurium aggregates. Laser desorption ionization quadrupole ion trap time-of-flight mass spectrometry (LDI-QIT-TOFMS) was used for the generation and analysis of Ag_mTe_n clusters. Scanning electron microscopy and energy-dispersive X-ray spectroscopy were used to visualize the structure of materials. The stoichiometry of the generated clusters was determined by computer modeling of isotopic patterns. A simple, one-pot method for the preparation of Ag-Te nano-composite was developed and found suitable for LAS of silver tellurides. The LDI of Ag-Te nano-composite leads to the formation of 11 unary and 52 binary clusters. The stoichiometry of the 34 novel Ag_mTe_n clusters is reported here for the first time. LAS with TOFMS detection was proven to be a powerful technique for the generation of silver telluride clusters. Knowledge of the stoichiometry of the generated clusters might facilitate the further development of novel high-tech silver tellurium nano-materials. Copyright © 2014 John Wiley & Sons, Ltd.
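    Computer modeling of isotopic patterns, as used above to assign cluster stoichiometries, amounts to convolving the isotope distributions of the constituent elements. A sketch with approximate natural abundances taken from standard tables (values rounded, nominal masses only):

```python
from itertools import product
from collections import defaultdict

# Approximate natural isotopic abundances (nominal mass -> fraction).
AG = {107: 0.5184, 109: 0.4816}
TE = {120: 0.0009, 122: 0.0255, 123: 0.0089, 124: 0.0474,
      125: 0.0707, 126: 0.1884, 128: 0.3174, 130: 0.3408}

def convolve(a, b):
    # Combine two isotopic distributions (polynomial multiplication in mass).
    out = defaultdict(float)
    for (ma, pa), (mb, pb) in product(a.items(), b.items()):
        out[ma + mb] += pa * pb
    return dict(out)

def cluster_pattern(m, n):
    # Isotopic pattern of an Ag_m Te_n cluster by repeated convolution.
    dist = {0: 1.0}
    for _ in range(m):
        dist = convolve(dist, AG)
    for _ in range(n):
        dist = convolve(dist, TE)
    return dist

pattern = cluster_pattern(2, 1)        # Ag2Te as a small example
peak = max(pattern, key=pattern.get)   # most abundant nominal mass
print(peak)
```

Matching such a computed pattern against the observed peak envelope is what lets a measured cluster signal be assigned a unique (m, n) stoichiometry.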

  7. ROSMOD: A Toolsuite for Modeling, Generating, Deploying, and Managing Distributed Real-time Component-based Software using ROS

    Directory of Open Access Journals (Sweden)

    Pranav Srinivas Kumar

    2016-09-01

    Full Text Available This paper presents the Robot Operating System Model-driven development tool suite (ROSMOD), an integrated development environment for rapid prototyping of component-based software for the Robot Operating System (ROS) middleware. ROSMOD is well suited for the design, development and deployment of large-scale distributed applications on embedded devices. We present the various features of ROSMOD including the modeling language, the graphical user interface, code generators, and deployment infrastructure. We demonstrate the utility of this tool with a real-world case study: an Autonomous Ground Support Equipment (AGSE) robot that was designed and prototyped using ROSMOD for the NASA Student Launch competition, 2014–2015.

  8. Study of short-optical pulses time behavior generated by the self-injection method in a self-imaged unstable resonator

    International Nuclear Information System (INIS)

    Farahbod, A. H.; Asgari, S.

    2006-01-01

    In the present research, the time behavior of short optical pulses from a self-injected, self-imaged unstable resonator (SI-GSFUR) has been studied, and its dependence on the opening time of the electro-optic switch, the delay time between the Q-switching and self-injection electrical commands for pulse generation and regeneration, the loss, and the resonator geometrical magnification, M, investigated under identical pumping conditions. The numerical results have been compared with recent experimental data for M = -2.2, and reasonably good consistency observed between them.

  9. West Nile virus experimental evolution in vivo and the trade-off hypothesis.

    Directory of Open Access Journals (Sweden)

    Eleanor R Deardorff

    2011-11-01

    Full Text Available In nature, arthropod-borne viruses (arboviruses) perpetuate through alternating replication in vertebrate and invertebrate hosts. The trade-off hypothesis proposes that these viruses maintain adequate replicative fitness in two disparate hosts in exchange for superior fitness in one host. Releasing the virus from the constraints of a two-host cycle should thus facilitate adaptation to a single host. This theory has been addressed in a variety of systems, but remains poorly understood. We sought to determine the fitness implications of alternating host replication for West Nile virus (WNV) using an in vivo model system. Previously, WNV was serially or alternately passed 20 times in vivo in chicks or mosquitoes and resulting viruses were characterized genetically. In this study, these test viruses were competed in vivo in fitness assays against an unpassed marked reference virus. Fitness was assayed in chicks and in two important WNV vectors, Culex pipiens and Culex quinquefasciatus. Chick-specialized virus displayed clear fitness gains in chicks and in Cx. pipiens but not in Cx. quinquefasciatus. Cx. pipiens-specialized virus experienced reduced fitness in chicks and little change in either mosquito species. These data suggest that when fitness is measured in birds the trade-off hypothesis is supported; but in mosquitoes it is not. Overall, these results suggest that WNV evolution is driven by alternate cycles of genetic expansion in mosquitoes, where purifying selection is weak and genetic diversity generated, and restriction in birds, where purifying selection is strong.

  10. Space-time-wave number-frequency Z(x, t, k, f) analysis of SAW generation on fluid filled cylindrical shells.

    Science.gov (United States)

    Martinez, Loïc; Morvan, Bruno; Izbicki, Jean Louis

    2004-04-01

    A new 4D space-time-wave number-frequency representation Z(x,t,k,f) is introduced. The Z(x,t,k,f) representation is used for processing a 2D space-time signal collection arising from wave propagation along a 1D medium. This representation is an extension along the time dimension of the space-wave number-frequency representation. The Z(x,t,k,f) representation is obtained by short-time-space 2D Fourier transforming the space-time collection. The Z(x,t,k,f) representation allows the characterization of transient aspects of wave generation and propagation in both the space and time dimensions. The Z(x,t,k,f) representation is used to experimentally investigate Lamb wave generation and propagation around a cylindrical shell (relative thickness equal to 0.03) surrounded by water and excited by a pulse (0.1 μs duration, 1-5 MHz transducers). Three kinds of fluids have been used inside the shell: air, water, propanol. In all cases, the Z(x,t,k,f) analysis clearly identifies the reflected field on the insonified side of the shell and allows the measurement of the local reflection coefficients R(x,t,k,f). The generation and propagation of Lamb waves are also quantified. For the liquid-filled shells, the multiple internal reflections are revealed by the Z(x,t,k,f) analysis: the local transmission coefficients T(x,t,k,f) are also measured. When local matching conditions allow Lamb wave generation, the multiple regeneration of Lamb waves is observed. Based on these results, a link is established toward the theoretical results obtained by the steady-state approach and the Sommerfeld-Watson transform.
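    The core of the Z(x,t,k,f) representation is a short-time 2D Fourier transform of a space-time signal collection. A minimal sketch on a synthetic travelling wave (grid sizes, window choice and wave parameters are illustrative assumptions, not the paper's):

```python
import numpy as np

# Synthetic space-time collection: a single wave travelling along x,
# u(x, t) = cos(2*pi*(nu0*x - f0*t)), nu0 in cycles/m and f0 in Hz.
nx, nt, dx, dt = 128, 128, 0.01, 1e-3
x = np.arange(nx) * dx
t = np.arange(nt) * dt
nu0, f0 = 25.0, 125.0
u = np.cos(2 * np.pi * (nu0 * x[:, None] - f0 * t[None, :]))

def Z_patch(field, ix0, it0, win=64):
    # One evaluation of Z(x, t, k, f): 2D FFT of a Hann-windowed
    # space-time patch, localized around the patch origin (ix0, it0).
    w = np.hanning(win)
    patch = field[ix0:ix0 + win, it0:it0 + win] * w[:, None] * w[None, :]
    return np.fft.fftshift(np.fft.fft2(patch))

S = np.abs(Z_patch(u, 0, 0))
i, j = np.unravel_index(np.argmax(S), S.shape)
nu_axis = np.fft.fftshift(np.fft.fftfreq(64, d=dx))  # cycles/m
f_axis = np.fft.fftshift(np.fft.fftfreq(64, d=dt))   # Hz
print(abs(nu_axis[i]), abs(f_axis[j]))  # recovers (nu0, f0)
```

Sliding the window over (x, t) yields the full 4D representation, so transient, spatially localized wave components show up as peaks that appear and move in the (k, f) plane.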

  11. A method for generating an illusion of backwards time travel using immersive virtual reality—an exploratory study

    Science.gov (United States)

    Friedman, Doron; Pizarro, Rodrigo; Or-Berkers, Keren; Neyret, Solène; Pan, Xueni; Slater, Mel

    2014-01-01

    We introduce a new method, based on immersive virtual reality (IVR), to give people the illusion of having traveled backwards through time to relive a sequence of events in which they can intervene and change history. The participant had played an important part in events with a tragic outcome—deaths of strangers—by having to choose between saving 5 people or 1. We consider whether the ability to go back through time, and intervene, to possibly avoid all deaths, has an impact on how the participant views such moral dilemmas, and also whether this experience leads to a re-evaluation of past unfortunate events in their own lives. We carried out an exploratory study where in the “Time Travel” condition 16 participants relived these events three times, seeing incarnations of their past selves carrying out the actions that they had previously carried out. In a “Repetition” condition another 16 participants replayed the same situation three times, without any notion of time travel. Our results suggest that those in the Time Travel condition did achieve an illusion of “time travel” provided that they also experienced an illusion of presence in the virtual environment, body ownership, and agency over the virtual body that substituted their own. Time travel produced an increase in guilt feelings about the events that had occurred, and an increase in support of utilitarian behavior as the solution to the moral dilemma. Time travel also produced an increase in implicit morality as judged by an implicit association test. The time travel illusion was associated with a reduction of regret associated with bad decisions in their own lives. The results show that when participants have a third action that they can take to solve the moral dilemma (that does not immediately involve choosing between the 1 and the 5) then they tend to take this option, even though it is useless in solving the dilemma, and actually results in the deaths of a greater number. PMID:25228889

  12. A method for generating an illusion of backwards time travel using immersive virtual reality - an exploratory study

    Directory of Open Access Journals (Sweden)

    Doron eFriedman

    2014-09-01

    Full Text Available We introduce a new method, based on immersive virtual reality, to give people the illusion of having travelled backwards through time to relive a sequence of events in which they can intervene and change history. The participant had played an important part in events with a tragic outcome - deaths of strangers – by having to choose between saving 5 people or 1. We consider whether the ability to go back through time, and intervene, to possibly avoid all deaths, has an impact on how the participant views such moral dilemmas, and also whether this experience leads to a re-evaluation of past unfortunate events in their own lives. We carried out an exploratory study where in the ‘Time Travel’ condition 16 participants relived these events three times, seeing incarnations of their past selves carrying out the actions that they had previously carried out. In a ‘Repetition’ condition another 16 participants replayed the same situation three times, without any notion of time travel. Our results suggest that those in the Time Travel condition did achieve an illusion of ‘time travel’ provided that they also experienced an illusion of presence in the virtual environment, body ownership and agency over the virtual body that substituted their own. Time travel produced an increase in guilt feelings about the events that had occurred, and an increase in support of utilitarian behavior as the solution to the moral dilemma. Time travel also produced an increase in implicit morality as judged by an implicit association test. The time travel illusion was associated with a reduction of regret associated with bad decisions in their own lives. The results show that when participants have a third action that they can take to solve the moral dilemma (that does not immediately involve choosing between the 1 and the 5) then they tend to take this option, even though it is useless in solving the dilemma, and actually results in the deaths of a greater number.

  13. Real-time cooperating motion generation for man-machine systems and its application to medical technology.

    Science.gov (United States)

    Seto, Fumi; Hirata, Yasuhisa; Kosuge, Kazuhiro

    2007-01-01

    In this paper, we propose a cooperating motion generation method for man-machine cooperation systems in which the machines are controlled based on the intentional force applied by a human or humans, to realize several tasks in cooperation with them. By applying this method, the systems can avoid self-collisions, collisions with obstacles, and other dangerous situations during the tasks. The proposed method consists of two parts: a representation method for the robot's body, referred to as "RoBE (Representation of Body by Elastic elements)", and a cooperating motion generation method using RoBE. As application examples of the proposed method, we focus on robots cooperating with humans and on surgical robot tools, from the perspective of the medical and welfare field. We conducted experiments using a human-friendly robot, referred to as MR Helper, to illustrate the validity of the proposed method. We also performed computer simulations to indicate the prospects of applying our self-collision avoidance method to surgical robot tools.
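    The abstract does not give RoBE's equations, but the general idea (wrapping body parts in virtual elastic elements that push apart as they approach) can be illustrated with a minimal sketch. The sphere radii, stiffness value, and function names below are illustrative assumptions, not taken from the paper:

    ```python
    import numpy as np

    def repulsive_force(p1, p2, r1, r2, stiffness=100.0):
        """Spring-like repulsion between two spherical elastic elements.

        Returns the force acting on element 1; zero unless the spheres
        overlap. All names and parameters here are illustrative, not
        taken from the RoBE paper.
        """
        d = p1 - p2
        dist = np.linalg.norm(d)
        overlap = (r1 + r2) - dist
        if overlap <= 0.0 or dist == 0.0:
            return np.zeros(3)
        # Force grows linearly with penetration depth, directed away
        # from the other element.
        return stiffness * overlap * (d / dist)

    # Two links of a robot arm, each wrapped in a 0.1 m elastic sphere,
    # whose centers have come within 0.15 m of each other:
    f = repulsive_force(np.array([0.0, 0.0, 0.0]),
                        np.array([0.15, 0.0, 0.0]),
                        r1=0.1, r2=0.1)
    # f points along -x, pushing element 1 away from element 2
    ```

    Feeding such forces into the motion controller lets the system deflect commanded motions before a self-collision occurs, which matches the avoidance behavior the abstract describes at a high level.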

  14. Generating a Square Switching Window for Timing Jitter Tolerant 160 Gb/s Demultiplexing by the Optical Fourier Transform Technique

    DEFF Research Database (Denmark)

    Oxenløwe, Leif Katsuo; Galili, Michael; Clausen, A. T:

    2006-01-01

    A square spectrum is optically Fourier transformed into a square pulse in the time domain. This is used to demultiplex a 160 Gb/s data signal with a significant increase in jitter tolerance to 2.6 ps.

  15. Detecting nonlinear structure in time series

    International Nuclear Information System (INIS)

    Theiler, J.

    1991-01-01

    We describe an approach for evaluating the statistical significance of evidence for nonlinearity in a time series. The formal application of our method requires the careful statement of a null hypothesis which characterizes a candidate linear process, the generation of an ensemble of ''surrogate'' data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. While some data sets very cleanly exhibit low-dimensional chaos, there are many cases where the evidence is sketchy and difficult to evaluate. We hope to provide a framework within which such claims of nonlinearity can be evaluated. 5 refs., 4 figs
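    The surrogate-data recipe described above (state a linear null hypothesis, generate surrogate series consistent with it, and compare a discriminating statistic) can be sketched as follows. The Fourier-phase-randomization surrogates and the time-reversal-asymmetry statistic used here are common choices in this literature, not necessarily the exact ones used in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def phase_randomized_surrogate(x, rng):
        """Surrogate with the same power spectrum as x but randomized
        Fourier phases, i.e. consistent with a linear Gaussian null."""
        X = np.fft.rfft(x)
        phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
        phases[0] = 0.0   # keep the mean (DC) component real
        phases[-1] = 0.0  # keep the Nyquist component real (even length)
        return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

    def discriminating_statistic(x):
        """A simple statistic sensitive to nonlinearity: time-reversal
        asymmetry of first differences (zero in expectation for linear
        Gaussian processes)."""
        d = x[1:] - x[:-1]
        return np.mean(d ** 3)

    # A manifestly nonlinear test series: the logistic map.
    x = np.empty(1000)
    x[0] = 0.4
    for i in range(1, len(x)):
        x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])

    stat = discriminating_statistic(x)
    surr_stats = [discriminating_statistic(phase_randomized_surrogate(x, rng))
                  for _ in range(99)]
    # Reject the linear null if the original statistic falls outside the
    # range spanned by the 99 surrogate statistics.
    nonlinear = stat < min(surr_stats) or stat > max(surr_stats)
    ```

    Phase randomization preserves the power spectrum exactly, so any significant difference in the statistic must come from structure beyond linear correlations, which is precisely the logic the abstract outlines.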

  16. Hypothesis Testing as an Act of Rationality

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is that we have historically failed to account for a fundamental component of basic logic - namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) probabilistic logic is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.

  17. Testing competing forms of the Milankovitch hypothesis: A multivariate approach

    Science.gov (United States)

    Kaufmann, Robert K.; Juselius, Katarina

    2016-02-01

    We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical mechanisms postulated to drive glacial cycles. They show that the climate variables are driven partly by solar insolation, determining the timing and magnitude of glaciations and terminations, and partly by internal feedback dynamics, pushing the climate variables away from equilibrium. We argue that the latter is consistent with a weak form of the Milankovitch hypothesis and that it should be restated as follows: internal climate dynamics impose perturbations on glacial cycles that are driven by solar insolation. Our results show that these perturbations are likely caused by slow adjustment between land ice volume and solar insolation. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship between solar insolation and a single climate variable are likely to suffer from omitted variable bias.
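    The full cointegrated VAR used in the paper is beyond a short example, but the core intuition behind cointegration testing (two trending series share a common stochastic trend if a regression of one on the other leaves a stationary residual) can be sketched in a crude, numpy-only form. The two-step Engle-Granger-style procedure and the lag-one autocorrelation check below are simplifications for illustration, not the authors' method:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Two synthetic series sharing a common stochastic trend, so they
    # are cointegrated by construction:
    trend = np.cumsum(rng.normal(size=2000))        # random walk
    y1 = trend + rng.normal(scale=0.5, size=2000)
    y2 = 2.0 * trend + rng.normal(scale=0.5, size=2000)

    # Step 1: OLS regression of y2 on y1 (with intercept).
    A = np.column_stack([np.ones_like(y1), y1])
    beta, *_ = np.linalg.lstsq(A, y2, rcond=None)
    resid = y2 - A @ beta

    # Step 2: crude stationarity check on the residual. A stationary
    # residual mean-reverts, so its lag-1 autocorrelation stays well
    # below 1; a unit-root series has autocorrelation near 1.
    def lag1_autocorr(e):
        e = e - e.mean()
        return np.dot(e[1:], e[:-1]) / np.dot(e, e)

    rho_resid = lag1_autocorr(resid)  # well below 1: cointegrated
    rho_level = lag1_autocorr(y2)     # near 1: the level is nonstationary
    ```

    A proper analysis would replace the autocorrelation check with an augmented Dickey-Fuller test (with Engle-Granger critical values) or, as in the paper, a full Johansen-type analysis of the cointegrated VAR with exogenous insolation forcing.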

  18. The hygiene hypothesis: current perspectives and future therapies

    Directory of Open Access Journals (Sweden)

    Stiemsma LT

    2015-07-01

    Full Text Available Leah T Stiemsma,1,2 Lisa A Reynolds,3 Stuart E Turvey,1,2,4 B Brett Finlay1,3,5 1Department of Microbiology & Immunology, University of British Columbia, 2The Child and Family Research Institute, 3Michael Smith Laboratories, University of British Columbia, 4Department of Pediatrics, University of British Columbia, 5Department of Biochemistry and Molecular Biology, University of British Columbia, Vancouver, BC, Canada Abstract: Developed countries have experienced a steady increase in atopic disease and disorders of immune dysregulation since the 1980s. This increase parallels a decrease in infectious diseases within the same time period, while developing countries seem to exhibit the opposite effect, with less immune dysregulation and a higher prevalence of infectious disease. The “hygiene hypothesis”, proposed by Strachan in 1989, aimed to explain this peculiar generational rise in immune dysregulation. However, research over the past 10 years provides evidence connecting the commensal and symbiotic microbes (intestinal microbiota and parasitic helminths with immune development, expanding the hygiene hypothesis into the “microflora” and “old friends” hypotheses, respectively. There is evidence that parasitic helminths and commensal microbial organisms co-evolved with the human immune system and that these organisms are vital in promoting normal immune development. Current research supports the potential for manipulation of the bacterial intestinal microbiota to treat and even prevent immune dysregulation in the form of atopic disease and other immune-mediated disorders (namely inflammatory bowel disease and type 1 diabetes. Both human and animal model research are crucial in understanding the mechanistic links between these intestinal microbes and helminth parasites, and the human immune system. Pro-, pre-, and synbiotic, as well as treatment with live helminth and excretory/secretory helminth product therapies, are all potential

  19. Statistical hypothesis testing with SAS and R

    CERN Document Server

    Taeger, Dirk

    2014-01-01

    A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the
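    To give a flavor of what "programming the test myself" looks like: the book's examples are in SAS and R, but the same hand-coded logic is language-independent. Here is Student's two-sample t statistic with pooled (equal) variances, written from the textbook formula in plain Python as an illustrative analogue, not an excerpt from the book:

    ```python
    import math

    def two_sample_t(a, b):
        """Student's two-sample t statistic with pooled variances.

        Returns (t, degrees of freedom) for the null hypothesis that
        the two population means are equal.
        """
        na, nb = len(a), len(b)
        ma, mb = sum(a) / na, sum(b) / nb
        # Unbiased sample variances
        va = sum((x - ma) ** 2 for x in a) / (na - 1)
        vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
        # Pooled variance estimate
        sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
        t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
        return t, na + nb - 2

    t, df = two_sample_t([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])
    # For these toy samples: t = -1.0 with df = 8
    ```

    The remaining step, comparing t against the t distribution with df degrees of freedom to obtain a p-value, is exactly what the built-in procedures (PROC TTEST in SAS, t.test in R) do for you.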

  20. Confluence Model or Resource Dilution Hypothesis?

    DEFF Research Database (Denmark)

    Jæger, Mads

    Studies on family background often explain the negative effect of sibship size on educational attainment by one of two theories: the Confluence Model (CM) or the Resource Dilution Hypothesis (RDH). However, as both theories – for substantively different reasons – predict that sibship size should...... to identify a unique RDH effect on educational attainment. Using sibling data from the Wisconsin Longitudinal Study (WLS) and a random effect Instrumental Variable model, I find that in addition to having a negative effect on cognitive ability, sibship size also has a strong negative effect on educational...