WorldWideScience

Sample records for stringent pointing requirements

  1. Flight Hardware Packaging Design for Stringent EMC Radiated Emission Requirements

    Science.gov (United States)

    Lortz, Charlene L.; Huang, Chi-Chien N.; Ravich, Joshua A.; Steiner, Carl N.

    2013-01-01

    This packaging design approach can help heritage hardware meet a flight project's stringent EMC radiated emissions requirement. The approach requires only minor modifications to the hardware's chassis and mainly concentrates on its connector interfaces. The solution is to raise the surface area where the connector is mounted by a few millimeters using a pedestal, and then to wrap conductive tape from the cable backshell down to the surface-mounted connector. This design approach has been applied to JPL flight project subsystems. The EMC radiated emissions requirements for flight projects can vary from benign to mission critical. If the project's EMC requirements are stringent, the best approach would be to design an EMC control program for the project early on and implement EMC design techniques starting with the circuit board layout. This is the ideal scenario for hardware that is built from scratch. Implementation of EMC radiated emissions mitigation techniques can mature as the design progresses, with minimal impact on the design cycle. The real challenge exists for hardware that is planned to be flown following a built-to-print approach, in which heritage hardware from a past project with a different set of requirements is expected to perform satisfactorily for a new project. With acceptance of heritage, the design would already be established (the circuit board layout and components have already been pre-determined), and hence any radiated emissions mitigation techniques would only be applicable at the packaging level. The key is to take a heritage design with its known radiated emissions spectrum and repackage it, or modify its chassis design, so that it has a better chance of meeting the new project's radiated emissions requirements.

  2. Comparison of urine iodine/creatinine ratio between patients following stringent and less stringent low iodine diet for radioiodine remnant ablation of thyroid cancer

    International Nuclear Information System (INIS)

    Roh, Jee Ho; Kim, Byung Il; Ha, Ji Su; Chang, Sei Joong; Shin, Hye Young; Choi, Joon Hyuk; Kim, Do Min; Kim, Chong Soon

    2006-01-01

    A low iodine diet (LID) for 1 ∼ 2 weeks is recommended for patients undergoing radioiodine remnant ablation. However, LID education differs among centers because there is no concrete recommended LID protocol. In this investigation, we compared two representative LID protocols used at several centers in Korea using the urine iodine to creatinine ratio (urine I/Cr). From April to June 2006, patients referred to our center for radioiodine remnant ablation of thyroid cancer from several local hospitals with different LID protocols were included. We divided the patients into two groups, stringent LID for 1 week and less stringent LID for 2 weeks, and measured their urine I/Cr ratio in spot urine when the patients were admitted to the hospital. A total of 27 patients were included in this investigation (M:F = 1:26; 13 in the one-week stringent LID group; 14 in the two-week less stringent LID group). The average urine I/Cr ratio was 127.87 ± 78.52 μg/g with stringent LID for 1 week and 289.75 ± 188.24 μg/g with less stringent LID for 2 weeks; it was significantly lower in the one-week stringent LID group (p = 0.008). The number of patients whose urine I/Cr ratios were below 100 μg/g was 6 of 13 in the one-week stringent LID group and 3 of 14 in the two-week less stringent LID group. Stringent LID for 1 week resulted in a better urinary I/Cr ratio in our investigation than the other protocol. However, it still left many patients outside the adequate I/Cr range, so a more stringent protocol, such as stringent LID for 2 weeks, is expected to be more desirable.
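
    As a hedged illustration of the group comparison reported above, the following sketch contrasts two small samples of urine I/Cr ratios with a nonparametric Mann-Whitney U test. The record does not state which test produced p = 0.008, and the numbers below are hypothetical, chosen only to mimic the reported means and spreads.

    # Hypothetical urine I/Cr values in ug/g creatinine; not the study data.
    import numpy as np
    from scipy import stats

    stringent_1wk = np.array([45.0, 60, 80, 95, 110, 120, 130, 140, 150, 160, 175, 190, 210])
    less_stringent_2wk = np.array([90.0, 95, 130, 170, 210, 250, 280, 300, 330, 360, 400, 450, 520, 610])

    u_stat, p_value = stats.mannwhitneyu(stringent_1wk, less_stringent_2wk, alternative="two-sided")
    print(f"mean, 1-week stringent LID:      {stringent_1wk.mean():.1f} ug/g")
    print(f"mean, 2-week less stringent LID: {less_stringent_2wk.mean():.1f} ug/g")
    print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")

    # Fraction of each group below the 100 ug/g adequacy cut-off mentioned in the record.
    for name, grp in [("1-week stringent", stringent_1wk), ("2-week less stringent", less_stringent_2wk)]:
        print(f"{name}: {(grp < 100).sum()} of {grp.size} below 100 ug/g")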

  3. The rapidly evolving centromere-specific histone has stringent functional requirements in Arabidopsis thaliana.

    Science.gov (United States)

    Ravi, Maruthachalam; Kwong, Pak N; Menorca, Ron M G; Valencia, Joel T; Ramahi, Joseph S; Stewart, Jodi L; Tran, Robert K; Sundaresan, Venkatesan; Comai, Luca; Chan, Simon W-L

    2010-10-01

    Centromeres control chromosome inheritance in eukaryotes, yet their DNA structure and primary sequence are hypervariable. Most animals and plants have megabases of tandem repeats at their centromeres, unlike yeast with unique centromere sequences. Centromere function requires the centromere-specific histone CENH3 (CENP-A in human), which replaces histone H3 in centromeric nucleosomes. CENH3 evolves rapidly, particularly in its N-terminal tail domain. A portion of the CENH3 histone-fold domain, the CENP-A targeting domain (CATD), has been previously shown to confer kinetochore localization and centromere function when swapped into human H3. Furthermore, CENP-A in human cells can be functionally replaced by CENH3 from distantly related organisms including Saccharomyces cerevisiae. We have used cenh3-1 (a null mutant in Arabidopsis thaliana) to replace endogenous CENH3 with GFP-tagged variants. An H3.3 tail domain-CENH3 histone-fold domain chimera rescued viability of cenh3-1, but CENH3 variants lacking a tail domain were nonfunctional. In contrast to human results, H3 containing the A. thaliana CATD cannot complement cenh3-1. GFP-CENH3 from the sister species A. arenosa functionally replaces A. thaliana CENH3. GFP-CENH3 from the close relative Brassica rapa was targeted to centromeres, but did not complement cenh3-1, indicating that kinetochore localization and centromere function can be uncoupled. We conclude that CENH3 function in A. thaliana, an organism with large tandem repeat centromeres, has stringent requirements for functional complementation in mitosis.

  4. New generation of gas infrared point heaters

    Energy Technology Data Exchange (ETDEWEB)

    Schink, Damian [Pintsch Aben B.V., Dinslaken (Germany)]

    2011-11-15

    It is more than thirty years since gas infrared heating for points was introduced on the railway network of what is now Deutsche Bahn. These installations have remained in service right through to the present, with virtually no modifications. More stringent requirements as regards availability, maintainability and remote monitoring have, however, led to the development of a new system of gas infrared heating for points - truly a new generation. (orig.)

  5. Circuitry linking the Csr and stringent response global regulatory systems.

    Science.gov (United States)

    Edwards, Adrianne N; Patterson-Fortin, Laura M; Vakulskas, Christopher A; Mercante, Jeffrey W; Potrykus, Katarzyna; Vinella, Daniel; Camacho, Martha I; Fields, Joshua A; Thompson, Stuart A; Georgellis, Dimitris; Cashel, Michael; Babitzke, Paul; Romeo, Tony

    2011-06-01

    CsrA protein regulates important cellular processes by binding to target mRNAs and altering their translation and/or stability. In Escherichia coli, CsrA binds to the sRNAs CsrB and CsrC, which sequester CsrA and antagonize its activity. Here, mRNAs for relA, spoT and dksA of the stringent response system were found among 721 different transcripts that copurified with CsrA. Many of the transcripts that copurified with CsrA were previously determined to respond to ppGpp and/or DksA. We examined multiple regulatory interactions between the Csr and stringent response systems. Most importantly, DksA and ppGpp robustly activated csrB/C transcription (10-fold), while they modestly activated csrA expression. We propose that CsrA-mediated regulation is relieved during the stringent response. Gel shift assays confirmed high affinity binding of CsrA to the relA mRNA leader and weaker interactions with dksA and spoT. Reporter fusions, qRT-PCR and immunoblotting showed that CsrA repressed relA expression, and (p)ppGpp accumulation during the stringent response was enhanced in a csrA mutant. CsrA had modest to negligible effects on dksA and spoT expression. Transcription of dksA was negatively autoregulated via a feedback loop that tended to mask CsrA effects. We propose that the Csr system fine-tunes the stringent response and discuss biological implications of the composite circuitry. © Published 2011. This article is a US Government work and is in the public domain in the USA.

  6. Is ionizing radiation regulated more stringently than chemical carcinogens

    International Nuclear Information System (INIS)

    Travis, C.C.; Pack, S.R.; Hattemer-Frey, H.A.

    1989-01-01

    It is widely believed that United States government agencies regulate exposure to ionizing radiation more stringently than exposure to chemical carcinogens. It is difficult to verify this perception, however, because chemical carcinogens and ionizing radiation are regulated using vastly different strategies. Chemical carcinogens are generally regulated individually. Regulators consider the risk of exposure to one chemical rather than the cumulative radiation exposure from all sources. Moreover, standards for chemical carcinogens are generally set in terms of quantities released or resultant environmental concentrations, while standards for ionizing radiation are set in terms of dose to the human body. Since chemicals and ionizing radiation cannot be compared on the basis of equal dose to the exposed individual, standards regulating chemicals and ionizing radiation cannot be compared directly. It is feasible, however, to compare the two sets of standards on the basis of equal risk to the exposed individual, assuming that standards for chemicals and ionizing radiation are equivalent if estimated risk levels are equitable. This paper compares risk levels associated with current standards for ionizing radiation and chemical carcinogens. The authors do not attempt to determine whether either type of risk is regulated too stringently or not stringently enough but endeavor only to ascertain if ionizing radiation is actually regulated more strictly than chemical carcinogens.

  7. Orphan Toxin OrtT (YdcX) of Escherichia coli Reduces Growth during the Stringent Response

    Science.gov (United States)

    2015-01-29

    The antimicrobials trimethoprim and sulfamethoxazole induce the stringent response by inhibiting tetrahydrofolate synthesis. Despite difficulties in determining their physiological roles, toxin-antitoxin (TA) systems are clearly phage inhibition systems.

  8. Stringent DDI-based prediction of H. sapiens-M. tuberculosis H37Rv protein-protein interactions.

    Science.gov (United States)

    Zhou, Hufeng; Rezaei, Javad; Hugo, Willy; Gao, Shangzhi; Jin, Jingjing; Fan, Mengyuan; Yong, Chern-Han; Wozniak, Michal; Wong, Limsoon

    2013-01-01

    H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data are very important for illuminating the infection mechanism of M. tuberculosis H37Rv, but such data are currently very scarce. This seriously limits the study of the interaction between this important pathogen and its host, H. sapiens. Computational prediction of H. sapiens-M. tuberculosis H37Rv PPIs is an important strategy to fill this gap. Domain-domain interaction (DDI) based prediction is one of the frequently used computational approaches for predicting both intra-species and inter-species PPIs. However, the performance of DDI-based host-pathogen PPI prediction has been rather limited. We develop a stringent DDI-based prediction approach with emphasis on (i) differences between the specific domain sequences on annotated regions of proteins under the same domain ID and (ii) calculation of the interaction strength of predicted PPIs based on the interacting residues in their interaction interfaces. We compare our stringent DDI-based approach to a conventional DDI-based approach for predicting PPIs, based on gold-standard intra-species PPIs and coherent informative Gene Ontology term assessment. The assessment results show that our stringent DDI-based approach achieves much better performance in predicting PPIs than the conventional approach. Using our stringent DDI-based approach, we have predicted a small set of reliable H. sapiens-M. tuberculosis H37Rv PPIs which could be very useful for a variety of related studies. We also analyze the H. sapiens-M. tuberculosis H37Rv PPIs predicted by our stringent DDI-based approach using cellular compartment distribution analysis, functional category enrichment analysis and pathway enrichment analysis. The analyses support the validity of our prediction result. Also, based on an analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent DDI-based approach, we have discovered some

  9. Achieving stringent climate targets. An analysis of the role of transport and variable renewable energies using energy-economy-climate models

    Energy Technology Data Exchange (ETDEWEB)

    Pietzcker, Robert Carl

    2014-07-01

    Anthropogenic climate change is threatening the welfare of mankind. Accordingly, policy makers have repeatedly stated the goal of slowing climate change and limiting the increase of global mean temperature to less than 2 °C above pre-industrial times (the so-called "two degree target"). Stabilizing the temperature requires drastic reductions of greenhouse gas (GHG) emissions to nearly zero. As the global system of energy supply currently relies on fossil fuels, reducing GHG emissions can only be achieved through a full-scale transformation of the energy system. This thesis investigates the economic requirements and implications of different scenarios that achieve stringent climate mitigation targets. It starts with the analysis of characteristic decarbonization patterns and identifies two particularly relevant aspects of mitigation scenarios: deployment of variable renewable energies (VRE) and decarbonization of the transport sector. After investigating these fields in detail, we turned towards one of the most relevant questions for policy makers and analyzed the trade-off between the stringency of a climate target and its economic requirements and implications. All analyses are based on the improvement, application, comparison, and discussion of large-scale IAMs. The novel "mitigation share" metric allowed us to identify the relevance of specific technology groups for mitigation and to improve our understanding of the decarbonization patterns of different energy subsectors. It turned out that the power sector is decarbonized first and reaches lowest emissions, while the transport sector is slowest to decarbonize. For the power sector, non-biomass renewable energies contribute most to emission reductions, while the transport sector strongly relies on liquid fuels and therefore requires biomass in combination with carbon capture and sequestration (CCS) to reduce emissions. An in-depth investigation of the solar power

  10. Electricity versus hydrogen for passenger cars under stringent climate change control

    NARCIS (Netherlands)

    Rösler, H.; van der Zwaan, B.; Keppo, I.; Bruggink, J.

    2014-01-01

    In this article we analyze how passenger car transportation in Europe may change this century under permanent high oil prices and stringent climate control policy. We focus on electricity and hydrogen as principal candidate energy carriers, because these two options are increasingly believed to

  11. The replacement gag vibration monitoring system for Hinkley Point 'B' power station

    International Nuclear Information System (INIS)

    Bagwell, T.; Morrish, M.F.G.

    1985-01-01

    The original computerised system for monitoring the vibration of gags in each reactor channel of the Hinkley Point 'B' AGR Power Station did not meet the specification for a more stringent safety requirement. This paper describes the replacement of that original single processor system with an enhanced dual processor/multiple scanner computer system used to satisfy this new safety and reliability need. The specification and installation of the new hardware and software are discussed, and some of the problems encountered and their solutions are highlighted. (author)

  12. On the Required Number of Antennas in a Point-to-Point Large-but-Finite MIMO System

    KAUST Repository

    Makki, Behrooz; Svensson, Tommy; Eriksson, Thomas; Alouini, Mohamed-Slim

    2015-01-01

    In this paper, we investigate the performance of point-to-point multiple-input-multiple-output (MIMO) systems in the presence of a large but finite number of antennas at the transmitters and/or receivers. Considering the cases with and without hybrid automatic repeat request (HARQ) feedback, we determine the minimum number of transmit/receive antennas required to satisfy different outage probability constraints. We study the effect of the spatial correlation between the antennas on the system performance. Also, the required number of antennas is obtained for different fading conditions. Our results show that different outage requirements can be satisfied with relatively few transmit/receive antennas. © 2015 IEEE.
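
    The record reports minimum antenna counts that satisfy outage probability constraints. The sketch below is a minimal Monte Carlo illustration of that idea for an open-loop i.i.d. Rayleigh MIMO link without HARQ; the rate, SNR and outage target are assumptions for the example, not values taken from the paper.

    import numpy as np

    def outage_prob(n_t, n_r, snr_db, rate_bps_hz, trials=20000, seed=0):
        """Estimate Pr[log2 det(I + (SNR/n_t) H H^H) < rate] over i.i.d. Rayleigh channels."""
        rng = np.random.default_rng(seed)
        snr = 10.0 ** (snr_db / 10.0)
        outages = 0
        for _ in range(trials):
            h = (rng.standard_normal((n_r, n_t)) + 1j * rng.standard_normal((n_r, n_t))) / np.sqrt(2.0)
            capacity = np.log2(np.linalg.det(np.eye(n_r) + (snr / n_t) * h @ h.conj().T).real)
            outages += capacity < rate_bps_hz
        return outages / trials

    # Smallest symmetric antenna count meeting an assumed 1% outage target at 10 dB SNR and 4 b/s/Hz.
    for n in range(1, 9):
        p_out = outage_prob(n, n, snr_db=10.0, rate_bps_hz=4.0)
        print(f"N_t = N_r = {n}: estimated outage = {p_out:.4f}")
        if p_out <= 0.01:
            print(f"smallest antenna count meeting the target: {n}")
            break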

  13. On the Required Number of Antennas in a Point-to-Point Large-but-Finite MIMO System

    KAUST Repository

    Makki, Behrooz

    2015-11-12

    In this paper, we investigate the performance of point-to-point multiple-input-multiple-output (MIMO) systems in the presence of a large but finite number of antennas at the transmitters and/or receivers. Considering the cases with and without hybrid automatic repeat request (HARQ) feedback, we determine the minimum number of transmit/receive antennas required to satisfy different outage probability constraints. We study the effect of the spatial correlation between the antennas on the system performance. Also, the required number of antennas is obtained for different fading conditions. Our results show that different outage requirements can be satisfied with relatively few transmit/receive antennas. © 2015 IEEE.

  14. Structural characterization of the stringent response related exopolyphosphatase/guanosine pentaphosphate phosphohydrolase protein family

    DEFF Research Database (Denmark)

    Kristensen, Ole; Laurberg, Martin; Liljas, Anders

    2004-01-01

    Exopolyphosphatase/guanosine pentaphosphate phosphohydrolase (PPX/GPPA) enzymes play central roles in the bacterial stringent response induced by starvation. The high-resolution crystal structure of the putative Aquifex aeolicus PPX/GPPA phosphatase from the actin-like ATPase domain superfamily has...

  15. Synthetic Peptides to Target Stringent Response-Controlled Virulence in a Pseudomonas aeruginosa Murine Cutaneous Infection Model

    Directory of Open Access Journals (Sweden)

    Daniel Pletzer

    2017-09-01

    Microorganisms continuously monitor their surroundings and adaptively respond to environmental cues. One way to cope with various stress-related situations is through the activation of the stringent stress response pathway. In Pseudomonas aeruginosa this pathway is controlled and coordinated by the activity of the RelA and SpoT enzymes that metabolize the small nucleotide secondary messenger molecule (p)ppGpp. Intracellular ppGpp concentrations are crucial in mediating adaptive responses and virulence. Targeting this cellular stress response has recently been the focus of an alternative approach to fight antibiotic-resistant bacteria. Here, we examined the role of the stringent response in the virulence of P. aeruginosa PAO1 and the Liverpool epidemic strain LESB58. A ΔrelA/ΔspoT double mutant showed decreased cytotoxicity toward human epithelial cells, exhibited reduced hemolytic activity, and caused down-regulation of the expression of the alkaline protease gene aprA in stringent response mutants grown on blood agar plates. Promoter fusions of relA or spoT to a bioluminescence reporter gene revealed that both genes were expressed during the formation of cutaneous abscesses in mice. Intriguingly, virulence was attenuated in vivo in the ΔrelA/ΔspoT double mutant, but not in the relA mutant nor in the ΔrelA/ΔspoT mutant complemented with either gene. Treatment of a cutaneous P. aeruginosa PAO1 infection with anti-biofilm peptides increased animal welfare, decreased dermonecrotic lesion sizes, and reduced bacterial numbers recovered from abscesses, resembling the phenotype of the ΔrelA/ΔspoT infection. It was previously demonstrated by our lab that ppGpp could be targeted by synthetic peptides; here we demonstrated that spoT promoter activity was suppressed during cutaneous abscess formation by treatment with peptides DJK-5 and 1018, and that a peptide-treated relA complemented stringent response double mutant strain exhibited reduced peptide

  16. Whole-Genome Microarray and Gene Deletion Studies Reveal Regulation of the Polyhydroxyalkanoate Production Cycle by the Stringent Response in Ralstonia eutropha H16

    Energy Technology Data Exchange (ETDEWEB)

    Brigham, CJ; Speth, DR; Rha, C; Sinskey, AJ

    2012-10-22

    Poly(3-hydroxybutyrate) (PHB) production and mobilization in Ralstonia eutropha are well studied, but in only a few instances has PHB production been explored in relation to other cellular processes. We examined the global gene expression of wild-type R. eutropha throughout the PHB cycle: growth on fructose, PHB production using fructose following ammonium depletion, and PHB utilization in the absence of exogenous carbon after ammonium was resupplied. Our results confirm or lend support to previously reported results regarding the expression of PHB-related genes and enzymes. Additionally, genes for many different cellular processes, such as DNA replication, cell division, and translation, are selectively repressed during PHB production. In contrast, the expression levels of genes under the control of the alternative sigma factor sigma(54) increase sharply during PHB production and are repressed again during PHB utilization. Global gene regulation during PHB production is strongly reminiscent of the gene expression pattern observed during the stringent response in other species. Furthermore, a ppGpp synthase deletion mutant did not show an accumulation of PHB, and the chemical induction of the stringent response with DL-norvaline caused an increased accumulation of PHB in the presence of ammonium. These results indicate that the stringent response is required for PHB accumulation in R. eutropha, helping to elucidate a thus-far-unknown physiological basis for this process.

  17. Ten Year Study of the Stringently Defined Otitis Prone Child in Rochester, NY

    Science.gov (United States)

    Pichichero, Michael E.

    2016-01-01

    This review summarizes a prospective, longitudinal 10-year study in Rochester, NY, in which virtually every clinically diagnosed acute otitis media (AOM) episode was confirmed by bacterial culture of middle ear fluid. Children experiencing 3 episodes within 6 months or 4 episodes in 12 months were considered stringently defined otitis prone (sOP). We found that stringent diagnosis, compared with clinical diagnosis, reduced the frequency of children meeting the OP definition from 27% to 6%, resulting in 14.8% and 2.4% receiving tympanostomy tubes, respectively. RSV infection led to AOM significantly more often in sOP than in non-otitis-prone (NOP) children, which correlated with diminished total RSV-specific serum IgG. sOP children produced low levels of antibody to Streptococcus pneumoniae and Haemophilus influenzae candidate vaccine protein antigens and to routine pediatric vaccines. sOP children generated significantly fewer memory B cells and functional and memory T cells to otopathogens following NP colonization and AOM than NOP children, and they had defects in antigen-presenting cells. PMID:27273691

  18. Stringent homology-based prediction of H. sapiens-M. tuberculosis H37Rv protein-protein interactions.

    Science.gov (United States)

    Zhou, Hufeng; Gao, Shangzhi; Nguyen, Nam Ninh; Fan, Mengyuan; Jin, Jingjing; Liu, Bing; Zhao, Liang; Xiong, Geng; Tan, Min; Li, Shijun; Wong, Limsoon

    2014-04-08

    H. sapiens-M. tuberculosis H37Rv protein-protein interaction (PPI) data are essential for understanding the infection mechanism of the formidable pathogen M. tuberculosis H37Rv. Computational prediction is an important strategy to fill the gap in experimental H. sapiens-M. tuberculosis H37Rv PPI data. Homology-based prediction is frequently used in predicting both intra-species and inter-species PPIs. However, some limitations are not properly resolved in several published works that predict eukaryote-prokaryote inter-species PPIs using intra-species template PPIs. We develop a stringent homology-based prediction approach by taking into account (i) differences between eukaryotic and prokaryotic proteins and (ii) differences between inter-species and intra-species PPI interfaces. We compare our stringent homology-based approach to a conventional homology-based approach for predicting host-pathogen PPIs, based on cellular compartment distribution analysis, disease gene list enrichment analysis, pathway enrichment analysis and functional category enrichment analysis. These analyses support the validity of our prediction result, and clearly show that our approach has better performance in predicting H. sapiens-M. tuberculosis H37Rv PPIs. Using our stringent homology-based approach, we have predicted a set of highly plausible H. sapiens-M. tuberculosis H37Rv PPIs which might be useful for many related studies. Based on our analysis of the H. sapiens-M. tuberculosis H37Rv PPI network predicted by our stringent homology-based approach, we have discovered several interesting properties which are reported here for the first time. We find that both host proteins and pathogen proteins involved in the host-pathogen PPIs tend to be hubs in their own intra-species PPI network. Also, both host and pathogen proteins involved in host-pathogen PPIs tend to have longer primary sequences, tend to have more domains, tend to be more hydrophilic, etc. And the protein domains from both

  19. Dual Regulation of Bacillus subtilis kinB Gene Encoding a Sporulation Trigger by SinR through Transcription Repression and Positive Stringent Transcription Control.

    Science.gov (United States)

    Fujita, Yasutaro; Ogura, Mitsuo; Nii, Satomi; Hirooka, Kazutake

    2017-01-01

    It is known that transcription of kinB encoding a trigger for Bacillus subtilis sporulation is under repression by SinR, a master repressor of biofilm formation, and under positive stringent transcription control depending on the adenine species at the transcription initiation nucleotide (nt). Deletion and base substitution analyses of the kinB promoter (PkinB) region using lacZ fusions indicated that either a 5-nt deletion (Δ5, nt -61/-57, +1 is the transcription initiation nt) or the substitution of G at nt -45 with A (G-45A) relieved kinB repression. Thus, we found a pair of SinR-binding consensus sequences (GTTCTYT; Y is T or C) in an inverted orientation (SinR-1) between nt -57/-42, which is most likely a SinR-binding site for kinB repression. This relief from SinR repression likely requires SinI, an antagonist of SinR. Surprisingly, we found that SinR is essential for positive stringent transcription control of PkinB. Electrophoretic mobility shift assay (EMSA) analysis indicated that SinR bound not only to SinR-1 but also to SinR-2 (nt -29/-8) consisting of another pair of SinR consensus sequences in a tandem repeat arrangement; the two sequences partially overlap the '-35' and '-10' regions of PkinB. Introduction of base substitutions (T-27C C-26T) in the upstream consensus sequence of SinR-2 affected positive stringent transcription control of PkinB, suggesting that SinR binding to SinR-2 likely causes this positive control. EMSA also implied that RNA polymerase and SinR are possibly bound together to SinR-2 to form a transcription initiation complex for kinB transcription. Thus, it was suggested in this work that derepression of kinB from SinR repression by SinI induced by Spo0A∼P and occurrence of SinR-dependent positive stringent transcription control of kinB might induce effective sporulation cooperatively, implying an intimate interplay among the stringent response, sporulation, and biofilm formation.

  20. The Stringent Response Induced by Phosphate Limitation Promotes Purine Salvage in Agrobacterium fabrum.

    Science.gov (United States)

    Sivapragasam, Smitha; Deochand, Dinesh K; Meariman, Jacob K; Grove, Anne

    2017-10-31

    Agrobacterium fabrum induces tumor growth in susceptible plant species. The upregulation of virulence genes that occurs when the bacterium senses plant-derived compounds is enhanced by acidic pH and limiting inorganic phosphate. Nutrient starvation may also trigger the stringent response, and purine salvage is among the pathways expected to be favored under such conditions. We show here that phosphate limitation induces the stringent response, as evidenced by production of (p)ppGpp, and that the xdhCSML operon encoding the purine salvage enzyme xanthine dehydrogenase is upregulated ∼15-fold. The xdhCSML operon is under control of the TetR family transcription factor XdhR; direct binding of ppGpp to XdhR attenuates DNA binding, and the enhanced xdhCSML expression correlates with increased cellular levels of (p)ppGpp. Xanthine dehydrogenase may also divert purines away from salvage pathways to form urate, the ligand for the transcription factor PecS, which in the plant pathogen Dickeya dadantii is a key regulator of virulence gene expression. However, urate levels remain low under conditions that produce increased levels of xdhCSML expression, and neither acidic pH nor limiting phosphate results in induction of genes under control of PecS. Instead, expression of such genes is induced only by externally supplemented urate. Taken together, our data indicate that purine salvage is favored during the stringent response induced by phosphate starvation, suggesting that control of this pathway may constitute a novel approach to modulating virulence. Because bacterial purine catabolism appears to be unaffected, as evidenced by the absence of urate accumulation, we further propose that the PecS regulon is induced by only host-derived urate.

  1. Stringent or nonstringent complete remission and prognosis in acute myeloid leukemia

    DEFF Research Database (Denmark)

    Øvlisen, Andreas K; Oest, Anders; Bendtsen, Mette D

    2018-01-01

    Stringent complete remission (sCR) of acute myeloid leukemia is defined as normal hematopoiesis after therapy. Less stringent CR, including non-sCR, was introduced as insufficient blood platelet, neutrophil, or erythrocyte recovery. These latter characteristics were defined retrospectively as postremission transfusion dependency and were suggested to be of prognostic value. In the present report, we evaluated the prognostic impact of achieving sCR and non-sCR in the Danish National Acute Leukaemia Registry, including 769 patients registered with classical CR (ie,

  2. Vibration isolation and dual-stage actuation pointing system for space precision payloads

    Science.gov (United States)

    Kong, Yongfang; Huang, Hai

    2018-02-01

    Pointing and stability requirements for future space missions are becoming more and more stringent. This work follows the pointing control method which consists of a traditional spacecraft attitude control system and a payload active pointing loop, further proposing a vibration isolation and dual-stage actuation pointing system for space precision payloads based on a soft Stewart platform. Central to the concept is using the dual-stage actuator instead of the traditional voice coil motor single-stage actuator to improve the payload active pointing capability. Based on a specified payload, the corresponding platform was designed to be installed between the spacecraft bus and the payload. The performance of the proposed system is demonstrated by preliminary closed-loop control investigations in simulations. With the ordinary spacecraft bus, the line-of-sight pointing accuracy can be controlled to below a few milliarcseconds in tip and tilt. Meanwhile, utilizing the voice coil motor with the softening spring in parallel, which is a portion of the dual-stage actuator, the system effectively achieves low-frequency motion transmission and high-frequency vibration isolation along the other four degree-of-freedom directions.

  3. Detecting Chemical Weapons: Threats, Requirements, Solutions, and Future Challenges

    Science.gov (United States)

    Boso, Brian

    2011-03-01

    Although chemicals have reportedly been used as weapons for thousands of years, it was not until 1915 at Ypres, France, that an industrial chemical, chlorine, was used in World War I as an offensive weapon in significant quantity, causing mass casualties. From that point until today, the development, detection, production of and protection from chemical weapons has been an organized endeavor of many of the world's armed forces and, in more recent times, non-governmental terrorist organizations. The number of Chemical Warfare Agents (CWAs) has steadily increased as research into more toxic substances continued for most of the 20th century. Today there are over 70 substances, including harassing agents like tear gas, incapacitating agents, and lethal agents like blister, blood, choking, and nerve agents. The requirements for detecting chemical weapons vary depending on the context in which they are encountered and the concept of operation of the organization deploying the detection equipment. The US DoD, for example, has as a requirement that US forces be able to continue their mission even in the event of a chemical attack. This places stringent requirements on detection equipment: it must be lightweight, and a variety of sensing technologies have been developed for this application, including, but not limited to, mass spectroscopy, IR spectroscopy, Raman spectroscopy, MEMS micro-cantilever sensors, surface acoustic wave sensors, differential mobility spectrometry, and amplifying fluorescent polymers. In the future, the requirements for detection equipment will continue to become even more stringent. The continuing increase in the sheer number of threats that will need to be detected, the development of binary agents requiring that even the precursor chemicals be detected, the development of new types of agents unlike any of the current chemistries, and the expansion of the list of toxic industrial chemicals will require new techniques with higher specificity and more sensitivity.

  4. Augmenting the Genetic Toolbox for Sulfolobus islandicus with a Stringent Positive Selectable Marker for Agmatine Prototrophy

    Science.gov (United States)

    Cooper, Tara E.; Krause, David J.

    2013-01-01

    Sulfolobus species have become the model organisms for studying the unique biology of the crenarchaeal division of the archaeal domain. In particular, Sulfolobus islandicus provides a powerful opportunity to explore natural variation via experimental functional genomics. To support these efforts, we further expanded genetic tools for S. islandicus by developing a stringent positive selection for agmatine prototrophs in strains in which the argD gene, encoding arginine decarboxylase, has been deleted. Strains with deletions in argD were shown to be auxotrophic for agmatine even in nutrient-rich medium, but growth could be restored by either supplementation of exogenous agmatine or reintroduction of a functional copy of the argD gene from S. solfataricus P2 into the ΔargD host. Using this stringent selection, a robust targeted gene knockout system was established via an improved next generation of the MID (marker insertion and unmarked target gene deletion) method. Application of this novel system was validated by targeted knockout of the upsEF genes involved in UV-inducible cell aggregation formation. PMID:23835176

  5. Air Quality and Health Benefits of China's Recent Stringent Environmental Policy

    Science.gov (United States)

    Zheng, Y.; Xue, T.; Zhang, Q.; Geng, G.; He, K.

    2016-12-01

    Aggressive emission control measures were taken by China's central and local governments after the promulgation of the "Air Pollution Prevention and Control Action Plan" in 2013. We evaluated the air quality and health benefits of this most stringent air pollution control policy to date during 2013-2015 by utilizing a two-stage data fusion model and newly developed cause-specific integrated exposure-response (IER) functions developed for the Global Burden of Disease (GBD) study. The two-stage data fusion model predicts spatiotemporally continuous PM2.5 (particulate matter with aerodynamic diameter less than 2.5 µm) concentrations by integrating satellite-derived aerosol optical depth (AOD) measurements, PM2.5 concentrations from measurements and an air quality model, and other ancillary information. Over the years analyzed, PM2.5 concentrations dropped significantly on national average and over heavily polluted regions, as identified by Mann-Kendall analysis. The national PM2.5-attributable mortality decreased by 72.8 (95% CI: 59.4, 85.2) thousand (6%), from 1.23 (95% CI: 1.06, 1.39) million in 2013 to 1.15 (95% CI: 0.98, 1.31) million in 2015, owing to a considerable reduction (18%) of population-weighted PM2.5 from 61.4 to 50.5 µg/m3. Sensitivity tests estimated that meteorological variations between 2013 and 2015 raised PM2.5 levels by 0.24 µg/m3 and national mortality by 2.1 (95% CI: 1.6, 2.6) thousand, which implies the dominant role of anthropogenic impacts on PM2.5 abatement and attributable mortality reduction. Our study affirms the effectiveness of China's recent air quality policy; however, due to the possibly supralinear shape of concentration-response functions, the health benefits induced by air quality improvement in these years are limited. We therefore call for continuous implementation of current policies and further stringent measures from both air quality improvement and public health protection perspectives.
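
    The abstract names Mann-Kendall analysis as the trend test applied to the PM2.5 data. The sketch below is a generic implementation of that test (no tie correction, normal approximation) applied to a short hypothetical PM2.5 series; it is not the authors' code and the numbers are invented for illustration.

    import numpy as np
    from scipy.stats import norm

    def mann_kendall(series):
        """Mann-Kendall trend test: S statistic, Z score and two-sided p-value (no ties assumed)."""
        x = np.asarray(series, dtype=float)
        n = x.size
        s = 0.0
        for i in range(n - 1):
            s += np.sign(x[i + 1:] - x[i]).sum()       # sum of pairwise signs
        var_s = n * (n - 1) * (2 * n + 5) / 18.0       # variance of S under the null hypothesis
        if s > 0:
            z = (s - 1) / np.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / np.sqrt(var_s)
        else:
            z = 0.0
        p = 2.0 * (1.0 - norm.cdf(abs(z)))
        return s, z, p

    # Hypothetical seasonal-mean PM2.5 (ug/m3) over 2013-2015; a negative S indicates a downward trend.
    pm25 = [68, 66, 63, 64, 60, 58, 57, 55, 52, 50]
    s, z, p = mann_kendall(pm25)
    print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f}")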

  6. On the Required Number of Antennas in a Point-to-Point Large-but-Finite MIMO System: Outage-Limited Scenario

    KAUST Repository

    Makki, Behrooz

    2016-03-22

    This paper investigates the performance of point-to-point multiple-input-multiple-output (MIMO) systems in the presence of a large but finite number of antennas at the transmitters and/or receivers. Considering the cases with and without hybrid automatic repeat request (HARQ) feedback, we determine the minimum number of transmit/receive antennas required to satisfy different outage probability constraints. Our results are obtained for different fading conditions, and the effect of the power amplifiers' efficiency and the feedback error probability on the performance of the MIMO-HARQ systems is analyzed. Then, we use some recent results on the achievable rates of finite-block-length codes to analyze the effect of the codeword length on the system performance. Moreover, we derive closed-form expressions for the asymptotic performance of the MIMO-HARQ systems when the number of antennas increases. Our analytical and numerical results show that different outage requirements can be satisfied with relatively few transmit/receive antennas. © 1972-2012 IEEE.

  7. Pointing stability of Hinode and requirements for the next Solar mission Solar-C

    Science.gov (United States)

    Katsukawa, Y.; Masada, Y.; Shimizu, T.; Sakai, S.; Ichimoto, K.

    2017-11-01

    It is essential to achieve fine pointing stability in a space mission aiming for high-resolution observations. For the future Japanese solar mission SOLAR-C, a successor of the HINODE (SOLAR-B) mission, we set targets of angular resolution better than 0.1 arcsec in visible light and better than 0.2 - 0.5 arcsec in EUV and X-rays. These resolutions are two to five times better than those of the corresponding instruments onboard HINODE. To identify critical items for achieving the pointing stability requirements of SOLAR-C, we assessed the in-flight pointing stability performance of HINODE, which achieved the highest pointing stability among Japanese space missions. We found that one of the critical items to be improved for SOLAR-C is the attitude stability near the upper limit of the frequency range of the attitude control system. A stability of 0.1 arcsec (3σ) is required for the EUV and X-ray telescopes of SOLAR-C, while the HINODE performance is slightly worse than this requirement. The visible light telescope of HINODE is equipped with an image stabilization system inside the telescope, which achieved a stability of 0.03 arcsec (3σ) by suppressing the attitude jitter in the frequency range below 10 Hz. For further improvement, disturbances induced by resonances of the telescope structures and by momentum wheels and mechanical gyros in the frequency range above 100 Hz must be suppressed.

  8. Induction of a stringent metabolic response in intracellular stages of Leishmania mexicana leads to increased dependence on mitochondrial metabolism.

    Directory of Open Access Journals (Sweden)

    Eleanor C Saunders

    2014-01-01

    Leishmania parasites alternate between extracellular promastigote stages in the insect vector and an obligate intracellular amastigote stage that proliferates within the phagolysosomal compartment of macrophages in the mammalian host. Most enzymes involved in Leishmania central carbon metabolism are constitutively expressed, and stage-specific changes in energy metabolism remain poorly defined. Using 13C-stable isotope resolved metabolomics and 2H2O labelling, we show that amastigote differentiation is associated with reduction in growth rate and induction of a distinct stringent metabolic state. This state is characterized by a global decrease in the uptake and utilization of glucose and amino acids, a reduced secretion of organic acids and increased fatty acid β-oxidation. Isotopomer analysis showed that catabolism of hexose and fatty acids provides C4 dicarboxylic acids (succinate/malate) and acetyl-CoA for the synthesis of glutamate via a compartmentalized mitochondrial tricarboxylic acid (TCA) cycle. In vitro cultivated and intracellular amastigotes are acutely sensitive to inhibitors of mitochondrial aconitase and glutamine synthetase, indicating that these anabolic pathways are essential for intracellular growth and virulence. Lesion-derived amastigotes exhibit a similar metabolism to in vitro differentiated amastigotes, indicating that this stringent response is coupled to differentiation signals rather than exogenous nutrient levels. Induction of a stringent metabolic response may facilitate amastigote survival in a nutrient-poor intracellular niche and underlie the increased dependence of this stage on hexose and mitochondrial metabolism.

  9. Health information needs of professional nurses required at the point of care

    Directory of Open Access Journals (Sweden)

    Esmeralda Ricks

    2015-06-01

    Conclusion: This study has enabled the researcher to identify the information needs required by professional nurses at the point of care to enhance the delivery of patient care. The research results were used to develop a mobile library that could be accessed by professional nurses.

  10. Environmental tipping points significantly affect the cost-benefit assessment of climate policies

    Science.gov (United States)

    Cai, Yongyang; Judd, Kenneth L.; Lenton, Timothy M.; Lontzek, Thomas S.; Narita, Daiju

    2015-01-01

    Most current cost-benefit analyses of climate change policies suggest an optimal global climate policy that is significantly less stringent than the level required to meet the internationally agreed 2 °C target. This is partly because the sum of estimated economic damage of climate change across various sectors, such as energy use and changes in agricultural production, results in only a small economic loss or even a small economic gain in the gross world product under predicted levels of climate change. However, those cost-benefit analyses rarely take account of environmental tipping points leading to abrupt and irreversible impacts on market and nonmarket goods and services, including those provided by the climate and by ecosystems. Here we show that including environmental tipping point impacts in a stochastic dynamic integrated assessment model profoundly alters cost-benefit assessment of global climate policy. The risk of a tipping point, even if it only has nonmarket impacts, could substantially increase the present optimal carbon tax. For example, a risk of only 5% loss in nonmarket goods that occurs with a 5% annual probability at 4 °C increase of the global surface temperature causes an immediate two-thirds increase in optimal carbon tax. If the tipping point also has a 5% impact on market goods, the optimal carbon tax increases by more than a factor of 3. Hence existing cost-benefit assessments of global climate policy may be significantly underestimating the needs for controlling climate change. PMID:25825719

  11. Environmental tipping points significantly affect the cost-benefit assessment of climate policies.

    Science.gov (United States)

    Cai, Yongyang; Judd, Kenneth L; Lenton, Timothy M; Lontzek, Thomas S; Narita, Daiju

    2015-04-14

    Most current cost-benefit analyses of climate change policies suggest an optimal global climate policy that is significantly less stringent than the level required to meet the internationally agreed 2 °C target. This is partly because the sum of estimated economic damage of climate change across various sectors, such as energy use and changes in agricultural production, results in only a small economic loss or even a small economic gain in the gross world product under predicted levels of climate change. However, those cost-benefit analyses rarely take account of environmental tipping points leading to abrupt and irreversible impacts on market and nonmarket goods and services, including those provided by the climate and by ecosystems. Here we show that including environmental tipping point impacts in a stochastic dynamic integrated assessment model profoundly alters cost-benefit assessment of global climate policy. The risk of a tipping point, even if it only has nonmarket impacts, could substantially increase the present optimal carbon tax. For example, a risk of only 5% loss in nonmarket goods that occurs with a 5% annual probability at 4 °C increase of the global surface temperature causes an immediate two-thirds increase in optimal carbon tax. If the tipping point also has a 5% impact on market goods, the optimal carbon tax increases by more than a factor of 3. Hence existing cost-benefit assessments of global climate policy may be significantly underestimating the needs for controlling climate change.

  12. Waste management from reprocessing: stringent regulatory requirements for high quality conditioned residues

    International Nuclear Information System (INIS)

    Bordier, J. C.; Greneche, D.; Devezeaux, J. G.; Dalcorso, J.

    2000-01-01

    Nuclear waste production and management in France is governed by safety requirements imposed on all operators. French nuclear safety relies on two basic principles: the responsibility of the nuclear operator, which extends to the waste generated, and basic safety objectives issued by the national Safety Authority. For a long time the regulatory framework for waste production and management has been satisfactorily applied and has benefited each actor in the process. LLW/MLW and HLW nuclear waste are currently conditioned in safe matrices or packages, either suitable for disposal in surface repositories or designed to be disposed of underground, according to their radioactive content. France is also looking into the case of VLLW and has already carried out a design for a future disposal facility, which is in the pipeline. Other types of waste (i.e. radium-bearing waste, graphite, and tritium-containing waste) are also considered in the overall framework of French waste management. (author)

  13. Stringent constraints on the dark matter annihilation cross section from subhalo searches with the Fermi Gamma-Ray Space Telescope

    Energy Technology Data Exchange (ETDEWEB)

    Berlin, Asher; Hooper, Dan

    2014-01-01

    The dark matter halo of the Milky Way is predicted to contain a very large number of smaller subhalos. As a result of the dark matter annihilations taking place within such objects, the most nearby and massive subhalos could appear as point-like or spatially extended gamma-ray sources, without observable counterparts at other wavelengths. In this paper, we use the results of the Aquarius simulation to predict the distribution of nearby subhalos, and compare this to the characteristics of the unidentified gamma-ray sources observed by the Fermi Gamma-Ray Space Telescope. Focusing on the brightest high latitude sources, we use this comparison to derive limits on the dark matter annihilation cross section. For dark matter particles lighter than ~200 GeV, the resulting limits are the strongest obtained to date, being modestly more stringent than those derived from observations of dwarf galaxies or the Galactic Center. We also derive independent limits based on the lack of unidentified gamma-ray sources with discernible spatial extension, but these limits are a factor of ~2-10 weaker than those based on point-like subhalos. Lastly, we note that four of the ten brightest high-latitude sources exhibit a similar spectral shape, consistent with 30-60 GeV dark matter particles annihilating to b quarks with an annihilation cross section on the order of sigma v ~ (5-10) x 10^-27 cm^3/s, or 8-10 GeV dark matter particles annihilating to taus with sigma v ~ (2.0-2.5) x 10^-27 cm^3/s.

  14. Health information needs of professional nurses required at the point of care.

    Science.gov (United States)

    Ricks, Esmeralda; ten Ham, Wilma

    2015-06-11

    Professional nurses work in dynamic environments and need to keep up to date with relevant information for practice in nursing to render quality patient care. Keeping up to date with current information is often challenging because of heavy workload, diverse information needs and the accessibility of the required information at the point of care. The aim of the study was to explore and describe the information needs of professional nurses at the point of care in order to make recommendations to stakeholders to develop a mobile library accessible by means of smart phones when needed. The researcher utilised a quantitative, descriptive survey design to conduct this study. The target population comprised 757 professional nurses employed at a state hospital. Simple random sampling was used to select a sample of the wards, units and departments for inclusion in the study. A convenience sample of 250 participants was selected. Two hundred and fifty structured self-administered questionnaires were distributed amongst the participants. Descriptive statistics were used to analyse the data. A total of 136 completed questionnaires were returned. The findings highlighted the types and accessible sources of information. Information needs of professional nurses were identified such as: extremely drug-resistant tuberculosis, multi-drug-resistant tuberculosis, HIV, antiretrovirals and all chronic lifestyle diseases. This study has enabled the researcher to identify the information needs required by professional nurses at the point of care to enhance the delivery of patient care. The research results were used to develop a mobile library that could be accessed by professional nurses.

  15. Stringent limits on the ionized mass loss from A and F dwarfs

    International Nuclear Information System (INIS)

    Brown, A.; Veale, A.; Judge, P.; Bookbinder, J.A.; Hubeny, I.

    1990-01-01

    Following the suggestion of Willson et al. (1987) that A- and F-type main-sequence stars might undergo significant mass loss due to pulsationally driven winds, upper limits to the ionized mass loss from A and F dwarfs have been obtained using VLA observations. These stringent upper limits show that the level of ionized mass loss would have at most only a small effect on stellar evolution. Radiative-equilibrium atmospheric and wind models for early A dwarfs indicate that it is highly likely that a wind flowing from such stars would be significantly ionized. In addition, late A and early F dwarfs exhibit chromospheric emission indicative of significant nonradiative heating. The present mass-loss limits are thus representative of the total mass-loss rates for these stars. It is concluded that A and F dwarfs are not losing sufficient mass to cause A dwarfs to evolve into G dwarfs. 24 refs

  16. Instrument air dew point requirements -- 108-P, L, K

    International Nuclear Information System (INIS)

    Fairchild, P.N.

    1994-01-01

    The 108 Building dew point analyzers measure dew point at atmospheric pressure. Existing 108 Roundsheets state that the maximum dew point temperature shall be less than -50 F. After repeatedly failing to maintain a -50 F dew point temperature, Reactor Engineering researched the basis for the existing limit. This report documents the results of the study and provides technical justification for a new maximum dew point temperature of -35 F at atmospheric pressure, as read by the 108 Building dew point analyzers.
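
    The record stresses that the -35 F limit applies to dew point as read at atmospheric pressure. A calculation commonly used to support this kind of limit is converting an atmospheric-pressure dew point to the equivalent dew point at line pressure, assuming a constant water mixing ratio and a Magnus-type saturation curve. The sketch below illustrates that conversion only; it is not taken from the report, and the 700 kPa line pressure is an assumption for the example.

    import math

    # Magnus approximation for saturation vapor pressure over water (hPa), temperature in deg C.
    A, B, C = 6.112, 17.62, 243.12

    def sat_vapor_pressure_hpa(t_c):
        return A * math.exp(B * t_c / (C + t_c))

    def dew_point_from_vapor_pressure(e_hpa):
        # Inverse of the Magnus formula.
        ln_ratio = math.log(e_hpa / A)
        return C * ln_ratio / (B - ln_ratio)

    def dew_point_at_pressure(td_atm_c, p_line_kpa, p_atm_kpa=101.325):
        """Dew point at line pressure for the same water mixing ratio measured at atmospheric pressure."""
        e_atm = sat_vapor_pressure_hpa(td_atm_c)
        e_line = e_atm * (p_line_kpa / p_atm_kpa)   # water partial pressure scales with total pressure
        return dew_point_from_vapor_pressure(e_line)

    td_atm_c = (-35.0 - 32.0) * 5.0 / 9.0           # -35 F converted to deg C
    td_line_c = dew_point_at_pressure(td_atm_c, p_line_kpa=700.0)
    print(f"dew point at atmospheric pressure: {td_atm_c:.1f} C (-35 F)")
    print(f"equivalent dew point at 700 kPa:   {td_line_c:.1f} C ({td_line_c * 9 / 5 + 32:.0f} F)")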

  17. City-specific vehicle emission control strategies to achieve stringent emission reduction targets in China's Yangtze River Delta region.

    Science.gov (United States)

    Zhang, Shaojun; Wu, Ye; Zhao, Bin; Wu, Xiaomeng; Shu, Jiawei; Hao, Jiming

    2017-01-01

    The Yangtze River Delta (YRD) region is one of the most prosperous and densely populated regions in China and is facing tremendous pressure to mitigate vehicle emissions and improve air quality. Our assessment has revealed that mitigating vehicle emissions of NOx would be more difficult than reducing the emissions of other major vehicular pollutants (e.g., CO, HC and PM2.5) in the YRD region. Even in Shanghai, where the emission controls implemented are more stringent than in Jiangsu and Zhejiang, we observed little to no reduction in NOx emissions from 2000 to 2010. Emission-reduction targets for HC, NOx and PM2.5 are determined using a response surface modeling tool for better air quality. We design city-specific emission control strategies for three vehicle-populated cities in the YRD region: Shanghai, and Nanjing and Wuxi in Jiangsu. Our results indicate that even if stringent emission controls consisting of the Euro 6/VI standards, limits on vehicle population and usage, and the scrappage of older vehicles are applied, Nanjing and Wuxi will not be able to meet the NOx emission target by 2020. Therefore, additional control measures are proposed for Nanjing and Wuxi to further mitigate NOx emissions from heavy-duty diesel vehicles. Copyright © 2016. Published by Elsevier B.V.

  18. 40 CFR 63.1583 - What are the emission points and control requirements for an industrial POTW treatment plant?

    Science.gov (United States)

    2010-07-01

    ... control requirements for an industrial POTW treatment plant? 63.1583 Section 63.1583 Protection of... Pollutants: Publicly Owned Treatment Works Industrial Potw Treatment Plant Description and Requirements § 63.1583 What are the emission points and control requirements for an industrial POTW treatment plant? (a...

  19. Ribosome•RelA structures reveal the mechanism of stringent response activation

    Science.gov (United States)

    Loveland, Anna B; Bah, Eugene; Madireddy, Rohini; Zhang, Ying; Brilot, Axel F; Grigorieff, Nikolaus; Korostelev, Andrei A

    2016-01-01

    The stringent response is a conserved bacterial stress response underlying virulence and antibiotic resistance. RelA/SpoT-homolog proteins synthesize the transcriptional modulators (p)ppGpp, allowing bacteria to adapt to stress. RelA is activated during amino-acid starvation, when cognate deacyl-tRNA binds to the ribosomal A (aminoacyl-tRNA) site. We report four cryo-EM structures of E. coli RelA bound to the 70S ribosome, in the absence and presence of deacyl-tRNA accommodating in the 30S A site. The boomerang-shaped RelA with a wingspan of more than 100 Å wraps around the A/R (30S A-site/RelA-bound) tRNA. The CCA end of the A/R tRNA pins the central TGS domain against the 30S subunit, presenting the (p)ppGpp-synthetase domain near the 30S spur. The ribosome and A/R tRNA are captured in three conformations, revealing hitherto elusive states of tRNA engagement with the ribosomal decoding center. Decoding-center rearrangements are coupled with the step-wise 30S-subunit 'closure', providing insights into the dynamics of high-fidelity tRNA decoding. DOI: http://dx.doi.org/10.7554/eLife.17029.001 PMID:27434674

  20. 40 CFR 63.1586 - What are the emission points and control requirements for a non-industrial POTW treatment plant?

    Science.gov (United States)

    2010-07-01

    ... control requirements for a non-industrial POTW treatment plant? 63.1586 Section 63.1586 Protection of... Pollutants: Publicly Owned Treatment Works Non-Industrial POTW Treatment Plant Requirements § 63.1586 What are the emission points and control requirements for a non-industrial POTW treatment plant? There are...

  1. DTU-ESA millimeter-wave validation standard antenna – requirements and design

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Breinbjerg, Olav

    2014-01-01

    ... from a validation campaign is achieved when a dedicated Validation Standard (VAST) antenna, specifically designed for this purpose, is available. The driving requirements for VAST antennas are their mechanical stability with respect to any orientation of the antenna in the gravity field and thermal ... are briefly reviewed and the baseline design is described. The emphasis is given to the definition of the requirements for the mechanical and thermal stability of the antenna, which satisfy the stringent stability requirement for the mm-VAST electrical characteristics.

  2. Latency in Visionic Systems: Test Methods and Requirements

    Science.gov (United States)

    Bailey, Randall E.; Arthur, J. J., III; Williams, Steven P.; Kramer, Lynda J.

    2005-01-01

    A visionics device creates a pictorial representation of the external scene for the pilot. The ultimate objective of these systems may be to electronically generate a form of Visual Meteorological Conditions (VMC) to eliminate weather or time-of-day as an operational constraint and provide enhancement over actual visual conditions where eye-limiting resolution may be a limiting factor. Empirical evidence has shown that the total system delays or latencies, including those of the imaging sensors and display systems, can critically degrade their utility, usability, and acceptability. Definitions and measurement techniques are offered herein as common test and evaluation methods for latency testing in visionics device applications. Based upon available data, very different latency requirements are indicated depending upon the piloting task, the role in which the visionics device is used in this task, and the characteristics of the visionics cockpit display device, including its resolution, field-of-regard, and field-of-view. The least stringent latency requirements will involve Head-Up Display (HUD) applications, where the visionics imagery provides situational information as a supplement to symbology guidance and command information. Conversely, the visionics system latency requirement for a large field-of-view Head-Worn Display application, providing a Virtual-VMC capability from which the pilot will derive visual guidance, will be the most stringent, having a value as low as 20 msec.
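
    To make the pass/fail comparison concrete, a minimal sketch is given below; the 20 msec budget is taken from the abstract, while the timestamp capture method and sample values are illustrative assumptions only.

        # Hypothetical check of measured stimulus-to-display latencies against the
        # 20 msec budget quoted above for a Virtual-VMC head-worn application.
        stimulus_ms = [0.0, 50.0, 100.0, 150.0, 200.0]   # stimulus event times (e.g., photodiode trigger)
        display_ms  = [18.2, 69.5, 118.9, 171.0, 219.4]  # corresponding display update times

        latencies = [d - s for s, d in zip(stimulus_ms, display_ms)]
        mean_latency = sum(latencies) / len(latencies)
        worst_case = max(latencies)

        print(f"mean latency = {mean_latency:.1f} ms, worst case = {worst_case:.1f} ms")
        print("meets 20 ms budget" if worst_case <= 20.0 else "exceeds 20 ms budget")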

  3. An ultrabright and monochromatic electron point source made of a LaB6 nanowire

    Science.gov (United States)

    Zhang, Han; Tang, Jie; Yuan, Jinshi; Yamauchi, Yasushi; Suzuki, Taku T.; Shinya, Norio; Nakajima, Kiyomi; Qin, Lu-Chang

    2016-03-01

    Electron sources in the form of one-dimensional nanotubes and nanowires are an essential tool for investigations in a variety of fields, such as X-ray computed tomography, flexible displays, chemical sensors and electron optics applications. However, field emission instability and the need to work under high-vacuum or high-temperature conditions have imposed stringent requirements that are currently limiting the range of application of electron sources. Here we report the fabrication of a LaB6 nanowire with only a few La atoms bonded on the tip that emits collimated electrons from a single point with high monochromaticity. The nanostructured tip has a low work function of 2.07 eV (lower than that of Cs) while remaining chemically inert, two properties usually regarded as mutually exclusive. Installed in a scanning electron microscope (SEM) field emission gun, our tip shows a current density gain that is about 1,000 times greater than that achievable with W(310) tips, and no emission decay for tens of hours of operation. Using this new SEM, we acquired very low-noise, high-resolution images together with rapid chemical compositional mapping using a tip operated at room temperature and at 10-times higher residual gas pressure than that required for W tips.

  4. Insulated hsp70B' promoter: stringent heat-inducible activity in replication-deficient, but not replication-competent adenoviruses.

    Science.gov (United States)

    Rohmer, Stanimira; Mainka, Astrid; Knippertz, Ilka; Hesse, Andrea; Nettelbeck, Dirk M

    2008-04-01

    Key to the realization of gene therapy is the development of efficient and targeted gene transfer vectors. Therapeutic gene transfer by replication-deficient or more recently by conditionally replication-competent/oncolytic adenoviruses has shown much promise. For specific applications, however, it will be advantageous to provide vectors that allow for external control of gene expression. The efficient cellular heat shock system in combination with available technology for focused and controlled hyperthermia suggests heat-regulated transcription control as a promising tool for this purpose. We investigated the feasibility of a short fragment of the human hsp70B' promoter, with and without upstream insulator elements, for the regulation of transgene expression by replication-deficient or oncolytic adenoviruses. Two novel adenoviral vectors with an insulated hsp70B' promoter were developed and showed stringent heat-inducible gene expression with induction ratios up to 8000-fold. In contrast, regulation of gene expression from the hsp70B' promoter without insulation was suboptimal. In replication-competent/oncolytic adenoviruses regulation of the hsp70B' promoter was lost specifically during late replication in permissive cells and could not be restored by the insulators. We developed novel adenovirus gene transfer vectors that feature improved and stringent regulation of transgene expression from the hsp70B' promoter using promoter insulation. These vectors have potential for gene therapy applications that benefit from external modulation of therapeutic gene expression or for combination therapy with hyperthermia. Furthermore, our study reveals that vector replication can deregulate inserted cellular promoters, an observation which is of relevance for the development of replication-competent/oncolytic gene transfer vectors. (c) 2008 John Wiley & Sons, Ltd.

  5. Does dishonesty really invite third-party punishment? Results of a more stringent test.

    Science.gov (United States)

    Konishi, Naoki; Ohtsubo, Yohsuke

    2015-05-01

    Many experiments have demonstrated that people are willing to incur cost to punish norm violators even when they are not directly harmed by the violation. Such altruistic third-party punishment is often considered an evolutionary underpinning of large-scale human cooperation. However, some scholars argue that previously demonstrated altruistic third-party punishment against fairness-norm violations may be an experimental artefact. For example, envy-driven retaliatory behaviour (i.e. spite) towards better-off unfair game players may be misidentified as altruistic punishment. Indeed, a recent experiment demonstrated that participants ceased to inflict third-party punishment against an unfair player once a series of key methodological problems were systematically controlled for. Noticing that a previous finding regarding apparently altruistic third-party punishment against honesty-norm violations may have been subject to methodological issues, we used a different and what we consider to be a more sound design to evaluate these findings. Third-party punishment against dishonest players withstood this more stringent test. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  6. Reactor core modeling practice: Operational requirements, model characteristics, and model validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1997-01-01

    The physical models implemented in power plant simulators have greatly increased in performance and complexity in recent years. This process has been enabled by the ever increasing computing power available at affordable prices. This paper describes this process from several angles: first, the operational requirements which are most critical from the point of view of model performance, both for normal and off-normal operating conditions; second, core model characteristics in the light of the solutions implemented by Thomson Training and Simulation (TT and S) in several full-scope simulators recently built and delivered for Dutch, German, and French nuclear power plants; and finally, the model validation procedures, which are of course an integral part of model development, and which are becoming more and more severe as performance expectations increase. As a conclusion, it may be asserted that in the core modeling field, as in other areas, the general improvement in the quality of simulation codes has resulted in a fairly rapid convergence towards mainstream engineering-grade calculations. This is remarkable performance in view of the stringent real-time requirements which the simulation codes must satisfy as well as the extremely wide range of operating conditions that they are called upon to cover with good accuracy. (author)

  7. On-site meteorological instrumentation requirements to characterize diffusion from point sources: workshop report. Final report Sep 79-Sep 80

    International Nuclear Information System (INIS)

    Strimaitis, D.; Hoffnagle, G.; Bass, A.

    1981-04-01

    Results of a workshop entitled 'On-Site Meteorological Instrumentation Requirements to Characterize Diffusion from Point Sources' are summarized and reported. The workshop was sponsored by the U.S. Environmental Protection Agency in Raleigh, North Carolina, on January 15-17, 1980. Its purpose was to provide EPA with a thorough examination of the meteorological instrumentation and data collection requirements needed to characterize airborne dispersion of air contaminants from point sources and to recommend, based on an expert consensus, specific measurement techniques and accuracies. Secondary purposes of the workshop were to (1) make recommendations to the National Weather Service (NWS) about collecting and archiving meteorological data that would best support air quality dispersion modeling objectives and (2) make recommendations on standardization of meteorological data reporting and quality assurance programs.

  8. Testing to fulfill HACCP (Hazard Analysis Critical Control Points) requirements: principles and examples.

    Science.gov (United States)

    Gardner, I A

    1997-12-01

    On-farm HACCP (hazard analysis critical control points) monitoring requires cost-effective, yet accurate and reproducible tests that can determine the status of cows, milk, and the dairy environment. Tests need to be field-validated, and their limitations need to be established so that appropriate screening strategies can be initiated and test results can be rationally interpreted. For infections and residues of low prevalence, tests or testing strategies that are highly specific help to minimize false-positive results and excessive costs to the dairy industry. The determination of the numbers of samples to be tested in HACCP monitoring programs depends on the specific purpose of the test and the likely prevalence of the agent or residue at the critical control point. The absence of positive samples from a herd test should not be interpreted as freedom from a particular agent or residue unless the entire herd has been tested with a test that is 100% sensitive. The current lack of field-validated tests for most of the chemical and infectious agents of concern makes it difficult to ensure that the stated goals of HACCP programs are consistently achieved.
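
    The sample-size point can be illustrated with a standard detection-of-presence calculation; this is a sketch only, and the binomial rule and example prevalences are assumptions rather than values from the abstract (it presumes a perfectly sensitive test and a large herd).

        import math

        def samples_needed(prevalence, confidence=0.95):
            # Number of animals to sample to find at least one positive with the
            # stated confidence, given the true prevalence and a 100%-sensitive test.
            return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

        print(samples_needed(0.05))   # ~59 samples for a 5% prevalence at 95% confidence
        print(samples_needed(0.01))   # ~299 samples for a 1% prevalence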

  9. Guide for reviewing safety analysis reports for packaging: Review of quality assurance requirements

    International Nuclear Information System (INIS)

    Moon, D.W.

    1988-10-01

    This review section describes quality assurance requirements applying to design, purchase, fabrication, handling, shipping, storing, cleaning, assembly, inspection, testing, operation, maintenance, repair, and modification of components of packaging which are important to safety. The design effort, operations plans, and quality assurance requirements should be integrated to achieve a system in which the independent QA program is not overly stringent and the application of QA requirements is commensurate with safety significance. The reviewer must verify that the applicant's QA section in the SARP contains the package-specific QA information required by DOE Orders and federal regulations that demonstrates compliance. 8 refs

  10. Status of the U.S. nuclear option, conditions leading to its resurgence, and current licensing requirements

    International Nuclear Information System (INIS)

    Ioannidi, J.

    2007-01-01

    The projected increase in electricity demand, increased concern over emissions along with more stringent emission requirements, volatility of the gas and oil supplies and prices, and the convergence of favourable conditions and legislation make nuclear power a practical option for meeting future electricity base-load demands. (author)

  11. User requirements Massive Point Clouds for eSciences (WP1)

    NARCIS (Netherlands)

    Suijker, P.M.; Alkemade, I.; Kodde, M.P.; Nonhebel, A.E.

    2014-01-01

    This report is a milestone in work package 1 (WP1) of the project Massive point clouds for eSciences. In WP1 the basic functionalities needed for a new Point Cloud Spatial Database Management System are identified. This is achieved by (1) literature research, (2) discussions with the project

  12. Hinkley Point 'C' power station public inquiry: proof of evidence on the need for Hinkley Point 'C' to help meet capacity requirement and the non-fossil-fuel proportion economically

    International Nuclear Information System (INIS)

    Jenkin, F.P.

    1988-09-01

    A public inquiry has been set up to examine the planning application made by the Central Electricity Generating Board (CEGB) for the construction of a 1200 MW Pressurized Water Reactor power station at Hinkley Point (Hinkley Point 'C') in the United Kingdom. The purpose of this evidence to the Inquiry is to show why there is a need now to go ahead with the construction of the Hinkley Point 'C' generating station to help meet the non-fossil-fuel proportion of generation economically and also to help meet the future generating capacity requirement. The CEGB submits that it is appropriate to compare Hinkley Point 'C' with other non-fossil-fuel alternatives on various bases. Those dealt with by this proof of evidence are as follows: i) ability to contribute to capacity need and to assist the distribution companies in meeting their duty to supply electricity; ii) ability to contribute to the non-fossil-fuel proportion; iii) relative economic merit. (author)

  13. The role of point-of-care assessment of platelet function in predicting postoperative bleeding and transfusion requirements after coronary artery bypass grafting.

    Science.gov (United States)

    Mishra, Pankaj Kumar; Thekkudan, Joyce; Sahajanandan, Raj; Gravenor, Mike; Lakshmanan, Suresh; Fayaz, Khazi Mohammed; Luckraz, Heyman

    2015-01-01

    OBJECTIVE: Platelet function assessment after cardiac surgery can predict postoperative blood loss, guide transfusion requirements and discriminate the need for surgical re-exploration. We conducted this study to assess the predictive value of point-of-care platelet function testing using the Multiplate® device. Patients undergoing isolated coronary artery bypass grafting were prospectively recruited (n = 84). Group A (n = 42) patients were on anti-platelet therapy until surgery; patients in Group B (n = 42) stopped anti-platelet treatment at least 5 days preoperatively. Multiplate® and thromboelastography (TEG) tests were performed in the perioperative period. The primary end-point was excessive bleeding (>2.5 ml/kg/h) within the first 3 h postoperatively. Secondary end-points included transfusion requirements, re-exploration rates, and intensive care unit and in-hospital stays. Patients in Group A had excessive bleeding (59% vs. 33%, P = 0.02) and higher re-exploration rates (14% vs. 0%, P [...]). Platelet function testing was the most significant predictor of excessive bleeding (odds ratio [OR]: 2.3, P = 0.08) and of the need for blood (OR: 5.5, P [...]). Platelet functional assessment with Multiplate® was the strongest predictor for bleeding and transfusion requirements in patients on anti-platelet therapy until the time of surgery.

  14. The design of visible system for improving the measurement accuracy of imaging points

    Science.gov (United States)

    Shan, Qiu-sha; Li, Gang; Zeng, Luan; Liu, Kai; Yan, Pei-pei; Duan, Jing; Jiang, Kai

    2018-02-01

    Binocular stereoscopic measurement technology has wide application in robot vision and 3D measurement. Measurement precision is a very important factor; in 3D coordinate measurement in particular, high measurement accuracy places stringent demands on the distortion of the optical system. In order to improve the measurement accuracy of the imaging points and to reduce their distortion, the optical system must satisfy an extra-low distortion requirement of less than 0.1%; a transmissive visible-band optical lens was therefore designed with a telecentric beam path in image space, adopting the imaging model of binocular stereo vision and imaging the drone at a finite distance. The optical system adopts a complex double-Gauss structure, with the stop placed at the focal plane of the rear group, which puts the system exit pupil at infinity and realizes the telecentric beam path in image space. The main optical parameters of the system are as follows: the spectral range is the visible waveband, the effective focal length is f' = 30 mm, the relative aperture is 1/3, and the field of view is 21°. The final design results show that the RMS value of the spot diagram of the optical lens at the maximum field of view is 2.3 μm, which is less than one pixel (3.45 μm); the distortion is less than 0.1%, so the system has the advantage of extra-low distortion and avoids subsequent image-distortion correction; the modulation transfer function of the optical lens is 0.58 (@145 lp/mm), so the imaging quality of the system is close to the diffraction limit; and the system has a simple structure and satisfies the required optical specifications. Ultimately, measurement of the drone at a finite distance was achieved based on the imaging model of binocular stereo vision.
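
    A quick geometric cross-check of the quoted parameters is sketched below; it assumes a simple pinhole relation and that 21° is the full field of view, and the implied sensor size is an inference for illustration, not a value given in the abstract.

        import math

        f_mm = 30.0          # effective focal length from the abstract
        full_fov_deg = 21.0  # full field of view from the abstract
        pixel_um = 3.45      # pixel pitch from the abstract

        half_image_mm = f_mm * math.tan(math.radians(full_fov_deg / 2.0))
        print(f"implied image half-size ~ {half_image_mm:.2f} mm")                   # ~5.56 mm
        print(f"~ {2000.0 * half_image_mm / pixel_um:.0f} pixels across the field")  # ~3224 pixels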

  15. Design Tools for Cost-Effective Implementation of Planetary Protection Requirements

    Science.gov (United States)

    Hamlin, Louise; Belz, Andrea; Evans, Michael; Kastner, Jason; Satter, Celeste; Spry, Andy

    2006-01-01

    Since the Viking missions to Mars in the 1970s, accounting for the costs associated with planetary protection implementation has not been done systematically during early project formulation phases, leading to unanticipated costs during subsequent implementation phases of flight projects. The simultaneous development of more stringent planetary protection requirements, resulting from new knowledge about the limits of life on Earth, together with current plans to conduct life-detection experiments on a number of different solar system target bodies motivates a systematic approach to integrating planetary protection requirements and mission design. A current development effort at NASA's Jet Propulsion Laboratory is aimed at integrating planetary protection requirements more fully into the early phases of mission architecture formulation and at developing tools to more rigorously predict associated cost and schedule impacts of architecture options chosen to meet planetary protection requirements.

  16. A miniaturized, high frequency mechanical scanner for high speed atomic force microscope using suspension on dynamically determined points

    Energy Technology Data Exchange (ETDEWEB)

    Herfst, Rodolf; Dekker, Bert; Witvoet, Gert; Crowcombe, Will; Lange, Dorus de [Department of Optomechatronics, Netherlands Organization for Applied Scientific Research, TNO, Delft (Netherlands); Sadeghian, Hamed, E-mail: hamed.sadeghianmarnani@tno.nl, E-mail: h.sadeghianmarnani@tudelft.nl [Department of Optomechatronics, Netherlands Organization for Applied Scientific Research, TNO, Delft (Netherlands); Department of Precision and Microsystems Engineering, Delft University of Technology, Delft (Netherlands)

    2015-11-15

    One of the major limitations on the speed of the atomic force microscope (AFM) is the bandwidth of the mechanical scanning stage, especially in the vertical (z) direction. According to the design principles of “light and stiff” and “static determinacy,” the bandwidth of the mechanical scanner is limited by the first eigenfrequency of the AFM head in the case of tip scanning and by the sample stage in the case of sample scanning. Due to the stringent requirements of the system, simply pushing the first eigenfrequency to an ever higher value has reached its limits. We have developed a miniaturized, high speed AFM scanner in which the dynamics of the z-scanning stage are made insensitive to its surrounding dynamics via suspension on specific, dynamically determined points. This resulted in a mechanical bandwidth as high as that of the z-actuator (50 kHz) while remaining insensitive to the dynamics of its base and surroundings. The scanner allows a practical z scan range of 2.1 μm. We have demonstrated the applicability of the scanner to the high speed scanning of nanostructures.

  17. What are the essential competencies required of a midwife at the point of registration?

    Science.gov (United States)

    Butler, Michelle M; Fraser, Diane M; Murphy, Roger J L

    2008-09-01

    to identify the essential competencies required of a midwife at the point of registration. qualitative, descriptive, extended case study and depth interviews. pre-registration midwifery education in England. 39 qualifying midwives, their assessors, midwives and midwife teachers across six higher education institutions, and 20 experienced midwives at two sites. essential competencies were identified relating to (1) being a safe practitioner; (2) having the right attitude; and (3) being an effective communicator. In order to be a safe practitioner, it was proposed that a midwife must have a reasonable degree of self-sufficiency, use up-to-date knowledge in practice, and have self and professional awareness. It was suggested that having the right attitude involves being motivated, being committed to midwifery and being caring and kind. Participants highlighted the importance of effective communication so that midwives can relate to and work in partnership with women and provide truly informed choice. Essential communication skills include active listening, providing appropriate information and flexibility. the most important requirement at registration is that a midwife is safe and will practise safely. However, this capability to be safe is further mediated by attitudes and communication skills. models of midwifery competence should always include personal attributes and effective communication in addition to the competencies required to be able to practise safely, and there should be an explicit focus in curriculum content, skills training and assessment on attitudes and communication.

  18. The Dangers of Aestheticism in Schooling.

    Science.gov (United States)

    Meager, Ruby

    1981-01-01

    Prompted by Immanuel Kant's analysis of the nature and operations of the imagination in his "Critique of the Aesthetical Judgment," this article points out the danger of encouraging imagination-borne aesthetical judgments and explanatory hypotheses. Concludes that understanding requires submission to more stringent standards of objectivity and to…

  19. Waste management system requirements document

    International Nuclear Information System (INIS)

    1991-02-01

    This volume defines the top level requirements for the Mined Geologic Disposal System (MGDS). It is designed to be used in conjunction with Volume 1 of the WMSR, General System Requirements. It provides a functional description expanding the requirements allocated to the MGDS in Volume 1 and elaborates on each requirement by providing associated performance criteria as appropriate. Volumes 1 and 4 of the WMSR provide a minimum set of requirements that must be satisfied by the final MGDS design. This document sets forth specific requirements that must be fulfilled. It is not the intent or purpose of this top level document to describe how each requirement is to be satisfied in the final MGDS design. Each subsequent level of the technical document hierarchy must provide further guidance and definition as to how each of these requirements is to be implemented in the design. It is expected that each subsequent level of requirements will be significantly more detailed. Section 2 of this volume provides a functional description of the MGDS. Each function is addressed in terms of requirements and performance criteria. Section 3 provides a list of controlling documents. Each document cited in a requirement of Chapter 2 is included in this list and is incorporated into this document as a requirement on the final system. The WMSR addresses only federal requirements (i.e., laws, regulations and DOE orders). State and local requirements are not addressed. However, it is specifically noted against the potentially affected WMSR requirements that additional or more stringent regulations could be imposed by a state or local requirement or administering agency over the cited federal requirements.

  20. The implementation of modern digital technology in x-ray medical diagnosis in Republic of Moldova - a stringent necessity

    International Nuclear Information System (INIS)

    Rosca, Andrei

    2011-01-01

    The study includes analyses of the current technical state of the radiodiagnostic equipment of the public medico-sanitary institutions of the Ministry of Health of the Republic of Moldova (IMSP MS RM). The traditional radiodiagnostic apparatuses were obsolete and physically worn out in 96.6% of cases (93.5% in regional MSPIs), including dental units at 92.0% (97.2% in raional MSPIs), X-ray examination units at 100%, and mobile units at 84.1%. The continued operation of traditional radiodiagnostic apparatuses with a high degree of physical wear and obsolescence substantially diminishes the quality of the examinations, creates conditions for diagnostic errors, and increases the collective ionizing radiation exposure of the population. In recent years, the provision of digital radiodiagnostic equipment to the IMSP MS RM has begun. This process is unfolding with great difficulty because of the severe socio-economic crisis in the Republic of Moldova. Despite these obstacles, equipping the IMSP MS RM with digital equipment represents a stringent necessity and a requirement of the times.

  1. Rapid species responses to changes in climate require stringent climate protection targets

    NARCIS (Netherlands)

    Vliet, van A.J.H.; Leemans, R.

    2006-01-01

    The Avoiding Dangerous Climate Change book consolidates the scientific findings of the Exeter conference and gives an account of the most recent developments on critical thresholds and key vulnerabilities of the climate system, impacts on human and natural systems, emission pathways and

  2. A Point Source of a Different Color: Identifying a Gap in United States Regulatory Policy for “Green” CSO Treatment Using Constructed Wetlands

    Directory of Open Access Journals (Sweden)

    Zeno F. Levy

    2014-04-01

    Up to 850 billion gallons of untreated combined sewer overflow (CSO) is discharged into waters of the United States each year. Recent changes in CSO management policy support green infrastructure (GI) technologies as “front of the pipe” approaches to discharge mitigation by detention/reduction of urban stormwater runoff. Constructed wetlands for CSO treatment have been considered among suites of GI solutions. However, these wetlands differ fundamentally from other GI technologies in that they are “end of the pipe” treatment systems that discharge from a point source, and are therefore regulated in the U.S. under the National Pollution Discharge Elimination System (NPDES). We use a comparative regulatory analysis to examine the U.S. policy framework for CSO treatment wetlands. We find in all cases that permitting authorities have used best professional judgment to determine effluent limits and compliance monitoring requirements, referencing technology and water quality-based standards originally developed for traditional “grey” treatment systems. A qualitative comparison with Europe shows less stringent regulatory requirements, perhaps due to institutionalized design parameters. We recommend that permitting authorities develop technical guidance documents for evaluation of “green” CSO treatment systems that account for their unique operational concerns and benefits with respect to sustainable development.

  3. Reliability issues of free-space communications systems and networks

    Science.gov (United States)

    Willebrand, Heinz A.

    2003-04-01

    Free space optics (FSO) is a high-speed point-to-point connectivity solution traditionally used in the enterprise campus networking market for building-to-building LAN connectivity. However, more recently some wireline and wireless carriers have started to deploy FSO systems in their networks. The requirements on FSO system reliability, meaning both system availability and component reliability, are far more stringent in the carrier market when compared to the requirements in the enterprise market segment. This paper tries to outline some of the aspects that are important to ensure carrier-class system reliability.
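
    As an illustration of what "carrier class" implies numerically, a minimal availability calculation is sketched below; the MTBF and MTTR figures are assumptions for illustration, not values from the paper.

        # Availability from mean time between failures (MTBF) and mean time to repair (MTTR).
        mtbf_h = 50_000.0   # assumed MTBF, hours
        mttr_h = 4.0        # assumed MTTR, hours

        availability = mtbf_h / (mtbf_h + mttr_h)
        downtime_min_per_year = (1.0 - availability) * 365 * 24 * 60
        print(f"availability = {availability:.5f}")               # ~0.99992
        print(f"downtime ~ {downtime_min_per_year:.0f} min/year") # ~42 min/year

    Carrier networks commonly target 99.999% availability ("five nines"), i.e. roughly five minutes of downtime per year, which is why reliability requirements in that segment are so much tighter than for enterprise links.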

  4. Use of simple transport equations to estimate waste package performance requirements

    International Nuclear Information System (INIS)

    Wood, B.J.

    1982-01-01

    A method of developing waste package performance requirements for specific nuclides is described. The method is based on: Federal regulations concerning permissible concentrations in solution at the point of discharge to the accessible environment; a simple and conservative transport model; baseline and potential worst-case release scenarios. Use of the transport model enables calculation of maximum permissible release rates within a repository in basalt for each of the scenarios. The maximum permissible release rates correspond to performance requirements for the engineered barrier system. The repository was assumed to be constructed in a basalt layer. For the cases considered, including a well drilled into an aquifer 1750 m from the repository center, little significant advantage is obtained from a 1000-yr as opposed to a 100-yr waste package. A 1000-yr waste package is of importance only for nuclides with half-lives much less than 100 yr which travel to the accessible environment in much less than 1000 yr. Such short travel times are extremely unlikely for a mined repository. Among the actinides, the most stringent maximum permissible release rates are for 236U and 234U. A simple solubility calculation suggests, however, that these performance requirements can be readily met by the engineered barrier system. Under the reducing conditions likely to occur in a repository located in basalt, uranium would be sufficiently insoluble that no solution could contain more than about 0.01% of the maximum permissible concentration at saturation. The performance requirements derived from the one-dimensional modeling approach are conservative by at least one to two orders of magnitude. More quantitative three-dimensional modeling at specific sites should enable relaxation of the performance criteria derived in this study. 12 references, 8 figures, 8 tables.
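
    The half-life argument can be illustrated with a simple decay calculation; this is a sketch only, the half-lives below are illustrative choices rather than nuclides analyzed in the report, and transport is ignored.

        import math

        def fraction_surviving(half_life_yr, delay_yr):
            # Fraction of an inventory remaining after being held up for delay_yr years.
            return math.exp(-math.log(2.0) * delay_yr / half_life_yr)

        for half_life in (30.0, 2.4e5):          # a ~30-yr fission product vs. a long-lived actinide
            for package_life in (100.0, 1000.0):
                f = fraction_surviving(half_life, package_life)
                print(f"T1/2 = {half_life:g} yr, package = {package_life:g} yr -> {f:.2e} remaining")

    For the short-lived nuclide the extra 900 years of containment removes essentially the entire inventory, whereas for the long-lived nuclide the difference between a 100-yr and a 1000-yr package is negligible, which is the point made above.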

  5. Developing the Cleanliness Requirements for an Organic-detection Instrument MOMA-MS

    Science.gov (United States)

    Perry, Radford; Canham, John; Lalime, Erin

    2015-01-01

    The cleanliness requirements for an organic-detection instrument, like the Mars Organic Molecule Analyzer Mass Spectrometer (MOMA-MS), on a Planetary Protection Class IVb mission can be extremely stringent. These include surface molecular and particulate cleanliness, outgassing, and bioburden. The prime contractor for the European Space Agency's ExoMars 2018 project, Thales Alenia Space Italy, provided requirements based on a standard, conservative approach of defining limits, which yielded levels that are unverifiable by standard cleanliness verification methods. Additionally, the conservative method for determining contamination surface area uses underestimation, while the conservative bioburden surface area relies on overestimation, which results in inconsistencies in the normalized reporting. This presentation will provide a survey of the challenge to define requirements that can be reasonably verified and still remain appropriate to the core science of the ExoMars mission.

  6. Are Dutch residents ready for a more stringent policy to enhance the energy performance of their homes?

    International Nuclear Information System (INIS)

    Middelkoop, Manon van; Vringer, Kees; Visser, Hans

    2017-01-01

    Investments in the energy performance of houses offer good prospects for reducing energy consumption and CO_2 emissions. However, people are not easily convinced of the need to take measures to improve the energy performance of their houses, even when financial benefits outweigh the costs. This article analyses the factors that influence the decision for improving the energy performance of existing homes, including policy instruments. Subsequently, the article provides policy suggestions on how to stimulate energy performance improvements. Both owners and tenants (50–70%) support government policy on energy performance improvements to existing homes. Nevertheless, people also have strong feelings of autonomy regarding their homes. Our results underline the importance of well-informed and competent decision-makers. Introducing the use of Energy Performance Certificates (EPCs) into the tax system for energy and residential buildings might therefore be an effective way to increase the interest of owners in the EPC, improve the use and effect of this informative instrument, and make the first step towards bridging the tension between autonomy and more stringent instruments.

  7. Error Mitigation of Point-to-Point Communication for Fault-Tolerant Computing

    Science.gov (United States)

    Akamine, Robert L.; Hodson, Robert F.; LaMeres, Brock J.; Ray, Robert E.

    2011-01-01

    Fault tolerant systems require the ability to detect and recover from physical damage caused by the hardware's environment, faulty connectors, and system degradation over time. This ability applies to military, space, and industrial computing applications. The integrity of Point-to-Point (P2P) communication, between two microcontrollers for example, is an essential part of fault tolerant computing systems. In this paper, different methods of fault detection and recovery are presented and analyzed.
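
    As a generic illustration of one detection-and-recovery approach (a sketch only, not the specific schemes analyzed in the paper), a point-to-point frame can carry a CRC so the receiver can detect corruption and request a retransmission:

        import zlib

        def frame(payload):
            # Append a CRC-32 so the receiver can detect corruption of the payload.
            return payload + zlib.crc32(payload).to_bytes(4, "big")

        def check(framed):
            # Return the payload if the CRC matches, otherwise None (caller requests a retransmit).
            payload, crc = framed[:-4], int.from_bytes(framed[-4:], "big")
            return payload if zlib.crc32(payload) == crc else None

        msg = frame(b"sensor reading 42")
        corrupted = bytes([msg[0] ^ 0x01]) + msg[1:]   # simulate a single-bit upset
        print(check(msg))        # b'sensor reading 42'
        print(check(corrupted))  # None -> sender would retransmit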

  8. Air humidity requirements for human comfort

    DEFF Research Database (Denmark)

    Toftum, Jørn; Fanger, Povl Ole

    1999-01-01

    Upper humidity limits for the comfort zone, determined from two recently presented models for predicting discomfort due to skin humidity and insufficient respiratory cooling, are proposed. The proposed limits are compared with the maximum permissible humidity level prescribed in existing standards ... for the thermal indoor environment. The skin humidity model predicts discomfort as a function of the relative humidity of the skin, which is determined by existing models for human heat and moisture transfer based on environmental parameters, clothing characteristics and activity level. The respiratory model ... level near 100% rh. For respiratory comfort the requirements are much more stringent and result in lower permissible indoor air humidities. Compared with the upper humidity limit specified in existing thermal comfort standards, e.g. ASHRAE Addendum 55a, the humidity limit based on skin humidity ...

  9. Polarization (ellipsometric) measurements of liquid condensate deposition and evaporation rates and dew points in flowing salt/ash-containing combustion gases

    Science.gov (United States)

    Seshadri, K.; Rosner, D. E.

    1985-01-01

    An application of an optical polarization technique in a combustion environment is demonstrated by following, in real-time, growth rates of boric oxide condensate on heated platinum ribbons exposed to seeded propane-air combustion gases. The results obtained agree with the results of earlier interference measurements and also with theoretical chemical vapor deposition predictions. In comparison with the interference method, the polarization technique places less stringent requirements on surface quality, which may justify the added optical components needed for such measurements.

  10. Membrane-based, sedimentation-assisted plasma separator for point-of-care applications.

    Science.gov (United States)

    Liu, Changchun; Mauk, Michael; Gross, Robert; Bushman, Frederic D; Edelstein, Paul H; Collman, Ronald G; Bau, Haim H

    2013-11-05

    Often, high-sensitivity, point-of-care (POC) clinical tests, such as HIV viral load, require large volumes of plasma. Although centrifuges are ubiquitously used in clinical laboratories to separate plasma from whole blood, centrifugation is generally inappropriate for on-site testing. Suitable alternatives are not readily available to separate the relatively large volumes of plasma from milliliters of blood that may be needed to meet stringent limit-of-detection specifications for low-abundance target molecules. We report on a simple-to-use, low-cost, pump-free, membrane-based, sedimentation-assisted plasma separator capable of separating a relatively large volume of plasma from undiluted whole blood within minutes. This plasma separator consists of an asymmetric, porous, polysulfone membrane housed in a disposable chamber. The separation process takes advantage of both gravitational sedimentation of blood cells and size exclusion-based filtration. The plasma separator demonstrated a "blood in-plasma out" capability, consistently extracting 275 ± 33.5 μL of plasma from 1.8 mL of undiluted whole blood within less than 7 min. The device was used to separate plasma laden with HIV viruses from HIV virus-spiked whole blood with recovery efficiencies of 95.5% ± 3.5%, 88.0% ± 9.5%, and 81.5% ± 12.1% for viral loads of 35,000, 3500, and 350 copies/mL, respectively. The separation process is self-terminating to prevent excessive hemolysis. The HIV-laden plasma was then injected into our custom-made microfluidic chip for nucleic acid testing and was successfully subjected to reverse-transcriptase loop-mediated isothermal amplification (RT-LAMP), demonstrating that the plasma is sufficiently pure to support high-efficiency nucleic acid amplification.

  11. C-point and V-point singularity lattice formation and index sign conversion methods

    Science.gov (United States)

    Kumar Pal, Sushanta; Ruchi; Senthilkumaran, P.

    2017-06-01

    The generic singularities in an ellipse field are C-points, namely stars, lemons and monstars, in a polarization distribution, with C-point indices (-1/2), (+1/2) and (+1/2) respectively. Similar to C-point singularities, there are V-point singularities that occur in a vector field and are characterized by a Poincare-Hopf index of integer value. In this paper we show that the superposition of three homogeneously polarized beams in different linear states leads to the formation of a polarization singularity lattice. Three point sources at the focal plane of the lens are used to create three interfering plane waves. A radial/azimuthal polarization converter (S-wave plate) placed near the focal plane modulates the polarization states of the three beams. The interference pattern is found to host C-points and V-points in a hexagonal lattice. The C-points occur at intensity maxima and the V-points occur at intensity minima. Modulating the state of polarization (SOP) of the three plane waves from radial to azimuthal does not essentially change the nature of the polarization singularity lattice, as the Poincare-Hopf index for both radial and azimuthal polarization distributions is (+1). Hence a transformation from a star to a lemon is not trivial, as such a transformation requires not a single SOP change, but a change in the whole spatial SOP distribution. Further, there is no change in the lattice structure, and the C- and V-points appear at locations where they were present earlier. Hence to convert an interlacing star and V-point lattice into an interlacing lemon and V-point lattice, the interferometer requires modification. We show for the first time a method to change the polarity of the C-point and V-point indices. This means that lemons can be converted into stars and stars can be converted into lemons. Similarly, a positive V-point can be converted to a negative V-point and vice versa. The intensity distribution in all these lattices is invariant as the SOPs of the three beams are changed in an

  12. Acid dew point measurement in flue gases

    Energy Technology Data Exchange (ETDEWEB)

    Struschka, M.; Baumbach, G.

    1986-06-01

    The operation of modern boiler plants requires the continuous measurement of the acid dew point in flue gases. An existing measuring instrument was modified in such a way that it can determine acid dew points reliably, reproducibly and continuously. The authors present the mechanisms of dew point formation, the dew point measuring principle, the modification and the operational results.

  13. IMPACT OF ENERGY GROUP STRUCTURE ON NUCLEAR DATA TARGET ACCURACY REQUIREMENTS FOR ADVANCED REACTOR SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    G. Palmiotti; M. Salvatores; H. Hiruta

    2011-06-01

    A target accuracy assessment study using both a fine and a broad energy structure has shown that less stringent nuclear data accuracy requirements are needed for the latter energy structure. However, even though a reduction is observed, the requirements will still be very difficult to meet unless integral experiments are also used to reduce nuclear data uncertainties. Target accuracy assessment is the inverse problem of uncertainty evaluation. To establish priorities and target accuracies for data uncertainty reduction, a formal approach can be adopted by defining target accuracies on design parameters and determining the required accuracies on the data needed to meet them. In fact, the unknown data uncertainty requirements can be obtained by solving a minimization problem in which the sensitivity coefficients, in conjunction with the constraints on the integral parameters, provide the quantities needed to find the solution.
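
    A common statement of this inverse problem, sketched here in the standard form used in such studies (the cost weights are part of that general formulation, not values given in the abstract), is to find the unknown data uncertainties d_i that minimize a total cost subject to the target accuracies on the integral parameters:

        \[ \min_{d_i}\; Q=\sum_i \frac{\lambda_i}{d_i^{2}} \qquad \text{subject to} \qquad \sum_i S_{ni}^{2}\, d_i^{2} \le \left(R_n^{T}\right)^{2} \quad \text{for every integral parameter } n, \]

    where S_{ni} is the sensitivity coefficient of integral parameter n to data item i, R_n^T is its target accuracy, and λ_i weights the relative cost of improving datum i.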

  14. Plasma-equivalent glucose at the point-of-care: evaluation of Roche Accu-Chek Inform and Abbott Precision PCx glucose meters.

    Science.gov (United States)

    Ghys, Timothy; Goedhuys, Wim; Spincemaille, Katrien; Gorus, Frans; Gerlo, Erik

    2007-01-01

    Glucose testing at the bedside has become an integral part of the management strategy in diabetes and of the careful maintenance of normoglycemia in all patients in intensive care units. We evaluated two point-of-care glucometers for the determination of plasma-equivalent blood glucose. The Precision PCx and the Accu-Chek Inform glucometers were evaluated. Imprecision and bias relative to the Vitros 950 system were determined using protocols of the Clinical Laboratory Standards Institute (CLSI). The effects of low, normal, and high hematocrit levels were investigated. Interference by maltose was also studied. Within-run precision for both instruments ranged from 2-5%. Total imprecision was less than 5% except for the Accu-Chek Inform at the low level (2.9 mmol/L). Both instruments correlated well with the comparison instrument and showed excellent recovery and linearity. Both systems reported at least 95% of their values within zone A of the Clarke Error Grid, and both fulfilled the CLSI quality criteria. The more stringent goals of the American Diabetes Association, however, were not reached. Both systems showed negative bias at high hematocrit levels. Maltose interfered with the glucose measurements on the Accu-Chek Inform but not on the Precision PCx. Both systems showed satisfactory imprecision and were reliable in reporting plasma-equivalent glucose concentrations. The most stringent performance goals were however not met.
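
    The imprecision and bias figures referred to above are computed along the lines of the following sketch; the replicate readings and comparison value are hypothetical, not data from the study.

        import statistics

        meter_mmol_l = [5.4, 5.6, 5.3, 5.7, 5.5]   # hypothetical replicate meter readings
        comparison_mmol_l = 5.5                     # hypothetical comparison-method result

        mean = statistics.mean(meter_mmol_l)
        cv_percent = 100.0 * statistics.stdev(meter_mmol_l) / mean     # within-run imprecision
        bias_percent = 100.0 * (mean - comparison_mmol_l) / comparison_mmol_l

        print(f"CV = {cv_percent:.1f}%, bias = {bias_percent:.1f}%")   # CV ~2.9%, bias 0.0%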

  15. Requirements for the retrofitting and extension of the maximum voltage power grid from the point of view of environmental protection and cultivated landscape work

    International Nuclear Information System (INIS)

    2013-01-01

    The project on the requirements for the retrofitting and extension of the maximum voltage power grid from the point of view of environmental protection and cultivated landscape work includes contributions on the following topics: the development of the European transmission grid, the grid extension law, restrictions on the power grid and its infrastructure, requirements for the regulations concerning the realization of the transnational grid extension, inclusion of the public - public acceptance - communication, requirements concerning the environmental compensation law, overhead line - underground cable - health hazards, ecological effects of overhead lines and underground cables, infrastructural projects, power supply in the future, structural relief by photovoltaics.

  16. Key points for the design of MOX facilities

    International Nuclear Information System (INIS)

    Ducroux, R.; Gaiffe, L.; Dumond, S.; Cret, L.

    1998-01-01

    The design of a MOX fuel fabrication facility involves specific technical difficulties: - Process aspects: for example, it is necessary to meet the stringent requirements on the end products while handling large quantities of powders and pellets; - Safety aspects: for example, containment of radioactive materials requires the use of gloveboxes, the design of process equipment so as to limit dispersion within the gloveboxes, and the use of dust collection systems; - Technological aspects: for example, it is necessary to take maintenance into account early in the design, in order to lower operating costs and the dose to personnel; - Quality control and information systems: for example, it is necessary to be able to trace all the different products (powder lots, pellets, rods, assemblies). The design methods and organization set up by COGEMA enable these technical difficulties to be mastered during the different design steps and a MOX fabrication facility to be obtained at the best performance-versus-cost compromise. These design methods rely mainly on: - taking into account all the different above-mentioned constraints from the very beginning of the design process (by using the know-how resulting from experience feedback, as well as specific design tools developed by COGEMA and SGN); - launching a technical development and testing program at the beginning of the project and incorporating its results in the course of the design. (author)

  17. Point Coulomb solutions of the Dirac equation: analytical results required for the evaluation of the bound electron propagator in quantum electrodynamics

    International Nuclear Information System (INIS)

    Whittingham, I.B.

    1977-12-01

    The bound electron propagator in quantum electrodynamics is reviewed and the Brown and Schaefer angular momentum representation of the propagator is discussed. Regular and irregular solutions of the radial Dirac equations for both |E| < mc² and |E| ≥ mc² are required for the computation of the propagator. Analytical expressions for these solutions, and their corresponding Wronskians, are obtained for a point Coulomb potential. Some computational aspects are discussed in an appendix.

  18. Technology Requirements For a Square-Meter, Arcsecond-Resolution Telescope for X-Rays: The SMART-X Mission

    Science.gov (United States)

    Schwartz, Daniel A.; Allured, Ryan; Bookbinder, Jay; Cotroneo, Vincenzo; Forman, William; Freeman, Mark; McMuldroch, Stuart; Reid, Paul; Tananbaum, Harvey; Vikhlinin, Alexey; hide

    2014-01-01

    Addressing the astrophysical problems of the 2020s requires sub-arcsecond x-ray imaging with square-meter effective area. Such requirements can be derived, for example, by considering deep x-ray surveys to find the young black holes in the early universe (at large redshifts) that will grow into the first supermassive black holes. We have envisioned a mission based on adjustable x-ray optics technology in order to achieve the required reduction of the mass-to-collecting-area ratio for the mirrors. We are pursuing technology that effects this adjustment via thin-film piezoelectric "cells" deposited directly on the non-reflecting sides of thin, slumped glass. While SMART-X will also incorporate state-of-the-art x-ray cameras, the remaining spacecraft systems have no more stringent requirements than those which are well understood and proven on the current Chandra X-ray Observatory.

  19. Attitude control of an orbiting space vehicle.

    Science.gov (United States)

    Sutherlin, D. W.; Boland, J. S. , III; Borelli, M. T.

    1971-01-01

    Study of the normal and clamped modes of operation and dynamic response characteristics of the gimbaled control moment gyro (CMG) designed to fulfill the stringent pointing requirements of the Skylab telescope mount when the spacecraft is under the influence of both external and internal torques. The results indicate that the clamped mode of operation provides a feasible approach for significantly improving the system characteristics.

  20. Neutral-point current modeling and control for Neutral-Point Clamped three-level converter drive with small DC-link capacitors

    DEFF Research Database (Denmark)

    Maheshwari, Ram Krishan; Munk-Nielsen, Stig; Busquets-Monge, Sergio

    2011-01-01

    A Neutral-Point-Clamped (NPC) three-level inverter with small DC-link capacitors is presented in this paper. This inverter requires zero average neutral-point current for a stable neutral-point potential. A simple carrier-based modulation strategy is proposed for achieving zero average neutral-point current ... drive with only 14 μF DC-link capacitors. Fast and stable performance of the neutral-point voltage controller is achieved and verified by experiments.

  1. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful....

  2. Beam finding algorithms at the interaction point of B factories

    International Nuclear Information System (INIS)

    Kozanecki, W.

    1992-10-01

    We review existing methods to bring beams into collision in circular machines, and examine collision alignment strategies proposed for e+e- B-factories. The two-ring feature of such machines, while imposing more stringent demands on beam control, also opens up new diagnostic possibilities.

  3. General Approaches and Requirements on Safety and Security of Radioactive Materials Transport in Russian Federation

    International Nuclear Information System (INIS)

    Ershov, V.N.; Buchel'nikov, A.E.; Komarov, S.V.

    2016-01-01

    Development and implementation of safety and security requirements for the transport of radioactive materials in the Russian Federation are addressed. At the outset it is worth noting that the transport safety requirements implemented are in full accordance with the IAEA's "Regulations for the Safe Transport of Radioactive Material (2009 Edition)". However, with respect to security requirements for radioactive material transport, in some cases the Russian Federation requirements for nuclear material are more stringent than the IAEA recommendations. The fundamental principles of safety and security of RM management recommended by IAEA documents (publications No. SF-1 and GOV/41/2001) are compared. Their correlation and differences concerning transport matters, as well as the current level and the possibility of harmonization, are analysed. In addition, the reflection of these general approaches in concrete transport requirements is evaluated. Problems of compliance assessment, including administrative and state control problems for safety and security in domestic and international shipments, are considered and compared. (author)

  4. Important requirements for RF generators for Accelerator-Driven Transmutation Technologies (ADTT)

    International Nuclear Information System (INIS)

    Lynch, M.T.; Tallerico, P.J.; Lawrence, G.P.

    1994-01-01

    All Accelerator-Driven Transmutation applications require very large amounts of RF power. For example, one version of a plutonium-burning system requires an 800-MeV, 80-mA proton accelerator running at 100% duty factor. This accelerator requires approximately 110 MW of continuous RF power if one assumes only 10% reserve power for control of the accelerator fields. In fact, to minimize beam spill, the RF controls may need as much as 15 to 20% reserve power. In addition, unlike an electron accelerator in which the beam is relativistic, a failed RF station can disturb the synchronism of the beam, possibly shutting down the entire accelerator. These issues and more lead to a set of requirements for the RF generators which are stringent and, in some cases, conflicting. In this paper, we will describe the issues and requirements, and outline a plan for RF generator development to meet the needs of the Accelerator-Driven Transmutation Technologies. The key issues which will be discussed include: operating efficiency, operating linearity, effect on the input power grid, bandwidth, gain, reliability, operating voltage, and operating current.
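
    The order of magnitude of the RF demand follows from simple beam-power arithmetic (a sketch; structure dissipation and conversion losses are not itemized in the abstract):

        \[ P_{\text{beam}} = E_{\text{beam}} \times I_{\text{beam}} = 800\ \text{MV} \times 80\ \text{mA} = 64\ \text{MW (continuous)}, \]

    so the quoted ~110 MW of continuous RF power must cover this beam power plus cavity and transport losses and the ~10% control reserve.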

  5. Field-flood requirements for emission computed tomography with an Anger camera

    International Nuclear Information System (INIS)

    Rogers, W.L.; Clinthorne, N.H.; Harkness, B.A.; Koral, K.F.; Keyes, J.W. Jr.

    1982-01-01

    Emission computed tomography with a rotating camera places stringent requirements on camera uniformity and the stability of camera response. In terms of clinical tomographic imaging, we have studied the statistical accuracy required for camera flood correction, the requirements for flood accuracy, the utility and validity of flood and data image smoothing to reduce random noise effects, and the magnitude and effect of camera variations as a function of angular position, energy window, and tuning. Uniformity of the corrected flood response must be held to better than 1% to eliminate image artifacts that are apparent in a million-count image of a liver slice. This requires calibration with an accurate, well-mixed flood source. Both random fluctuations and variations in camera response with rotation must be kept below 1%. To meet the statistical limit, one requires at least 30 million counts for the flood-correction image. Smoothing the flood image alone introduces unacceptable image artifacts. Smoothing both the flood image and data, however, appears to be a good approach toward reducing noise effects. Careful camera tuning and magnetic shield design provide camera stability suitable for present clinical applications.
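
    The 30-million-count figure is consistent with simple counting statistics (a sketch; the 64 × 64 matrix and the useful-field fraction are assumptions for illustration, not parameters stated in the abstract):

        \[ \frac{\sigma}{\bar N}=\frac{1}{\sqrt{\bar N}}\le 1\% \;\Rightarrow\; \bar N \ge 10^{4}\ \text{counts per pixel}, \qquad N_{\text{total}}\approx 3\times 10^{3}\ \text{pixels}\times 10^{4}\approx 3\times 10^{7}\ \text{counts}, \]

    where ~3 × 10^3 is roughly the number of pixels inside the circular field of view of a 64 × 64 acquisition matrix.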

  6. Criteria required for an acceptable point-of-care test for UTI detection: Obtaining consensus using the Delphi technique.

    Science.gov (United States)

    Weir, Nichola-Jane M; Pattison, Sally H; Kearney, Paddy; Stafford, Bob; Gormley, Gerard J; Crockard, Martin A; Gilpin, Deirdre F; Tunney, Michael M; Hughes, Carmel M

    2018-01-01

    Urinary Tract Infections (UTIs) are common bacterial infections, second only to respiratory tract infections and particularly prevalent within primary care. Conventional detection of UTIs is by culture; however, return of results can take between 24 and 72 hours. The introduction of a point of care (POC) test would allow for more timely identification of UTIs, facilitating improved, targeted treatment. This study aimed to obtain consensus on the criteria required for a POC UTI test to meet patient need within primary care. Criteria for consideration were compiled by the research team. These criteria were validated through a two-round Delphi process, utilising an expert panel of healthcare professionals from across Europe and the United States of America. Using web-based questionnaires, panellists recorded their level of agreement with each criterion based on a 5-point Likert scale, with space for comments. Using the median response, interquartile range and comments provided, criteria were accepted/rejected/revised depending on pre-agreed cut-off scores. The first-round questionnaire presented thirty-three criteria to the panel, of which 22 were accepted. Consensus was not achieved for the remaining 11 criteria. Following response review, one criterion was removed, while after revision, the remaining 10 criteria entered the second round. Of these, four were subsequently accepted, resulting in 26 criteria considered appropriate for a POC test to detect urinary infections. This study generated an approved set of criteria for a POC test to detect urinary infections. Criteria acceptance and comments provided by the healthcare professionals also support the development of a multiplex point of care UTI test.

  7. Critical analysis of the stringent complete response in multiple myeloma: contribution of sFLC and bone marrow clonality.

    Science.gov (United States)

    Martínez-López, Joaquín; Paiva, Bruno; López-Anglada, Lucía; Mateos, María-Victoria; Cedena, Teresa; Vidríales, María-Belén; Sáez-Gómez, María Auxiliadora; Contreras, Teresa; Oriol, Albert; Rapado, Inmaculada; Teruel, Ana-Isabel; Cordón, Lourdes; Blanchard, María Jesús; Bengoechea, Enrique; Palomera, Luis; de Arriba, Felipe; Cueto-Felgueroso, Cecilia; Orfao, Alberto; Bladé, Joan; San Miguel, Jesús F; Lahuerta, Juan José

    2015-08-13

    Stringent complete response (sCR) criteria are used in multiple myeloma as a deeper response category compared with CR, but prospective validation is lacking, it is not always clear how evaluation of clonality is performed, and it is not known what the relative clinical influence is of the serum free light chain ratio (sFLCr) and bone marrow (BM) clonality in defining sCR. To clarify this controversy, we focused on 94 patients that reached CR, of which 69 (73%) also fulfilled the sCR criteria. Patients with sCR displayed slightly longer time to progression (median, 62 vs 53 months, respectively; P = .31). On analyzing the contribution of the sFLCr or clonality to prognosis, it was found that the sFLCr does not identify patients in CR at distinct risk; by contrast, low-sensitive multiparametric flow cytometry (MFC) immunophenotyping (2 colors), which is equivalent to immunohistochemistry, identifies a small number of patients (5 cases) with high residual tumor burden and dismal outcome; nevertheless, using traditional 4-color MFC, persistent clonal BM disease was detectable in 36% of patients, who, compared with minimal residual disease-negative cases, had a significantly inferior outcome. These results show that the current definition of sCR should be revised. © 2015 by The American Society of Hematology.

  8. Characterizing fixed points

    Directory of Open Access Journals (Sweden)

    Sanjo Zlobec

    2017-04-01

    Full Text Available A set of sufficient conditions which guarantee the existence of a point x⋆ such that f(x⋆) = x⋆ is called a "fixed point theorem". Many such theorems are named after well-known mathematicians and economists. Fixed point theorems are among the most useful ones in applied mathematics, especially in economics and game theory. A particularly important theorem in these areas is Kakutani's fixed point theorem, which ensures the existence of fixed points for point-to-set mappings, e.g., [2, 3, 4]. John Nash developed and applied Kakutani's ideas to prove the existence of (what became known as) "Nash equilibrium" for finite games with mixed strategies for any number of players. This work earned him a Nobel Prize in Economics that he shared with two mathematicians. Nash's life was dramatized in the movie "A Beautiful Mind" in 2001. In this paper, we approach the system f(x) = x differently. Instead of studying the existence of its solutions, our objective is to determine conditions which are both necessary and sufficient for an arbitrary point x⋆ to be a fixed point, i.e., to satisfy f(x⋆) = x⋆. The existence of solutions for a continuous function f of a single variable is easy to establish using the Intermediate Value Theorem of Calculus. However, characterizing fixed points x⋆, i.e., providing answers to the question of finding both necessary and sufficient conditions for an arbitrary given x⋆ to satisfy f(x⋆) = x⋆, is not simple even for functions of a single variable. It is possible that constructive answers do not exist. Our objective is to find them. Our work may require some less familiar tools. One of these might be the "quadratic envelope characterization of zero-derivative point" recalled in the next section. The results are taken from the author's current research project "Studying the Essence of Fixed Points". They are believed to be original. The author has received feedback on the preliminary report and on parts of the project
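    As a small numerical companion to the Intermediate Value Theorem argument mentioned above, the Python sketch below locates a fixed point of a continuous function of a single variable by bisection on g(x) = f(x) - x; the cosine example is arbitrary and chosen only for illustration, it is not taken from the paper.

      import math

      def fixed_point_bisection(f, a, b, tol=1e-12):
          # Assumes g(x) = f(x) - x changes sign on [a, b], so a fixed point exists.
          g = lambda x: f(x) - x
          assert g(a) * g(b) <= 0, "f(x) - x must change sign on [a, b]"
          while b - a > tol:
              m = 0.5 * (a + b)
              if g(a) * g(m) <= 0:
                  b = m
              else:
                  a = m
          return 0.5 * (a + b)

      x_star = fixed_point_bisection(math.cos, 0.0, 1.0)
      print(x_star, math.cos(x_star))   # about 0.739085, the fixed point of cos on [0, 1]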

  9. Fixed-point signal processing

    CERN Document Server

    Padgett, Wayne T

    2009-01-01

    This book is intended to fill the gap between the "ideal precision" digital signal processing (DSP) that is widely taught, and the limited precision implementation skills that are commonly required in fixed-point processors and field programmable gate arrays (FPGAs). These skills are often neglected at the university level, particularly for undergraduates. We have attempted to create a resource both for a DSP elective course and for the practicing engineer with a need to understand fixed-point implementation. Although we assume a background in DSP, Chapter 2 contains a review of basic theory
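    A minimal taste of the limited-precision issues such a text covers is the quantization error introduced by the Q15 fixed-point format (16-bit signed, 15 fractional bits) common on fixed-point DSPs; the Python sketch below is illustrative only, and the scaled sine test signal is arbitrary.

      import math

      def to_q15(x):
          # 16-bit signed, 15 fractional bits; saturate to the representable range.
          q = int(round(x * 32768.0))
          return max(-32768, min(32767, q))

      def from_q15(q):
          return q / 32768.0

      signal = [0.9 * math.sin(2 * math.pi * k / 32) for k in range(32)]
      quantized = [from_q15(to_q15(s)) for s in signal]
      max_error = max(abs(a - b) for a, b in zip(signal, quantized))
      print(f"worst-case Q15 rounding error: {max_error:.2e}")   # bounded by 2**-16 for this signal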

  10. Development of 30-pin connectors for electronic modules of C and I systems for NPPs conforming to customized MIL STD-1344 requirements

    International Nuclear Information System (INIS)

    Marathe, P.P.; Madala, Kalyan C.; Ramakrishna, P.

    2014-01-01

    The electrical connectors form an important constituent of C and I systems where customized circuits and hardware are required to be configured to meet Nuclear Power Plant regulatory requirements. C and I hardware has to handle several hundred I/Os, and the system architectures are modular, with C and I system hardware packaged in plug-in electronic modules in the required form factors. In addition, if the system has to satisfy customized JSS 55555 requirements meeting stringent shock, vibration and environmental specifications, the connectors used for the electronic modules shall meet the customized MIL STD-1344 requirements and meet the reliability target for the system. 30-pin special connectors for electronic modules and 2x30 (60) pin field cabling connectors were developed, meeting the required qualification specifications. (author)

  11. Development of a Hard X-ray Beam Position Monitor for Insertion Device Beams at the APS

    Science.gov (United States)

    Decker, Glenn; Rosenbaum, Gerd; Singh, Om

    2006-11-01

    Long-term pointing stability requirements at the Advanced Photon Source (APS) are very stringent, at the level of 500 nanoradians peak-to-peak or better over a one-week time frame. Conventional rf beam position monitors (BPMs) close to the insertion device source points are incapable of assuring this level of stability, owing to mechanical, thermal, and electronic stability limitations. Insertion device gap-dependent systematic errors associated with the present ultraviolet photon beam position monitors similarly limit their ability to control long-term pointing stability. We report on the development of a new BPM design sensitive only to hard x-rays. Early experimental results will be presented.
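    To put the 500-nanoradian figure in perspective, the Python sketch below converts an angular drift at the source point into transverse photon-beam motion downstream; the source-to-experiment distances are assumed typical values for illustration, not figures from the report.

      pointing_drift_rad = 500e-9          # 500 nrad peak-to-peak angular drift
      for distance_m in (20.0, 35.0, 70.0):
          drift_um = pointing_drift_rad * distance_m * 1e6
          print(f"{distance_m:>4.0f} m from the source -> {drift_um:.1f} micrometres of beam motion")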

  12. 33 CFR 117.1001 - Cat Point Creek.

    Science.gov (United States)

    2010-07-01

    33 CFR 117.1001, Navigation and Navigable Waters; Coast Guard, Department of Homeland Security; Bridges; Drawbridge Operation Regulations; Specific Requirements, Virginia. § 117.1001 Cat Point Creek. The draw of the...

  13. Canadian energy standards : residential energy code requirements

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, K. [SAR Engineering Ltd., Burnaby, BC (Canada)

    2006-09-15

    A survey of residential energy code requirements was discussed. New housing is approximately 13 per cent more efficient than housing built 15 years ago, and more stringent energy efficiency requirements in building codes have contributed to decreased energy use and greenhouse gas (GHG) emissions. However, a survey of residential energy codes across Canada has determined that explicit demands for energy efficiency are currently only present in British Columbia (BC), Manitoba, Ontario and Quebec. The survey evaluated more than 4300 single-detached homes built between 2000 and 2005 using data from the EnerGuide for Houses (EGH) database. House area, volume, airtightness and construction characteristics were reviewed to create archetypes for 8 geographic areas. The survey indicated that in Quebec and the Maritimes, 90 per cent of houses comply with the ventilation system requirements of the National Building Code, while compliance in the rest of Canada is much lower. Heat recovery ventilation use is predominant in the Atlantic provinces. Direct-vent or condensing furnaces constitute the majority of installed systems in provinces where natural gas is the primary space heating fuel. Details of insulation levels for walls, double-glazed windows, and building code insulation standards were also reviewed. It was concluded that if R-2000 levels of energy efficiency were applied, total average energy consumption would be reduced by 36 per cent in Canada. 2 tabs.

  14. Development of NPP Safety Requirements into Kenya's Grid Codes

    Energy Technology Data Exchange (ETDEWEB)

    Ndirangu, Nguni James; Koo, Chang Choong [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-10-15

    As presently drafted, Kenya's grid codes do not contain any NPP requirements. Through case studies of selected grid codes, this paper will study frequency, voltage and fault ride-through requirements for NPP connection and operation, and offer recommendations on how these requirements can be incorporated in Kenya's grid codes. Voltage and frequency excursions in Kenya's grid are notably frequently outside the generic requirements and the values adopted in the German and UK grid codes. Kenya's grid codes require continuous operation for ±10% of nominal voltage and 45.0 to 52 Hz on the grid, which poses safety issues for an NPP. Considering the stringent NPP connection-to-grid and operational safety requirements, and the importance of the TSO to NPP safety, more elaborate requirements need to be documented in Kenya's grid codes. The UK and Germany have a history of meeting high standards of nuclear safety and it is therefore recommended that a format like the one in Tables 1 to 3 should be adopted. Kenya's grid code considering NPPs should have: • Strict rules for voltage variation, that is, -5% to +10% of the nominal voltage • Strict rules for frequency variation, that is, 48 Hz to 52 Hz of the nominal frequency.

  15. Development of NPP Safety Requirements into Kenya's Grid Codes

    International Nuclear Information System (INIS)

    Ndirangu, Nguni James; Koo, Chang Choong

    2015-01-01

    As presently drafted, Kenya's grid codes do not contain any NPP requirements. Through case studies of selected grid codes, this paper will study frequency, voltage and fault ride-through requirements for NPP connection and operation, and offer recommendations on how these requirements can be incorporated in Kenya's grid codes. Voltage and frequency excursions in Kenya's grid are notably frequently outside the generic requirements and the values adopted in the German and UK grid codes. Kenya's grid codes require continuous operation for ±10% of nominal voltage and 45.0 to 52 Hz on the grid, which poses safety issues for an NPP. Considering the stringent NPP connection-to-grid and operational safety requirements, and the importance of the TSO to NPP safety, more elaborate requirements need to be documented in Kenya's grid codes. The UK and Germany have a history of meeting high standards of nuclear safety and it is therefore recommended that a format like the one in Tables 1 to 3 should be adopted. Kenya's grid code considering NPPs should have: • Strict rules for voltage variation, that is, -5% to +10% of the nominal voltage • Strict rules for frequency variation, that is, 48 Hz to 52 Hz of the nominal frequency

  16. Development and Qualification of an Antenna Pointing Mechanism for the ExoMars High-Gain Antenna

    Science.gov (United States)

    St-Andre, Stephane; Dumais, Marie-Christine; Lebel, Louis-Philippe; Langevin, Jean-Paul; Horth, Richard; Winton, Alistair; Lebleu, Denis

    2015-09-01

    The European Space Agency ExoMars 2016 mission required a gimbaled High Gain Antenna (HGA) for orbiter-to-earth communications. The ExoMars Program is a cooperative program between ESA and ROSCOSMOS with participation of NASA. The ExoMars Program industrial consortium is led by THALES ALENIA SPACE. This paper presents the design and qualification test results of the Antenna Pointing Mechanism (APM) used to point the HGA towards Earth. This electrically redundant APM includes motors, drive trains, optical encoders, cable cassette and RF Rotary Joints. Furthermore, the paper describes the design, development and the qualification approach applied to this APM. The design challenges include a wide pointing domain necessary to maximise the communication duty cycle during the early operation phase, the interplanetary cruise phase and during the mission’s orbital science phase. Other design drivers are an extended rotation cycle life with very low backlash yielding little wear and accurate position feedback on both axes. Major challenges and related areas of development include: • Large moments are induced on the APM due to aerobraking forces when the Mars atmosphere is used to slow the orbiter into its science mission orbit, • Thermal control of the critical components of the APM due to the different environments of the various phases of the mission. Also, the large travel range of the actuators complicated the radiator design in order to maintain clearances and to avoid overheating. • The APM, with a mass less than 17.5 kg, is exposed to a demanding dynamic environment due to its mounting on the spacecraft thrust tube and aggravated by its elevated location on the payload. • Power and Data transmission between elevation and azimuth axes through a compact large rotation range spiral type cable cassette. • Integration of a 16 bit redundant encoder on both axes for position feedback: Each encoder is installed on the back of a rotary actuator and is coupled using the

  17. At the Tipping Point

    Energy Technology Data Exchange (ETDEWEB)

    Wiley, H. S.

    2011-02-28

    There comes a time in every field of science when things suddenly change. While it might not be immediately apparent that things are different, a tipping point has occurred. Biology is now at such a point. The reason is the introduction of high-throughput genomics-based technologies. I am not talking about the consequences of the sequencing of the human genome (and every other genome within reach). The change is due to new technologies that generate an enormous amount of data about the molecular composition of cells. These include proteomics, transcriptional profiling by sequencing, and the ability to globally measure microRNAs and post-translational modifications of proteins. These mountains of digital data can be mapped to a common frame of reference: the organism’s genome. With the new high-throughput technologies, we can generate tens of thousands of data points from each sample. Data are now measured in terabytes, and analyzing the data can now require years. Obviously, we can’t wait to interpret the data fully before the next experiment. In fact, we might never be able to even look at all of it, much less understand it. This volume of data requires sophisticated computational and statistical methods for its analysis and is forcing biologists to approach data interpretation as a collaborative venture.

  18. Technology Requirements for a Square Meter, Arcsecond Resolution Telescope for X-Rays: The SMART-X Mission

    Science.gov (United States)

    Schwartz, Daniel A.; Allured, Ryan; Bookbinder, Jay A.; Cotroneo, Vincenzo; Forman, William R.; Freeman, Mark D.; McMuldroch, Stuart; Reid, Paul B.; Tananbaum, Harvey; Vikhlinin, Alexey A.

    2014-01-01

    Addressing the astrophysical problems of the 2020's requires sub-arcsecond x-ray imaging with square meter effective area. Such requirements can be derived, for example, by considering deep x-ray surveys to find the young black holes in the early universe (large redshifts) which will grow into the first super-massive black holes. We have envisioned a mission, the Square Meter Arcsecond Resolution Telescope for X-rays (SMART-X), based on adjustable x-ray optics technology, incorporating mirrors with the required small ratio of mass to collecting area. We are pursuing technology which achieves sub-arcsecond resolution by on-orbit adjustment via thin film piezoelectric "cells" deposited directly on the non-reflecting sides of thin, slumped glass. While SMART-X will also incorporate state-of-the-art x-ray cameras, the remaining spacecraft systems have no requirements more stringent than those which are well understood and proven on the current Chandra X-ray Observatory.

  19. Point cloud processing for smart systems

    Directory of Open Access Journals (Sweden)

    Jaromír Landa

    2013-01-01

    Full Text Available High population as well as economic tension emphasises the necessity of effective city management, from land use planning to urban green maintenance. The effectiveness of management is based on precise knowledge of the city environment. Point clouds generated by mobile and terrestrial laser scanners provide precise data about objects in the scanner vicinity. From these data, the state of the roads, buildings, trees and other objects important for this decision-making process can be obtained. Generally, they can support the idea of "smart" or at least "smarter" cities. Unfortunately, the point clouds do not provide this type of information automatically. It has to be extracted. This extraction is done by expert personnel or by object recognition software. As the point clouds can represent large areas (streets or even cities), usage of expert personnel to identify the required objects can be very time-consuming and therefore cost-ineffective. Object recognition software allows us to detect and identify required objects semi-automatically or automatically. The first part of the article reviews and analyses the state of the art in point cloud object recognition techniques. The following part presents common formats used for point cloud storage and frequently used software tools for point cloud processing. Further, a method for extraction of geospatial information about detected objects is proposed. Thus, the method can be used not only to recognize the existence and shape of certain objects, but also to retrieve their geospatial properties. These objects can be later directly used in various GIS systems for further analyses.

  20. Perceptions of point-of-care infectious disease testing among European medical personnel, point-of-care test kit manufacturers, and the general public

    NARCIS (Netherlands)

    W.E. Kaman (Wendy); E-R. Andrinopoulou (Eleni-Rosalina); J.P. Hays (John)

    2013-01-01

    textabstractBackground: The proper development and implementation of point-of-care (POC) diagnostics requires knowledge of the perceived requirements and barriers to their implementation. To determine the current requirements and perceived barriers to the introduction of POC diagnostics in the field

  1. Future requirements for petroleum fuels - an environmental perspective

    International Nuclear Information System (INIS)

    White, R.

    1998-01-01

    The environmental impacts of fuel emissions were discussed. Emissions from petroleum fuels are the largest contributor to a wide range of environmental problems, including damage to the ozone layer and risks to human health. Forecasts indicate that future demand for fossil fuels for energy will continue to grow. The transportation sector is the largest single source of air emissions in Canada. The environmental requirements for all fuels will become progressively more stringent. The pollutants of primary concern include toxics, nitrogen oxides, volatile organic compounds, carbon monoxide, sulphur dioxide, and particulates. The U.S. auto-oil research program has conducted considerable research to understand the impact of fuel parameters on vehicle tailpipe emissions. In Canada, lead was removed from gasoline a decade ago. Since January 1998, low sulphur diesel (less than 500 ppm) has been required for on-road use. Regulations have also been passed to reduce the level of benzene in gasoline to less than one per cent by mid-1999. It will be necessary to manage our fossil fuels to minimize the environmental impacts from combustion. In the longer term, it will be necessary to minimize fossil fuel use through conservation and a shift to less polluting fuels

  2. A carrier-based approach for overmodulation of three-level neutral-point-clamped inverter with zero neutral-point current

    DEFF Research Database (Denmark)

    Maheshwari, Ram Krishan; Munk-Nielsen, Stig; Busquets-Monge, S.

    2012-01-01

    In a voltage source inverter, overmodulation is required to extend the range of operation and enhance the dc-link voltage utilization. A carrier-based implementation of a modulation strategy for the three-level neutral-point-clamped inverter is proposed for the overmodulation region. The modulation strategy ensures zero average neutral-point current in a switching period. A newly proposed boundary compression is used to regulate the dc-link voltage at all operating points. A description of the algorithm to implement the modulation strategy is also presented. The main advantage of the proposed...

  3. Nanotexturing of surfaces to reduce melting point.

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Ernest J.; Zubia, David (University of Texas at El Paso, El Paso, TX); Mireles, Jose (Universidad Autónoma de Ciudad Juárez, Ciudad Juárez, Mexico); Marquez, Noel (University of Texas at El Paso, El Paso, TX); Quinones, Stella (University of Texas at El Paso, El Paso, TX)

    2011-11-01

    This investigation examined the use of nano-patterned structures on Silicon-on-Insulator (SOI) material to reduce the bulk material melting point (1414 °C). It has been found that sharp-tipped and other similar structures have a propensity to move to the lower energy states of spherical structures and as a result exhibit lower melting points than the bulk material. Such a reduction of the melting point would offer a number of interesting opportunities for bonding in microsystems packaging applications. Nano patterning process capabilities were developed to create the required structures for the investigation. One of the technical challenges of the project was understanding and creating the specialized conditions required to observe the melting and reshaping phenomena. Through systematic experimentation and review of the literature these conditions were determined and used to conduct phase change experiments. Melting temperatures as low as 1030 °C were observed.

  4. On conjugate points and the Leitmann equivalent problem approach

    NARCIS (Netherlands)

    Wagener, F.O.O.

    2009-01-01

    This article extends the Leitmann equivalence method to a class of problems featuring conjugate points. The class is characterised by the requirement that the set of indifference points of a given problem forms a finite stratification.

  5. A fixed-point farrago

    CERN Document Server

    Shapiro, Joel H

    2016-01-01

    This text provides an introduction to some of the best-known fixed-point theorems, with an emphasis on their interactions with topics in analysis. The level of exposition increases gradually throughout the book, building from a basic requirement of undergraduate proficiency to graduate-level sophistication. Appendices provide an introduction to (or refresher on) some of the prerequisite material and exercises are integrated into the text, contributing to the volume’s ability to be used as a self-contained text. Readers will find the presentation especially useful for independent study or as a supplement to a graduate course in fixed-point theory. The material is split into four parts: the first introduces the Banach Contraction-Mapping Principle and the Brouwer Fixed-Point Theorem, along with a selection of interesting applications; the second focuses on Brouwer’s theorem and its application to John Nash’s work; the third applies Brouwer’s theorem to spaces of infinite dimension; and the fourth rests ...

  6. Extreme simplification and rendering of point sets using algebraic multigrid

    NARCIS (Netherlands)

    Reniers, D.; Telea, A.C.

    2009-01-01

    We present a novel approach for extreme simplification of point set models, in the context of real-time rendering. Point sets are often rendered using simple point primitives, such as oriented discs. However, this requires using many primitives to render even moderately simple shapes. Often, one

  7. Share point 2013 Implementation Strategy for Supporting KM System Requirements in Nuclear Malaysia

    International Nuclear Information System (INIS)

    Mohamad Safuan Sulaiman; Siti Nurbahyah Hamdan; Abdul Muin Abdul Rahman

    2015-01-01

    A Knowledge Management system (KMS or KM System) is an important tool for a knowledge-intensive organization such as Nuclear Malaysia. In June 2010, MS Share Point 2007 was deployed as a tool for the KM System in Nuclear Malaysia and functioned correctly until the end of 2013, when the system failed due to software malfunction and the inability of the infrastructure to support its continued operation and usage expansion. This led to difficulties for users in accessing their operational data and information, hampering one of the most important tools for the KM System in Nuclear Malaysia. However, a newer and updated version of the system, Share Point 2013, was recently deployed to meet the same objectives. Learning from previous failures, the tool has been analyzed at various stages of technical and management reviews. The implementation of this newer version has been designed to overcome most of the deficiencies faced by the older version, from both the software and infrastructure points of view. The tool has performed very well since its commissioning in December 2014. As it is still under warranty until March 2016, minimal maintenance issues have been experienced and any problems have been rectified promptly. This paper describes the implementation strategy in preparing the design information of the software and hardware architecture of the new tool to overcome the problems of the older version, in order to provide a better platform for the KM System in Nuclear Malaysia. (author)

  8. Dynamic Flow Migration for Delay Constrained Traffic in Software-Defined Networks

    NARCIS (Netherlands)

    Berger, Andre; Gross, James; Danielis, Peter; Dán, György

    2017-01-01

    Various industrial control applications have stringent end-to-end latency requirements in the order of a few milliseconds. Software-defined networking (SDN) is a promising solution in order to meet these stringent requirements under varying traffic patterns, as it enables the flexible management of

  9. Manufacturing requirements of reactor assembly components for PFBR (Paper No. 041)

    International Nuclear Information System (INIS)

    Murty, C.G.K.; Bhoje, S.B.

    1987-02-01

    This paper enumerates the requirements of 500 MWe Prototype Fast Breeder Reactor (PFBR) components and, considering the present state of the art of Indian industry, analyses the challenges to be faced in manufacture, highlighting the areas needing development. The large sizes and weights of the components, coupled with the limitations of shop facilities and ODC transport, demand that part of the fabrication be done in the shop, with the balance of the assembly work as well as certain assembly machining operations done at the site workshop. The stringent geometrical tolerances, coupled with extensive destructive and non-destructive examinations, call for balanced and low heat input welding techniques and special inspection equipment like an electronic co-ordinate determination system. The present paper deals with the specific manufacturing problems of the main reactor components. (author)

  10. Beaconless Pointing for Deep-Space Optical Communication

    Science.gov (United States)

    Swank, Aaron J.; Aretskin-Hariton, Eliot; Le, Dzu K.; Sands, Obed S.; Wroblewski, Adam

    2016-01-01

    Free space optical communication is of interest to NASA as a complement to existing radio frequency communication methods. The potential for an increase in science data return capability over current radio-frequency communications is the primary objective. Deep space optical communication requires laser beam pointing accuracy on the order of a few microradians. The laser beam pointing approach discussed here operates without the aid of a terrestrial uplink beacon. Precision pointing is obtained from an on-board star tracker in combination with inertial rate sensors and an outgoing beam reference vector. The beaconless optical pointing system presented in this work is the current approach for the Integrated Radio and Optical Communication (iROC) project.

  11. Computer-aided design of control systems to meet many requirements

    Science.gov (United States)

    Schy, A. A.; Adams, W. M., Jr.; Johnson, K. G.

    1974-01-01

    A method is described for using nonlinear programing in the computer-aided design of airplane control systems. It is assumed that the quality of such systems depends on many criteria. These criteria are included in the constraints vector (instead of attempting to combine them into a single scalar criterion, as is usually done), and the design proceeds through a sequence of nonlinear programing solutions in which the designer varies the specification of sets of requirements levels. The method is applied to design of a lateral stability augmentation system (SAS) for a fighter airplane, in which the requirements vector is chosen from the official handling qualities specifications. Results are shown for several simple SAS configurations designed to obtain desirable handling qualities over all design flight conditions with minimum feedback gains. The choice of the final design for each case is not unique but depends on the designer's decision as to which achievable set of requirements levels represents the best for that system. Results indicate that it may be possible to design constant parameter SAS which can satisfy the most stringent handling qualities requirements for fighter airplanes in all flight conditions. The role of the designer as a decision maker, interacting with the computer program, is discussed. Advantages of this type of designer-computer interaction are emphasized. Desirable extensions of the method are indicated.
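    The sequence-of-nonlinear-programming idea can be caricatured with a toy problem in Python: minimize the feedback-gain magnitudes subject to a vector of requirement levels treated as inequality constraints. The two constraint functions below are placeholders rather than the actual handling-qualities criteria, and SciPy's general-purpose SLSQP solver merely stands in for the original nonlinear programming code.

      from scipy.optimize import minimize

      def objective(k):
          # "minimum feedback gains": penalize the size of the two gains
          return k[0] ** 2 + k[1] ** 2

      # Each constraint of the form fun(k) >= 0 plays the role of one requirement level.
      constraints = [
          {"type": "ineq", "fun": lambda k: (k[0] + 2.0 * k[1]) - 1.0},   # placeholder requirement 1
          {"type": "ineq", "fun": lambda k: (3.0 * k[0] + k[1]) - 2.0},   # placeholder requirement 2
      ]

      result = minimize(objective, x0=[1.0, 1.0], method="SLSQP", constraints=constraints)
      print(result.x)   # smallest gains that still satisfy both requirement levels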

  12. Naïve Point Estimation

    Science.gov (United States)

    Lindskog, Marcus; Winman, Anders; Juslin, Peter

    2013-01-01

    The capacity of short-term memory is a key constraint when people make online judgments requiring them to rely on samples retrieved from memory (e.g., Dougherty & Hunter, 2003). In this article, the authors compare 2 accounts of how people use knowledge of statistical distributions to make point estimates: either by retrieving precomputed…

  13. Development of an integrated pointing device driver for the disabled.

    Science.gov (United States)

    Shih, Ching-Hsiang; Shih, Ching-Tien

    2010-01-01

    To help people with disabilities, such as those with spinal cord injury (SCI), to effectively utilise commercial pointing devices to operate computers. This study proposes a novel method to integrate the functions of commercial pointing devices, utilising software technology to develop an integrated pointing device driver (IPDD) for a computer operating system. The proposed IPDD has the following benefits: (1) it does not require additional hardware cost or circuit preservations, (2) it supports all standard interfaces of commercial pointing devices, including PS/2, USB and wireless interfaces and (3) it can integrate any number of devices. The pointing devices can be selected and combined according to the user's physical restrictions. The IPDD is a novel method of integrating commercial pointing devices. Through the IPDD, people with disabilities can choose a suitable combination of commercial pointing devices to achieve full cursor control and optimise operational performance. In contrast with previous studies, the software-based solution does not require additional hardware or circuit preservations, and it can support unlimited devices. In summary, the IPDD has the benefits of flexibility, low cost and high device compatibility.

  14. Visibility Analysis in a Point Cloud Based on the Medial Axis Transform

    NARCIS (Netherlands)

    Peters, R.; Ledoux, H.; Biljecki, F.

    2015-01-01

    Visibility analysis is an important application of 3D GIS data. Current approaches require 3D city models that are often derived from detailed aerial point clouds. We present an approach to visibility analysis that does not require a city model but works directly on the point cloud. Our approach is

  15. Automated Micro Hall Effect measurements

    DEFF Research Database (Denmark)

    Petersen, Dirch Hjorth; Henrichsen, Henrik Hartmann; Lin, Rong

    2014-01-01

    With increasing complexity of processes and variety of materials used for semiconductor devices, stringent control of the electronic properties is becoming ever more relevant. Collinear micro four-point probe (M4PP) based measurement systems have become high-end metrology methods for characterization...

  16. Nuclear controls are stringent

    International Nuclear Information System (INIS)

    Sonnekus, D.

    1983-01-01

    The peace-time application of nuclear power in South Africa, the organisations concerned and certain provisions laid down by the Act on Nuclear Energy, aimed at safeguarding the general public, are discussed

  17. Nonlinear consider covariance analysis using a sigma-point filter formulation

    Science.gov (United States)

    Lisano, Michael E.

    2006-01-01

    The research reported here extends the mathematical formulation of nonlinear, sigma-point estimators to enable consider covariance analysis for dynamical systems. This paper presents a novel sigma-point consider filter algorithm, for consider-parameterized nonlinear estimation, following the unscented Kalman filter (UKF) variation on the sigma-point filter formulation, which requires no partial derivatives of dynamics models or measurement models with respect to the parameter list. It is shown that, consistent with the attributes of sigma-point estimators, a consider-parameterized sigma-point estimator can be developed entirely without requiring the derivation of any partial-derivative matrices related to the dynamical system, the measurements, or the considered parameters, which appears to be an advantage over the formulation of a linear-theory sequential consider estimator. It is also demonstrated that a consider covariance analysis performed with this 'partial-derivative-free' formulation yields equivalent results to the linear-theory consider filter, for purely linear problems.
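    For readers unfamiliar with the underlying machinery, the Python sketch below shows the standard unscented (sigma-point) construction on which such a consider filter builds; the alpha/beta/kappa tuning constants and the test mean and covariance are illustrative, and the consider-parameter augmentation itself is not reproduced here.

      import numpy as np

      def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
          n = mean.size
          lam = alpha ** 2 * (n + kappa) - n
          S = np.linalg.cholesky((n + lam) * cov)            # matrix square root
          pts = np.vstack([mean, mean + S.T, mean - S.T])    # 2n + 1 sigma points
          wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
          wc = wm.copy()
          wm[0] = lam / (n + lam)
          wc[0] = lam / (n + lam) + (1.0 - alpha ** 2 + beta)
          return pts, wm, wc

      mean = np.array([1.0, 0.5])
      cov = np.diag([0.04, 0.01])
      pts, wm, wc = sigma_points(mean, cov)
      print(wm @ pts)   # the weighted sigma points reproduce the input mean exactly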

  18. Fiber design and realization of point-by-point written fiber Bragg gratings in polymer optical fibers

    DEFF Research Database (Denmark)

    Stefani, Alessio; Stecher, Matthias; Town, Graham E.

    2012-01-01

    the gratings make the point-by-point grating writing technique very interesting and it would appear to be able to fill this technological gap. On the other hand, this technique is hardly applicable to microstructured fibers because the writing beam is scattered by the air-holes. We report on the design... and because they allow tuning of the guiding parameters by modifying the microstructure. Nowadays the only technique used to write gratings in such fibers is the phase mask technique with UV light illumination. Despite the good results that have been obtained, the limited flexibility in grating design... and the very long times required for the writing of FBGs raise some questions about the possibility of exporting POF FBGs and the sensors based on them from the laboratory bench to the mass production market. The possibility of arbitrary design of fiber Bragg gratings and the very short time required to write...

  19. Pointo - a Low Cost Solution to Point Cloud Processing

    Science.gov (United States)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, becomes more and more an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners or very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as very large packages containing a variety of methods and tools. This results in software that is usually very expensive to acquire and also very difficult to use. Difficulty of use is caused by the complicated user interfaces required to accommodate a large list of features. The aim of these complex packages is to provide a powerful tool for a specific group of specialists. However, they are not necessarily required by the majority of the upcoming average users of point clouds. In addition to the complexity and high cost of these packages, they generally rely on expensive and modern hardware and are only compatible with one specific operating system. Many point cloud customers are not point cloud processing experts or willing to spend the high acquisition costs of this expensive software and hardware. In this paper we introduce a solution for low cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of software, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. Our simple and user oriented design improves the user experience and empowers us to optimize our methods for the creation of efficient software. In this paper we introduce the Pointo family as a series of connected software tools, providing easy to use tools with simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family to provide a

  20. Direct-to-physician and direct-to-consumer advertising: Time to have stringent regulations.

    Science.gov (United States)

    Kannan, S; Gowri, S; Tyagi, V; Kohli, S; Jain, R; Kapil, P; Bhardwaj, A

    2015-01-01

    the opinion regarding DTCA, 69.9% of physicians had a patient discussing DTCA that was clinically inappropriate. One hundred (64.5%) out of 155 physicians opined that DTCA encourages patients to attend physicians regarding preventive healthcare. On the contrary, 82/155 (52.9%) physicians felt that DTCA would damage the same. Similarly, 69 out of the total 100 patients felt that drug advertisements help them to have better discussions with their treating physicians. Surprisingly, a large majority (91/100) were of the opinion that only safe drugs are allowed to be advertised. To conclude, from the findings of this study both physicians and patients should be cautious and not overzealous while dealing with drug advertisements or promotional literature. More stringent scrutiny, and the issue of WLs or blacklisting of indulging pharmaceutical companies by the regulatory agency, are mandatory to contain the same.

  1. Microbiological Monitoring for the Constellation Program: Current Requirements and Future Considerations

    Science.gov (United States)

    Ott, C. Mark

    2007-01-01

    Microbiological requirements for spaceflight are based on assessments of infectious disease risk which could impact crew health or mission success. The determination of risk from infectious disease is composed of several factors including (1) crew susceptibility, (2) crew exposure to the infectious disease agent, (3) the concentration of the infectious agent, and (4) the characteristics of the infectious agent. As a result of the Health Stabilization Program, stringent monitoring, and cleaning protocols, in-flight environmental microbial monitoring is not necessary for short-duration spaceflights. However, risk factors change for long-duration missions, as exemplified by the presence of medically significant organisms in the environments of both the Mir and International Space Station (ISS). Based upon this historical evidence, requirements for short-duration use aboard the Orion Crew Exploration Vehicle and Lunar Lander Vehicle will not include in-flight monitoring; however, as mission duration increases with a Lunar Outpost, an ability to detect microbial hazards will be necessary. The nature of the detection requirements will depend on the maturity of technology in a rapidly evolving marketplace. Regardless, the hardware will still need to maximize information to discipline experts and the crew, while minimizing the size, mass, power consumption, and crew time usage. The refinement of these monitors will be a major goal in our efforts to travel successfully to Mars.

  2. Environmental requirements in thermochemical and biochemical conversion of biomass

    International Nuclear Information System (INIS)

    Frings, R.M.; Mackie, K.L.; Hunter, I.R.

    1992-01-01

    Many biological and thermochemical processing options exist for the conversion of biomass to fuels. Commercially, these options are assessed in terms of fuel product yield and quality. However, attention must also be paid to the environmental aspects of each technology so that any commercial plant can meet the increasingly stringent environmental legislation in the world today. The environmental aspects of biological conversion (biogasification and bioliquefaction) and thermal conversion (high pressure liquefaction, flash pyrolysis, and gasification) are reviewed. Biological conversion processes are likely to generate waste streams which are more treatable than those from thermal conversion processes but the available data for thermal liquefaction are very limited. Close attention to waste minimisation is recommended and processing options that greatly reduce or eliminate waste streams have been identified. Product upgrading and its effect on wastewater quality also requires attention. Emphasis in further research studies needs to be placed on providing authentic waste streams for environmental assessment. (author)

  3. Fuel processing requirements and techniques for fuel cell propulsion power

    Science.gov (United States)

    Kumar, R.; Ahmed, S.; Yu, M.

    Fuels for fuel cells in transportation systems are likely to be methanol, natural gas, hydrogen, propane, or ethanol. Fuels other than hydrogen will need to be reformed to hydrogen on-board the vehicle. The fuel reformer must meet stringent requirements for weight and volume, product quality, and transient operation. It must be compact and lightweight, must produce low levels of CO and other byproducts, and must have rapid start-up and good dynamic response. Catalytic steam reforming, catalytic or noncatalytic partial oxidation reforming, or some combination of these processes may be used. This paper discusses salient features of the different kinds of reformers and describes the catalysts and processes being examined for the oxidation reforming of methanol and the steam reforming of ethanol. Effective catalysts and reaction conditions for the former have been identified; promising catalysts and reaction conditions for the latter are being investigated.

  4. Integrated approach for power quality requirements at the point of connection

    NARCIS (Netherlands)

    Cobben, J.F.G.; Bhattacharyya, S.; Myrzik, J.M.A.; Kling, W.L.

    2007-01-01

    Given the nature of electricity, every party connected to the power system influences voltage quality, which means that every party also should meet requirements. In this field, a sound coordination among technical standards (system-related, installation-related and product-related) is of paramount

  5. Maximum power point tracker based on fuzzy logic

    International Nuclear Information System (INIS)

    Daoud, A.; Midoun, A.

    2006-01-01

    Solar energy is used as the power source in photovoltaic power systems, and an intelligent power management system is important to obtain the maximum power from the limited solar panels. With the changing of the sun's illumination, due to variation of the angle of incidence of solar radiation and of the temperature of the panels, a Maximum Power Point Tracker (MPPT) enables optimization of solar power generation. The MPPT is a sub-system designed to extract the maximum power from a power source. In the case of a solar panel power source, the maximum power point varies as a result of changes in its electrical characteristics, which in turn are functions of radiation dose, temperature, ageing and other effects. The MPPT maximizes the power output from the panels for a given set of conditions by detecting the best working point of the power characteristic and then controlling the current through the panels or the voltage across them. Many MPPT methods have been reported in the literature. These MPPT techniques can be classified into three main categories: lookup table methods, hill climbing methods and computational methods. The techniques vary according to the degree of sophistication, processing time and memory requirements. The perturbation and observation algorithm (hill climbing technique) is commonly used due to its ease of implementation and relative tracking efficiency. However, it has been shown that when the insolation changes rapidly, the perturbation and observation method is slow to track the maximum power point. In recent years, fuzzy controllers have been used for maximum power point tracking. This method only requires the linguistic control rules for the maximum power point; the mathematical model is not required and therefore the implementation of this control method in a real control system is easy. In this paper, we present a simple robust MPPT using fuzzy set theory, where the hardware consists of the microchip's microcontroller unit control card and
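    For comparison with the fuzzy controller, the hill-climbing (perturbation and observation) loop mentioned above can be sketched in a few lines of Python; panel_power() is a hypothetical single-peak P-V curve standing in for a real measurement, and the step size and iteration count are arbitrary.

      def panel_power(v):
          # Toy P-V curve with a single maximum near 10 V (illustration only).
          return max(0.0, v * (8.0 - 0.4 * v))

      def perturb_and_observe(v0=5.0, step=0.1, iterations=200):
          v, p_prev, direction = v0, panel_power(v0), +1
          for _ in range(iterations):
              v += direction * step
              p = panel_power(v)
              if p < p_prev:            # power dropped: reverse the perturbation direction
                  direction = -direction
              p_prev = p
          return v, p_prev

      v_mpp, p_mpp = perturb_and_observe()
      print(f"settled near {v_mpp:.1f} V, {p_mpp:.1f} W")   # oscillates about the maximum power point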

  6. EDGAR CO2 purity. Type and quantities of impurities related to CO2 point source and capture technology. A Literature study

    Energy Technology Data Exchange (ETDEWEB)

    Walspurger, S.; Van Dijk, H.A.J. [ECN Biomass and Energy Efficiency, Petten (Netherlands)

    2012-08-15

    Carbon capture and storage (CCS) is an important tool that will contribute significantly to CO2 emissions abatement in both the power and industrial sectors. Capture technologies as well as transport and distribution infrastructure development need to be carried out to ensure efficient CO2 separation and safe transport to storage sites. This study aimed at identifying, and when possible quantifying, the impurities present in CO2 streams resulting from various CO2 capture plants, such that challenges in the development of appropriate materials and cleaning technologies for future CCS infrastructure may be anticipated. In its first part, the study provides a description of the characteristics of the different CO2 capture technologies with respect to their response to different types and quantities of impurities, striving to describe realistic combinations of point sources and capture technologies. The composition of CO2 gaseous streams was found to be highly dependent upon the type of CO2 point source and the removal technology selected. In most of the capture processes, most impurity concentrations may be minimised by fine tuning of process operation. However, plant economics eventually govern the impurity level in the CO2 stream. For mature technologies such as absorption by chemical or physical solvents, impurity levels were found to be theoretically quite low, but when the energy spent for regeneration is lowered, or when second-generation capture processes with lower energy requirements are considered, the impurity level in the CO2 stream increases. Accordingly, the report also addresses the conditioning technologies that are available or need to be developed for removal of trace elements such as mercury, volatile compounds and other condensables, and points to technologies to be developed, especially for the removal of sulphur compounds from CO2. In its final part the report addresses the quantification of future specifications and concludes, based on the literature study, that pipeline

  7. Maximum power point tracking

    International Nuclear Information System (INIS)

    Enslin, J.H.R.

    1990-01-01

    A well engineered renewable remote energy system, utilizing the principle of Maximum Power Point Tracking, can be more cost effective, has a higher reliability and can improve the quality of life in remote areas. This paper reports that a highly efficient power electronic converter, for converting the output voltage of a solar panel or wind generator to the required DC battery bus voltage, has been realized. The converter is controlled to track the maximum power point of the input source under varying input and output parameters. Maximum power point tracking for relatively small systems is achieved by maximization of the output current in a battery charging regulator, using an optimized hill-climbing, inexpensive microprocessor based algorithm. Through practical field measurements it is shown that a minimum input source saving of 15% on 3-5 kWh/day systems can easily be achieved. A total cost saving of at least 10-15% on the capital cost of these systems is achievable for relatively small rating Remote Area Power Supply systems. The advantages are much greater for larger temperature variations and larger power rated systems. Other advantages include optimal sizing and system monitoring and control

  8. A chemical-genetic strategy reveals distinct temporal requirements for SAD-1 kinase in neuronal polarization and synapse formation

    Directory of Open Access Journals (Sweden)

    Shokat Kevan M

    2008-09-01

    Full Text Available Abstract Background: Neurons assemble into a functional network through a sequence of developmental processes including neuronal polarization and synapse formation. In Caenorhabditis elegans, the serine/threonine SAD-1 kinase is essential for proper neuronal polarity and synaptic organization. To determine if SAD-1 activity regulates the establishment or maintenance of these neuronal structures, we examined its temporal requirements using a chemical-genetic method that allows for selective and reversible inactivation of its kinase activity in vivo. Results: We generated a PP1 analog-sensitive variant of SAD-1. Through temporal inhibition of SAD-1 kinase activity we show that its activity is required for the establishment of both neuronal polarity and synaptic organization. However, while SAD-1 activity is needed strictly when neurons are polarizing, the temporal requirement for SAD-1 is less stringent in synaptic organization, which can also be re-established during maintenance. Conclusion: This study reports the first temporal analysis of a neural kinase activity using the chemical-genetic system. It reveals that neuronal polarity and synaptic organization have distinct temporal requirements for SAD-1.

  9. Dose point kernels for beta-emitting radioisotopes

    International Nuclear Information System (INIS)

    Prestwich, W.V.; Chan, L.B.; Kwok, C.S.; Wilson, B.

    1986-01-01

    Knowledge of the dose point kernel corresponding to a specific radionuclide is required to calculate the spatial dose distribution produced in a homogeneous medium by a distributed source. Dose point kernels for commonly used radionuclides have been calculated previously using as a basis monoenergetic dose point kernels derived by numerical integration of a model transport equation. That treatment neglects fluctuations in energy deposition, an effect which was later incorporated in dose point kernels calculated using Monte Carlo methods. This work describes new calculations of dose point kernels using the Monte Carlo results as a basis. An analytic representation of the monoenergetic dose point kernels has been developed. This provides a convenient method both for calculating the dose point kernel associated with a given beta spectrum and for incorporating the effect of internal conversion. An algebraic expression for allowed beta spectra has been obtained through an extension of the Bethe-Bacher approximation, and tested against the exact expression. Simplified expressions for first-forbidden shape factors have also been developed. A comparison of the calculated dose point kernel for 32P with experimental data indicates good agreement, with a significant improvement over the earlier results in this respect. An analytic representation of the dose point kernel associated with the spectrum of a single beta group has been formulated. 9 references, 16 figures, 3 tables

  10. Operating point considerations for the Reference Theta-Pinch Reactor (RTPR)

    International Nuclear Information System (INIS)

    Krakowski, R.A.; Miller, R.L.; Hagenson, R.L.

    1976-01-01

    Aspects of the continuing engineering design-point reassessment and optimization of the Reference Theta-Pinch Reactor (RTPR) are discussed. An updated interim design point which achieves a favorable energy balance and involves relaxed technological requirements, which nonetheless satisfy more rigorous physics and engineering constraints, is presented

  11. Packaging configurations and handling requirements for nuclear materials

    International Nuclear Information System (INIS)

    Jefferson, R.M.

    1981-01-01

    The basic safety concepts for radioactive material are that the package is the primary protection for the public, that the protection afforded by the package should be proportional to the hazard and that the package must be proved by performance. These principles are contained in Department of Energy (DOE), Nuclear Regulatory Commission (NRC) and Department of Transportation (DOT) regulations which classify hazards of various radioactive materials and link packaging requirements to the physical form and quantities being shipped. Packaging requirements are reflected in performance standards to guarantee that shipments of low hazard quantities will survive the rigors of normal transportation and that shipments of high hazard quantities will survive extreme severity transportation accidents. Administrative controls provide for segregation of radioactive material from people and other sensitive or hazardous material. They also provide the necessary information function to control the total amounts in a conveyance and to assure that appropriate emergency response activities be started in case of accidents or other emergencies. Radioactive materials shipped in conjunction with the nuclear reactor programs include ores, concentrates, gaseous diffusion feedstocks, enriched and depleted uranium, fresh fuel, spent fuel, high level wastes, low level wastes and transuranic wastes. Each material is packaged and shipped in accordance with regulations and all hazard classes, quantity limits and packaging types are called into use. From the minimal requirements needed to ship the low hazard uranium ores or concentrates to the very stringent requirements in packaging and moving high level wastes or spent fuel, the regulatory system provides a means for carrying out transportation of radioactive material which assures low and controlled risk to the public

  12. Reassessing Function Points

    Directory of Open Access Journals (Sweden)

    G.R. Finnie

    1997-05-01

    Accurate estimation of the size and development effort for software projects requires estimation models which can be used early enough in the development life cycle to be of practical value. Function Point Analysis (FPA) has become possibly the most widely used estimation technique in practice. However, the technique was developed in the data processing environment of the 1970s and, despite undergoing considerable reassessment and formalisation, still attracts criticism for the weighting scheme it employs and for the way in which the function point score is adapted for specific system characteristics. This paper reviews the validity of the weighting scheme and the value of adjusting for system characteristics by studying their effect in a sample of 299 software developments. In general, the value adjustment scheme does not appear to cater for differences in productivity. The weighting scheme used to adjust system components in terms of being simple, average or complex also appears suspect and should be redesigned to provide a more realistic estimate of system functionality.
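
    For readers unfamiliar with the scheme being critiqued, the following sketch reproduces the commonly published IFPUG-style function point count: components are weighted as simple, average or complex, and the unadjusted total is scaled by a value adjustment factor built from 14 general system characteristics. The weights and the 0.65 + 0.01·ΣGSC formula are standard textbook values, not figures taken from this paper.

```python
# Standard complexity weights for the five function types (simple, average, complex).
WEIGHTS = {
    "EI":  (3, 4, 6),    # external inputs
    "EO":  (4, 5, 7),    # external outputs
    "EQ":  (3, 4, 6),    # external inquiries
    "ILF": (7, 10, 15),  # internal logical files
    "EIF": (5, 7, 10),   # external interface files
}
LEVEL = {"simple": 0, "average": 1, "complex": 2}

def unadjusted_fp(counts):
    """counts: {(function_type, complexity): number_of_components}"""
    return sum(n * WEIGHTS[ftype][LEVEL[cplx]] for (ftype, cplx), n in counts.items())

def value_adjustment_factor(gsc_ratings):
    """gsc_ratings: 14 general system characteristic ratings, each 0..5."""
    return 0.65 + 0.01 * sum(gsc_ratings)

counts = {("EI", "average"): 12, ("EO", "complex"): 5, ("ILF", "simple"): 4}
ufp = unadjusted_fp(counts)                      # 111 unadjusted function points
afp = ufp * value_adjustment_factor([3] * 14)    # adjusted function points
print(ufp, afp)
```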

  13. Feasibility of Smartphone Based Photogrammetric Point Clouds for the Generation of Accessibility Maps

    Science.gov (United States)

    Angelats, E.; Parés, M. E.; Kumar, P.

    2018-05-01

    Accessible cities with accessible services are a long-standing demand of people with reduced mobility. But this demand is still far from becoming a reality, as a lot of work remains to be done. The first step towards accessible cities is to know the real situation of the cities and their pavement infrastructure. Detailed maps or databases on street slopes, access to sidewalks, mobility in public parks and gardens, etc. are required. In this paper, we propose to use smartphone based photogrammetric point clouds as a starting point to create accessibility maps or databases. This paper analyses the performance of these point clouds and the complexity of the image acquisition procedure required to obtain them. The paper proves, through two test cases, that smartphone technology is an economical and feasible solution to obtain the required information, which is quite often sought by city planners to generate accessibility maps. The proposed approach paves the way to generating, in the near term, accessibility maps through the use of point clouds derived from crowdsourced smartphone imagery.

  14. Zero risk fuel fabrication: a systems analysis

    International Nuclear Information System (INIS)

    1979-01-01

    Zero risk is a concept used to ensure that system requirements are developed through a systems approach such that the choice(s) among alternatives represents the balanced viewpoints of performance, achievability and risk. Requirements to ensure characteristics such as stringent accountability and low personnel exposure are needed to guide the development of components and subsystems for future LMFBR fuel supply systems. To establish a consistent and objective set of requirements, RF and M-TMC has initiated a systems requirements analysis activity. This activity pivots on judgement and experience provided by a Task Force representing industrial companies engaged in fuel fabrication in licensed facilities. The Task Force members are listed in Appendix A. Input developed by this group is presented as a starting point for the systems requirements analysis

  15. Validation of intermediate end points in cancer research.

    Science.gov (United States)

    Schatzkin, A; Freedman, L S; Schiffman, M H; Dawsey, S M

    1990-11-21

    Investigations using intermediate end points as cancer surrogates are quicker, smaller, and less expensive than studies that use malignancy as the end point. We present a strategy for determining whether a given biomarker is a valid intermediate end point between an exposure and incidence of cancer. Candidate intermediate end points may be selected from case series, ecologic studies, and animal experiments. Prospective cohort and sometimes case-control studies may be used to quantify the intermediate end point-cancer association. The most appropriate measure of this association is the attributable proportion. The intermediate end point is a valid cancer surrogate if the attributable proportion is close to 1.0, but not if it is close to 0. Usually, the attributable proportion is close to neither 1.0 nor 0; in this case, valid surrogacy requires that the intermediate end point mediate an established exposure-cancer relation. This would in turn imply that the exposure effect would vanish if adjusted for the intermediate end point. We discuss the relative advantages of intervention and observational studies for the validation of intermediate end points. This validation strategy also may be applied to intermediate end points for adverse reproductive outcomes and chronic diseases other than cancer.
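
    A small illustration of the attributable proportion used above as the measure of the intermediate end point-cancer association, computed here with the standard (Levin) population attributable fraction formula from the prevalence of the end point and the relative risk; the numbers are invented for illustration.

```python
def attributable_proportion(prevalence, relative_risk):
    """Population attributable proportion (Levin's formula).

    prevalence    : fraction of the population carrying the intermediate end point
    relative_risk : cancer risk in those with the end point relative to those without
    """
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# If 30% of a cohort shows the biomarker and their cancer risk is 4-fold higher,
# roughly 47% of cases are attributable to (mediated through) the end point.
print(round(attributable_proportion(0.30, 4.0), 2))
```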

  16. CMOS Cell Sensors for Point-of-Care Diagnostics

    Science.gov (United States)

    Adiguzel, Yekbun; Kulah, Haluk

    2012-01-01

    The burden of health-care related services in a global era with continuously increasing population and inefficient dissipation of the resources requires effective solutions. From this perspective, point-of-care diagnostics is a demanded field in clinics. It is also necessary both for prompt diagnosis and for providing health services evenly throughout the population, including the rural districts. The requirements can only be fulfilled by technologies whose productivity has already been proven, such as complementary metal-oxide-semiconductors (CMOS). CMOS-based products can enable clinical tests in a fast, simple, safe, and reliable manner, with improved sensitivities. Portability due to diminished sensor dimensions and compactness of the test set-ups, along with low sample and power consumption, is another vital feature. CMOS-based sensors for cell studies have the potential to become essential counterparts of point-of-care diagnostics technologies. Hence, this review attempts to inform on the sensors fabricated with CMOS technology for point-of-care diagnostic studies, with a focus on CMOS image sensors and capacitance sensors for cell studies. PMID:23112587

  17. Accuracy Constraint Determination in Fixed-Point System Design

    Directory of Open Access Journals (Sweden)

    Serizel R

    2008-01-01

    Most digital signal processing applications are specified and designed with floating-point arithmetic but are finally implemented using fixed-point architectures. Thus, the design flow requires a floating-point to fixed-point conversion stage which optimizes the implementation cost under execution time and accuracy constraints. This accuracy constraint is linked to the application performance, and the determination of this constraint is one of the key issues of the conversion process. In this paper, a method is proposed to determine the accuracy constraint from the application performance. The fixed-point system is modeled with an infinite-precision version of the system and a single noise source located at the system output. Then, an iterative approach for optimizing the fixed-point specification under the application performance constraint is defined and detailed. Finally, the efficiency of our approach is demonstrated by experiments on an MP3 encoder.
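
    A rough sketch of the single-noise-source idea described above: the fixed-point system is treated as its infinite-precision output plus one quantisation noise source, and the smallest fractional word-length satisfying an application-level performance constraint is found by iteration. The uniform-noise model, the SNR metric and the test signal are illustrative assumptions, not the paper's MP3 experiment.

```python
import numpy as np

def min_fractional_bits(reference_output, metric, constraint, max_bits=32):
    """Find the smallest fractional word-length whose output quantisation noise
    still satisfies an application-level performance constraint.

    The fixed-point system is modelled as the infinite-precision output plus a
    single uniform noise source of step q = 2**-bits (single-noise-source model).
    """
    rng = np.random.default_rng(0)
    for bits in range(1, max_bits + 1):
        q = 2.0 ** -bits
        noisy = reference_output + rng.uniform(-q / 2, q / 2, size=reference_output.shape)
        if metric(reference_output, noisy) >= constraint:
            return bits
    raise ValueError("constraint not reachable within max_bits")

def snr_db(ref, noisy):
    return 10 * np.log10(np.sum(ref**2) / np.sum((ref - noisy)**2))

x = np.sin(2 * np.pi * 440 * np.arange(8000) / 8000)   # stand-in "application" output
print(min_fractional_bits(x, snr_db, constraint=60.0))  # bits needed for >= 60 dB SNR
```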

  18. MIMIC: An Innovative Methodology for Determining Mobile Laser Scanning System Point Density

    Directory of Open Access Journals (Sweden)

    Conor Cahalane

    2014-08-01

    Understanding how various Mobile Mapping System (MMS) laser hardware configurations and operating parameters influence point density differently is important for assessing system performance, which in turn facilitates system design and MMS benchmarking. Point density also influences data processing, as objects that can be recognised using automated algorithms generally require a minimum point density. Although obtaining the necessary point density impacts hardware costs, survey time and data storage requirements, a method for accurately and rapidly assessing MMS performance is lacking for generic MMSs. We have developed a method for quantifying point clouds collected by an MMS with respect to known objects at specified distances using 3D surface normals, 2D geometric formulae and line drawing algorithms. These algorithms were combined in a system called the Mobile Mapping Point Density Calculator (MIMIC) and were validated using point clouds captured by both a single scanner and a dual scanner MMS. Results from MIMIC were promising: when considering the number of scan profiles striking the target, the average error equated to less than 1 point per scan profile. These tests highlight that MIMIC is capable of accurately calculating point density for both single and dual scanner MMSs.

  19. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  20. A rapid response air quality analysis system for use in projects having stringent quality assurance requirements

    International Nuclear Information System (INIS)

    Bowman, A.W.

    1990-01-01

    This paper describes an approach to solving air quality problems which frequently occur during iterations of the baseline change process. From a schedule standpoint, it is desirable to perform this evaluation in as short a time as possible, while budgetary pressures limit the size of the staff available to do the work. Without a method in place to deal with baseline change proposal requests, the environmental analysts may not be able to produce the analysis results in the time frame expected. Using a concept called the Rapid Response Air Quality Analysis System (RAAS), the problems of timing and cost become tractable. The system could be adapted to assess other atmospheric pathway impacts, e.g., acoustics or visibility. The air quality analysis system used to perform the environmental assessment (EA) for the Salt Repository Project (part of the Civilian Radioactive Waste Management Program), and later to evaluate the consequences of proposed baseline changes, consists of three components: emission source data files; emission rates contained in spreadsheets; and impact assessment model codes. The spreadsheets contain user-written codes (macros) that calculate emission rates from (1) emission source data (e.g., numbers and locations of sources, detailed operating schedules, and source specifications including horsepower, load factor, and duty cycle); (2) emission factors such as those published by the U.S. Environmental Protection Agency; and (3) control efficiencies
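
    A spreadsheet-style emission rate calculation of the kind described above might look like the following; the function signature and the example numbers (engine counts, horsepower, load factor, duty cycle, emission factor, control efficiency) are hypothetical placeholders rather than Salt Repository Project data.

```python
def hourly_emission_rate(n_sources, horsepower, load_factor, duty_cycle,
                         emission_factor_g_per_hp_hr, control_efficiency=0.0):
    """Emission rate (g/hr) for a group of identical combustion sources.

    Mirrors the spreadsheet-style calculation described above: source counts and
    specifications times an emission factor, reduced by any control efficiency.
    """
    return (n_sources * horsepower * load_factor * duty_cycle
            * emission_factor_g_per_hp_hr * (1.0 - control_efficiency))

# e.g. 6 diesel engines, 300 hp, 60% load, running 50% of the hour,
# an EPA-style NOx factor of 14 g/hp-hr and 30% control efficiency
print(hourly_emission_rate(6, 300, 0.60, 0.50, 14.0, 0.30))   # ~5292 g/hr
```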

  1. Coarse point cloud registration by EGI matching of voxel clusters

    NARCIS (Netherlands)

    Wang, J.; Lindenbergh, R.C.; Shen, Y.; Menenti, M.

    2016-01-01

    Laser scanning samples the surface geometry of objects efficiently and records versatile information as point clouds. However, often more scans are required to fully cover a scene. Therefore, a registration step is required that transforms the different scans into a common coordinate system. The

  2. Temperature impacts on economic growth warrant stringent mitigation policy

    Science.gov (United States)

    Moore, Frances C.; Diaz, Delavane B.

    2015-02-01

    Integrated assessment models compare the costs of greenhouse gas mitigation with damages from climate change to evaluate the social welfare implications of climate policy proposals and inform optimal emissions reduction trajectories. However, these models have been criticized for lacking a strong empirical basis for their damage functions, which do little to alter assumptions of sustained gross domestic product (GDP) growth, even under extreme temperature scenarios. We implement empirical estimates of temperature effects on GDP growth rates in the DICE model through two pathways, total factor productivity growth and capital depreciation. This damage specification, even under optimistic adaptation assumptions, substantially slows GDP growth in poor regions but has more modest effects in rich countries. Optimal climate policy in this model stabilizes global temperature change below 2 °C by eliminating emissions in the near future and implies a social cost of carbon several times larger than previous estimates. A sensitivity analysis shows that the magnitude of climate change impacts on economic growth, the rate of adaptation, and the dynamic interaction between damages and GDP are three critical uncertainties requiring further research. In particular, optimal mitigation rates are much lower if countries become less sensitive to climate change impacts as they develop, making this a major source of uncertainty and an important subject for future research.
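
    As a toy illustration of the growth-rate (rather than level) damage pathway discussed above, the sketch below compounds a GDP path in which warming subtracts from the annual growth rate; the damage coefficient and warming path are invented and are not the paper's empirical estimates.

```python
def gdp_path(g0, temperature, gamma, years):
    """Project GDP with warming acting on the growth *rate* rather than its level.

    g0          : baseline annual growth rate (e.g. 0.02)
    temperature : list of global temperature anomalies (deg C) per year
    gamma       : growth-rate loss per deg C (illustrative, not the paper's estimate)
    """
    gdp = [1.0]
    for t in range(years):
        growth = g0 - gamma * temperature[t]
        gdp.append(gdp[-1] * (1.0 + growth))
    return gdp

warming = [0.02 * t for t in range(100)]              # simple linear warming path
print(gdp_path(0.02, warming, gamma=0.005, years=100)[-1])
```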

  3. A New Blind Pointing Model Improves Large Reflector Antennas Precision Pointing at Ka-Band (32 GHz)

    Science.gov (United States)

    Rochblatt, David J.

    2009-01-01

    The National Aeronautics and Space Administration (NASA), Jet Propulsion Laboratory (JPL)-Deep Space Network (DSN) subnet of 34-m Beam Waveguide (BWG) Antennas was recently upgraded with Ka-Band (32-GHz) frequency feeds for space research and communication. For normal telemetry tracking a Ka-Band monopulse system is used, which typically yields 1.6-mdeg mean radial error (MRE) pointing accuracy on the 34-m diameter antennas. However, for the monopulse to be able to acquire and lock, for special radio science applications where monopulse cannot be used, or as a back-up for the monopulse, high-precision open-loop blind pointing is required. This paper describes a new 4th order pointing model and calibration technique, which was developed and applied to the DSN 34-m BWG antennas yielding 1.8 to 3.0-mdeg MRE pointing accuracy and amplitude stability of 0.2 dB, at Ka-Band, and successfully used for the CASSINI spacecraft occultation experiment at Saturn and Titan. In addition, the new 4th order pointing model was used during a telemetry experiment at Ka-Band (32 GHz) utilizing the Mars Reconnaissance Orbiter (MRO) spacecraft while at a distance of 0.225 astronomical units (AU) from Earth and communicating with a DSN 34-m BWG antenna at a record high rate of 6-megabits per second (Mb/s).
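
    The abstract does not give the functional form of the new 4th-order model, so the following is only a generic sketch of how a pointing-error model of a given polynomial order could be fitted to calibration offsets by least squares; the term basis, coordinates and synthetic data are assumptions made for illustration.

```python
import numpy as np

def fit_pointing_model(az, el, offsets, order=4):
    """Least-squares fit of pointing offsets to a bivariate polynomial in
    azimuth and elevation up to the given total order.

    az, el  : calibration-source coordinates (radians)
    offsets : measured pointing offsets (mdeg)
    Returns the monomial exponents and the fitted coefficients.
    """
    terms = [(i, j) for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack([az**i * el**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, offsets, rcond=None)
    return terms, coeffs

# synthetic calibration set (illustrative only)
rng = np.random.default_rng(1)
az = rng.uniform(0, 2 * np.pi, 200)
el = rng.uniform(0.2, 1.4, 200)
offsets = 2.0 * np.sin(az) * el**2 + rng.normal(0, 0.1, 200)
terms, c = fit_pointing_model(az, el, offsets)
print(len(terms), c[:3])
```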

  4. Concurrent growth rate and transcript analyses reveal essential gene stringency in Escherichia coli.

    Directory of Open Access Journals (Sweden)

    Shan Goh

    BACKGROUND: Genes essential for bacterial growth are of particular scientific interest. Many putative essential genes have been identified or predicted in several species; however, little is known about gene expression requirement stringency, which may be an important aspect of bacterial physiology and likely a determining factor in drug target development. METHODOLOGY/PRINCIPAL FINDINGS: Working from the premise that essential genes differ in absolute requirement for growth, we describe silencing of putative essential genes in E. coli to obtain a titration of declining growth rates and transcript levels by using antisense peptide nucleic acids (PNA) and expressed antisense RNA. The relationship between mRNA decline and growth rate decline reflects the degree of essentiality, or stringency, of an essential gene, which is here defined by the minimum transcript level for a 50% reduction in growth rate (MTL50). When applied to four growth-essential genes, both RNA silencing methods resulted in MTL50 values that reveal acpP as the most stringently required of the four genes examined, with ftsZ the next most stringently required. The established antibacterial targets murA and fabI were less stringently required. CONCLUSIONS: RNA silencing can reveal stringent requirements for gene expression with respect to growth. This method may be used to validate existing essential genes and to quantify drug target requirement.
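
    A minimal sketch of how an MTL50 value could be extracted from a silencing titration: interpolate the transcript level at which the growth rate falls to 50% of the untreated control. The titration values below are invented, not data from the paper.

```python
import numpy as np

def mtl50(transcript_levels, growth_rates):
    """Minimum transcript level giving a 50% reduction in growth rate (MTL50).

    transcript_levels : mRNA levels (fraction of untreated control)
    growth_rates      : matching growth rates (fraction of untreated control)
    Linear interpolation between the two titration points bracketing 50% growth.
    """
    t = np.asarray(transcript_levels, dtype=float)
    g = np.asarray(growth_rates, dtype=float)
    order = np.argsort(g)                     # interpolate along the growth axis
    return float(np.interp(0.5, g[order], t[order]))

# illustrative titration from antisense silencing (hypothetical numbers)
levels = [1.00, 0.70, 0.45, 0.30, 0.15]
growth = [1.00, 0.90, 0.72, 0.48, 0.20]
print(mtl50(levels, growth))   # transcript level at which growth drops to 50%
```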

  5. Conference: ActiWiz – Optimizing material selection at CERN's accelerators from the radiological point of view

    CERN Multimedia

    2012-01-01

    by Dr. Helmut Vincke (CERN), Chris Theis (CERN). Tuesday, October 30, 2012 from 15:00 to 16:30 at CERN ( 864-1-D02 - BE Auditorium Prévessin ) Description: The operation of a high-energy accelerator inevitably triggers the activation of equipment, which poses a safety hazard. Consequently access and handling constraints have to be imposed to ensure optimized working conditions. One of the key parameters determining the level of radioactivity is the material composition. Considering the radiological impact in addition to the engineering requirements during the selection of material clearly results in a safety benefit as well as a more efficient accelerator operation due to less stringent access and handling constraints. Another aspect is the minimization of future radioactive waste, which constitutes an important part of CERN’s commitment to limit its environmental impact by applying best practices. The ActiWiz software developed at CERN provides an easy to use method to optimize the m...

  6. Real World SharePoint 2010 Indispensable Experiences from 22 MVPs

    CERN Document Server

    Hillier, Scot; Bishop, Darrin; Bleeker, Todd; Bogue, Robert; Bosch, Karine; Brotto, Claudio; Buenz, Adam; Connell, Andrew; Drisgill, Randy; Lapointe, Gary; Medero, Jason; Molnar, Agnes; O'Brien, Chris; Klindt, Todd; Poelmans, Joris; Rehmani, Asif; Ross, John; Swan, Nick; Walsh, Mike; Williams, Randy; Young, Shane; Macori, Igor

    2010-01-01

    Proven real-world best practices from leading Microsoft SharePoint MVPsSharePoint enables Web sites to host shared workspaces and is a leading solution for Enterprise Content Management. The newest version boasts significant changes, impressive enhancements, and new features, requiring developers and administrators of all levels of experience to quickly get up to speed on the latest changes. This book is a must-have anthology of current best practices for SharePoint 2010 from 20 of the top SharePoint MVPs. They offer insider advice on everything from installation, workflow, and Web parts to bu

  7. Developing control points for halal slaughtering of poultry.

    Science.gov (United States)

    Shahdan, I A; Regenstein, J M; Shahabuddin, A S M; Rahman, M T

    2016-07-01

    Halal (permissible or lawful) poultry meat production must meet industry, economic, and production needs, and government health requirements without compromising the Islamic religious requirements derived from the Qur'an and the Hadiths (the actions and sayings of the Prophet Muhammad, peace and blessings be upon him). Halal certification authorities may vary in their interpretation of these teachings, which leads to differences in halal slaughter requirements. The current study proposes 6 control points (CP) for halal poultry meat production based on the most commonly used halal production systems. CP 1 describes what is allowed and prohibited, such as blood and animal manure, and feed ingredients for halal poultry meat production. CP 2 describes the requirements for humane handling during lairage. CP 3 describes different methods for immobilizing poultry, when immobilization is used, such as water bath stunning. CP 4 describes the importance of intention, details of the halal slaughter, and the equipment permitted. CP 5 and CP 6 describe the requirements after the neck cut has been made such as the time needed before the carcasses can enter the scalding tank, and the potential for meat adulteration with fecal residues and blood. It is important to note that the proposed halal CP program is presented as a starting point for any individual halal certifying body to improve its practices. © 2016 Poultry Science Association Inc.

  8. Accounting for professionalism: an innovative point system to assess resident professionalism

    Directory of Open Access Journals (Sweden)

    Gary L. Malakoff

    2014-04-01

    Background: Professionalism is a core competency for residency required by the Accreditation Council of Graduate Medical Education. We sought a means to objectively assess professionalism among internal medicine and transitional year residents. Innovation: We established a point system to document unprofessional behaviors demonstrated by internal medicine and transitional year residents, along with opportunities to redeem such negative points by deliberate positive professional acts. The intent of the policy is to assist residents in becoming aware of what constitutes unprofessional behavior and to provide opportunities for remediation by accruing positive points. A committee of core faculty and department leadership, including the program director and clinic nurse manager, determines the professionalism points assigned. Negative points might be awarded for tardiness to mandatory or volunteered-for events without a valid excuse, late evaluations or other paperwork required by the department, non-attendance at meetings prepaid by the department, and inappropriate use of personal days or leave. Examples of actions through which positive points can be gained to erase negative points include delivery of a mentored pre-conference talk, noon conference, medical student case/shelf review session, or a written reflection. Results: Between 2009 and 2012, 83 residents have trained in our program. Seventeen categorical internal medicine and two transitional year residents have been assigned points. A total of 55 negative points have been assigned and 19 points have been remediated. There appears to be a trend of fewer negative points and more positive points being assigned over each of the past three academic years. Conclusion: Commitment to personal professional behavior is a lifelong process that residents must commit to during their training. A professionalism policy, which employs a point system, has been instituted in our programs and may be a novel tool to

  9. Pointing control using a moving base of support.

    Science.gov (United States)

    Hondzinski, Jan M; Kwon, Taegyong

    2009-07-01

    The purposes of this study were to determine whether gaze direction provides a control signal for movement direction for a pointing task requiring a step and to gain insight into why discrepancies previously identified in the literature for endpoint accuracy with gaze directed eccentrically exist. Straight arm pointing movements were performed to real and remembered target locations, either toward or 30 degrees eccentric to gaze direction. Pointing occurred in normal room lighting or darkness while subjects sat, stood still or side-stepped left or right. Trunk rotation contributed 22-65% to gaze orientations when it was not constrained. Error differences for different target locations explained discrepancies among previous experiments. Variable pointing errors were influenced by gaze direction, while mean systematic pointing errors and trunk orientations were influenced by step direction. These data support the use of a control strategy that relies on gaze direction and equilibrium inputs for whole-body goal-directed movements.

  10. End points and assessments in esthetic dental treatment.

    Science.gov (United States)

    Ishida, Yuichi; Fujimoto, Keiko; Higaki, Nobuaki; Goto, Takaharu; Ichikawa, Tetsuo

    2015-10-01

    There are two key considerations for successful esthetic dental treatments. This article systematically describes the two key considerations: the end points of esthetic dental treatments and assessments of esthetic outcomes, which are also important for acquiring clinical skill in esthetic dental treatments. The end point and assessment of esthetic dental treatment were discussed through literature reviews and clinical practices. Before designing a treatment plan, the end point of dental treatment should be established. The section entitled "End point of esthetic dental treatment" discusses treatments for maxillary anterior teeth and the restoration of facial profile with prostheses. The process of assessing treatment outcomes entitled "Assessments of esthetic dental treatment" discusses objective and subjective evaluation methods. Practitioners should reach an agreement regarding desired end points with patients through medical interviews, and continuing improvements and developments of esthetic assessments are required to raise the therapeutic level of esthetic dental treatments. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  11. Development of a High Temperature Antenna Pointing Mechanism for BepiColombo Planetary Orbiter

    Science.gov (United States)

    Campo, Pablo; Barrio, Aingeru; Puente, Nicolas; Kyle, Robert

    2013-09-01

    BepiColombo is an ESA mission to Mercury. Its planetary orbiter (MPO) has two antenna pointing mechanisms: the high gain antenna (HGA) pointing mechanism steers and points a large reflector which is integrated at system level by TAS-I Rome, and the medium gain antenna (MGA) APM points a 1.5 m boom with a horn antenna. Both radiating elements are exposed to sun fluxes as high as 10 solar constants without protections. The pointing mechanism is a major challenge as high performances are required in a harsh environment. It has required the development of new technologies and components specially dedicated to the mission needs. Some of the state of the art required for the mission was achieved during the preparatory technology development activities [1]. However, the number of critical elements involved and the difficulties of some areas have required the continuation of the developments, and new research activities had to be launched in the C/D phase. Some of the major concerns and related areas of development are: high temperature and long life requirements for the gearhead motors (up to 15500 equivalent APM revolutions, 19 million motor revolutions); low thermal distortion of the mechanical chain, which must at the same time insulate from the external environment and interfaces (55 arcsec pointing error); low heat leak to the spacecraft (in the order of 50 W per APM); high precision position control, low microvibration noise and error stability in motion (16 arcsec/s); high power radio frequency (18 W in Ka band, 30 W in X band) with phase stability for use in radio science (3 mm in Ka band, 5° in X band); and wide range of motion (full 360° with end-stops). Currently the HGA APM EQM azimuth and elevation stages are assembled and ready for test at actuator level.

  12. Pacific Northwest National Laboratory Facility Radionuclide Emission Points and Sampling Systems

    International Nuclear Information System (INIS)

    Barfuss, Brad C.; Barnett, J. M.; Ballinger, Marcel Y.

    2009-01-01

    Battelle-Pacific Northwest Division operates numerous research and development laboratories in Richland, Washington, including those associated with the Pacific Northwest National Laboratory (PNNL) on the Department of Energy's Hanford Site that have the potential for radionuclide air emissions. The National Emission Standard for Hazardous Air Pollutants (NESHAP 40 CFR 61, Subparts H and I) requires an assessment of all effluent release points that have the potential for radionuclide emissions. Potential emissions are assessed annually. Sampling, monitoring, and other regulatory compliance requirements are designated based upon the potential-to-emit dose criteria found in the regulations. The purpose of this document is to describe the facility radionuclide air emission sampling program and provide current and historical facility emission point system performance, operation, and design information. A description of the buildings, exhaust points, control technologies, and sample extraction details is provided for each registered or deregistered facility emission point. Additionally, applicable stack sampler configuration drawings, figures, and photographs are provided

  13. Pacific Northwest National Laboratory Facility Radionuclide Emission Points and Sampling Systems

    Energy Technology Data Exchange (ETDEWEB)

    Barfuss, Brad C.; Barnett, J. Matthew; Ballinger, Marcel Y.

    2009-04-08

    Battelle—Pacific Northwest Division operates numerous research and development laboratories in Richland, Washington, including those associated with the Pacific Northwest National Laboratory (PNNL) on the Department of Energy’s Hanford Site that have the potential for radionuclide air emissions. The National Emission Standard for Hazardous Air Pollutants (NESHAP 40 CFR 61, Subparts H and I) requires an assessment of all effluent release points that have the potential for radionuclide emissions. Potential emissions are assessed annually. Sampling, monitoring, and other regulatory compliance requirements are designated based upon the potential-to-emit dose criteria found in the regulations. The purpose of this document is to describe the facility radionuclide air emission sampling program and provide current and historical facility emission point system performance, operation, and design information. A description of the buildings, exhaust points, control technologies, and sample extraction details is provided for each registered or deregistered facility emission point. Additionally, applicable stack sampler configuration drawings, figures, and photographs are provided.

  14. Coding and decoding in a point-to-point communication using the polarization of the light beam.

    Science.gov (United States)

    Kavehvash, Z; Massoumian, F

    2008-05-10

    A new technique for coding and decoding of optical signals through the use of polarization is described. In this technique the concept of coding is translated to polarization. In other words, coding is done in such a way that each code represents a unique polarization. This is done by implementing a binary pattern on a spatial light modulator in such a way that the reflected light has the required polarization. Decoding is done by the detection of the received beam's polarization. By linking the concept of coding to polarization we can use each of these concepts in measuring the other one, attaining some gains. In this paper the construction of a simple point-to-point communication where coding and decoding is done through polarization will be discussed.

  15. Asymptotic stability estimates near an equilibrium point

    Science.gov (United States)

    Dumas, H. Scott; Meyer, Kenneth R.; Palacián, Jesús F.; Yanguas, Patricia

    2017-07-01

    We use the error bounds for adiabatic invariants found in the work of Chartier, Murua and Sanz-Serna [3] to bound the solutions of a Hamiltonian system near an equilibrium over exponentially long times. Our estimates depend only on the linearized system and not on the higher order terms as in KAM theory, nor do we require any steepness or convexity conditions as in Nekhoroshev theory. We require that the equilibrium point where our estimate applies satisfy a type of formal stability called Lie stability.

  16. Analysis of point source size on measurement accuracy of lateral point-spread function of confocal Raman microscopy

    Science.gov (United States)

    Fu, Shihang; Zhang, Li; Hu, Yao; Ding, Xiang

    2018-01-01

    Confocal Raman Microscopy (CRM) has matured to become one of the most powerful instruments in analytical science because of its molecular sensitivity and high spatial resolution. Compared with conventional Raman microscopy, CRM can perform three-dimensional mapping of tiny samples and has the advantage of high spatial resolution thanks to its unique pinhole. With the wide application of the instrument, there is a growing requirement for evaluating the imaging performance of the system. The point-spread function (PSF) is an important approach to evaluating the imaging capability of an optical instrument. Among a variety of measurement methods for the PSF, the point source method has been widely used because it is easy to operate and the measurement results approximate the true PSF. In the point source method, the point source size has a significant impact on the final measurement accuracy. In this paper, the influence of the point source size on the measurement accuracy of the PSF is analyzed and verified experimentally. A theoretical model of the lateral PSF for CRM is established and the effect of point source size on the full-width at half maximum of the lateral PSF is simulated. For long-term preservation and measurement convenience, a PSF measurement phantom using polydimethylsiloxane resin doped with different sizes of polystyrene microspheres is designed. The PSFs of the CRM with different sizes of microspheres are measured and the results are compared with the simulation results. The results provide a guide for measuring the PSF of the CRM.

  17. Floating-to-Fixed-Point Conversion for Digital Signal Processors

    Directory of Open Access Journals (Sweden)

    Menard Daniel

    2006-01-01

    Digital signal processing applications are specified with floating-point data types but they are usually implemented in embedded systems with fixed-point arithmetic to minimise cost and power consumption. Thus, methodologies which establish automatically the fixed-point specification are required to reduce the application time-to-market. In this paper, a new methodology for the floating-to-fixed point conversion is proposed for software implementations. The aim of our approach is to determine the fixed-point specification which minimises the code execution time for a given accuracy constraint. Compared to previous methodologies, our approach takes into account the DSP architecture to optimise the fixed-point formats and the floating-to-fixed-point conversion process is coupled with the code generation process. The fixed-point data types and the position of the scaling operations are optimised to reduce the code execution time. To evaluate the fixed-point computation accuracy, an analytical approach is used to reduce the optimisation time compared to the existing methods based on simulation. The methodology stages are described and several experiment results are presented to underline the efficiency of this approach.

  18. Floating-to-Fixed-Point Conversion for Digital Signal Processors

    Science.gov (United States)

    Menard, Daniel; Chillet, Daniel; Sentieys, Olivier

    2006-12-01

    Digital signal processing applications are specified with floating-point data types but they are usually implemented in embedded systems with fixed-point arithmetic to minimise cost and power consumption. Thus, methodologies which establish automatically the fixed-point specification are required to reduce the application time-to-market. In this paper, a new methodology for the floating-to-fixed point conversion is proposed for software implementations. The aim of our approach is to determine the fixed-point specification which minimises the code execution time for a given accuracy constraint. Compared to previous methodologies, our approach takes into account the DSP architecture to optimise the fixed-point formats and the floating-to-fixed-point conversion process is coupled with the code generation process. The fixed-point data types and the position of the scaling operations are optimised to reduce the code execution time. To evaluate the fixed-point computation accuracy, an analytical approach is used to reduce the optimisation time compared to the existing methods based on simulation. The methodology stages are described and several experiment results are presented to underline the efficiency of this approach.
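
    A small sketch of the quantisation step that such floating-to-fixed-point conversion methodologies reason about: data are mapped to a signed Qm.n format and the resulting signal-to-quantisation-noise ratio is evaluated for a few fractional word-lengths. The Q-format helpers and the test signal are illustrative and unrelated to the paper's DSP code generation flow.

```python
import numpy as np

def to_fixed(x, int_bits, frac_bits):
    """Quantise floating-point data to a signed fixed-point (Qm.n-style) format."""
    scale = 2 ** frac_bits
    lo = -(2 ** (int_bits + frac_bits - 1))
    hi = 2 ** (int_bits + frac_bits - 1) - 1
    return np.clip(np.round(x * scale), lo, hi).astype(np.int64)

def to_float(q, frac_bits):
    return q / float(2 ** frac_bits)

def sqnr_db(x, frac_bits, int_bits=2):
    """Signal-to-quantisation-noise ratio for a given fractional word-length."""
    xq = to_float(to_fixed(x, int_bits, frac_bits), frac_bits)
    return 10 * np.log10(np.sum(x ** 2) / np.sum((x - xq) ** 2))

x = 0.9 * np.sin(2 * np.pi * np.arange(1024) / 64)
for n in (7, 11, 15):        # try three fractional word-lengths
    print(n, round(sqnr_db(x, n), 1), "dB")
```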

  19. Environmental and public interface for Point Aconi generating station, Point Aconi, Nova Scotia

    Energy Technology Data Exchange (ETDEWEB)

    Toner, T P

    1993-01-01

    Nova Scotia Power's most recent generating station is a 165 MW coal-fired circulating fluidized bed (CFB) unit located at Point Aconi on the northern tip of Boularderie Island. This paper discusses the environmental and public interfaces associated with this project, particularly on the unique items and issues requiring delicate and/or innovative approaches for their successful completion. Specific issues discussed include clarification of the process, the turnkey arrangement, the community liaison committee, freshwater supply, air emissions and dealings with commercial growers, dealings with lobster fishermen, dealings with Native peoples, and the transmission line.

  20. Generation of a statistical shape model with probabilistic point correspondences and the expectation maximization-iterative closest point algorithm

    International Nuclear Information System (INIS)

    Hufnagel, Heike; Pennec, Xavier; Ayache, Nicholas; Ehrhardt, Jan; Handels, Heinz

    2008-01-01

    Identification of point correspondences between shapes is required for statistical analysis of organ shape differences. Since manual identification of landmarks is not a feasible option in 3D, several methods were developed to automatically find one-to-one correspondences on shape surfaces. For unstructured point sets, however, one-to-one correspondences do not exist, but correspondence probabilities can be determined. A method was developed to compute a statistical shape model based on shapes which are represented by unstructured point sets with arbitrary point numbers. A fundamental problem when computing statistical shape models is the determination of correspondences between the points of the shape observations of the training data set. In the absence of landmarks, exact correspondences can only be determined between continuous surfaces, not between unstructured point sets. To overcome this problem, we introduce correspondence probabilities instead of exact correspondences. The correspondence probabilities are found by aligning the observation shapes with the affine expectation maximization-iterative closest point (EM-ICP) registration algorithm. In a second step, the correspondence probabilities are used as input to compute a mean shape (represented once again by an unstructured point set). Both steps are unified in a single optimization criterion which depends on the two parameters 'registration transformation' and 'mean shape'. In a last step, a variability model which best represents the variability in the training data set is computed. Experiments on synthetic data sets and in vivo brain structure data sets (MRI) are then designed to evaluate the performance of our algorithm. The new method was applied to brain MRI data sets, and the estimated point correspondences were compared to a statistical shape model built on exact correspondences. Based on established measures of 'generalization ability' and 'specificity', the estimates were very satisfactory
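
    A condensed sketch of the probabilistic-correspondence idea described above (an E-step-like assignment of Gaussian correspondence weights followed by a weighted mean-shape update), with the affine registration step omitted; the function names, the isotropic Gaussian model and the synthetic data are simplifying assumptions, not the authors' implementation.

```python
import numpy as np

def correspondence_probabilities(obs, mean_shape, sigma):
    """Soft (probabilistic) correspondences between an observation point set and
    the current mean shape: Gaussian affinity of each observation point to each
    mean-shape point, normalised per observation point."""
    d2 = ((obs[:, None, :] - mean_shape[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)

def update_mean_shape(observations, mean_shape, sigma):
    """Each mean-shape point becomes the probability-weighted average of all
    observation points over all training shapes."""
    num = np.zeros_like(mean_shape)
    den = np.zeros(len(mean_shape))
    for obs in observations:
        p = correspondence_probabilities(obs, mean_shape, sigma)  # (n_obs, n_mean)
        num += p.T @ obs
        den += p.sum(axis=0)
    return num / den[:, None]

rng = np.random.default_rng(0)
mean = rng.normal(size=(50, 3))
shapes = [mean + rng.normal(scale=0.05, size=mean.shape) for _ in range(5)]
print(update_mean_shape(shapes, mean, sigma=0.1).shape)
```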

  1. “Points requiring elucidation” about Hawaiian volcanism: Chapter 24

    Science.gov (United States)

    Poland, Michael P.; Carey, Rebecca; Cayol, Valérie; Poland, Michael P.; Weis, Dominique

    2015-01-01

    Hawaiian volcanoes, which are easily accessed and observed at close range, are among the most studied on the planet and have spurred great advances in the geosciences, from understanding deep Earth processes to forecasting volcanic eruptions. More than a century of continuous observation and study of Hawai‘i's volcanoes has also sharpened focus on those questions that remain unanswered. Although there is good evidence that volcanism in Hawai‘i is the result of a high-temperature upwelling plume from the mantle, the source composition and dynamics of the plume are controversial. Eruptions at the surface build the volcanoes of Hawai‘i, but important topics, including how the volcanoes grow and collapse and how magma is stored and transported, continue to be subjects of intense research. Forecasting volcanic activity is based mostly on pattern recognition, but determining and predicting the nature of eruptions, especially in serving the critical needs of hazards mitigation, require more realistic models and a greater understanding of what drives eruptive activity. These needs may be addressed by better integration among disciplines as well as by developing dynamic physics- and chemistry-based models that more thoroughly relate the physiochemical behavior of Hawaiian volcanism, from the deep Earth to the surface, to geological, geochemical, and geophysical data.

  2. Requirements on the Redshift Accuracy for future Supernova and Number Count Surveys

    International Nuclear Information System (INIS)

    Huterer, Dragan; Kim, Alex; Broderick, Tamara

    2004-01-01

    We investigate the required redshift accuracy of type Ia supernova and cluster number-count surveys in order for the redshift uncertainties not to contribute appreciably to the dark energy parameter error budget. For the SNAP supernova experiment, we find that, without the assistance of ground-based measurements, individual supernova redshifts would need to be determined to about 0.002 or better, which is a challenging but feasible requirement for a low-resolution spectrograph. However, we find that accurate redshifts for z < 0.1 supernovae, obtained with ground-based experiments, are sufficient to immunize the results against even relatively large redshift errors at high z. For the future cluster number-count surveys such as the South Pole Telescope, Planck or DUET, we find that the purely statistical error in photometric redshift is less important, and that the irreducible, systematic bias in redshift drives the requirements. The redshift bias will have to be kept below 0.001-0.005 per redshift bin (which is determined by the filter set), depending on the sky coverage and details of the definition of the minimal mass of the survey. Furthermore, we find that X-ray surveys have a more stringent required redshift accuracy than Sunyaev-Zeldovich (SZ) effect surveys since they use a shorter lever arm in redshift; conversely, SZ surveys benefit from their high redshift reach only so long as some redshift information is available for distant (z ≳ 1) clusters

  3. Scattering and absorption of particles emitted by a point source in a cluster of point scatterers

    International Nuclear Information System (INIS)

    Liljequist, D.

    2012-01-01

    A theory for the scattering and absorption of particles isotropically emitted by a point source in a cluster of point scatterers is described and related to the theory for the scattering of an incident particle beam. The quantum mechanical probability of escape from the cluster in different directions is calculated, as well as the spatial distribution of absorption events within the cluster. A source strength renormalization procedure is required. The average quantum scattering in clusters with randomly shifting scatterer positions is compared to trajectory simulation with the aim of studying the validity of the trajectory method. Differences between the results of the quantum and trajectory methods are found primarily for wavelengths larger than the average distance between nearest-neighbour scatterers. The average quantum results include, for example, a local minimum in the number of absorption events at the location of the point source and interference patterns in the angle-dependent escape probability as well as in the distribution of absorption events. The relative error of the trajectory method is in general, though not universally, of similar magnitude as that obtained for beam scattering.

  4. Coarse Point Cloud Registration by Egi Matching of Voxel Clusters

    Science.gov (United States)

    Wang, Jinhu; Lindenbergh, Roderik; Shen, Yueqian; Menenti, Massimo

    2016-06-01

    Laser scanning samples the surface geometry of objects efficiently and records versatile information as point clouds. However, often more scans are required to fully cover a scene. Therefore, a registration step is required that transforms the different scans into a common coordinate system. The registration of point clouds is usually conducted in two steps, i.e. coarse registration followed by fine registration. In this study an automatic marker-free coarse registration method for pair-wise scans is presented. First the two input point clouds are re-sampled as voxels and dimensionality features of the voxels are determined by principal component analysis (PCA). Then voxel cells with the same dimensionality are clustered. Next, the Extended Gaussian Image (EGI) descriptor of those voxel clusters are constructed using significant eigenvectors of each voxel in the cluster. Correspondences between clusters in source and target data are obtained according to the similarity between their EGI descriptors. The random sampling consensus (RANSAC) algorithm is employed to remove outlying correspondences until a coarse alignment is obtained. If necessary, a fine registration is performed in a final step. This new method is illustrated on scan data sampling two indoor scenarios. The results of the tests are evaluated by computing the point to point distance between the two input point clouds. The presented two tests resulted in mean distances of 7.6 mm and 9.5 mm respectively, which are adequate for fine registration.
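
    A minimal sketch of the first stage of this pipeline, under assumed thresholds: points are grouped into voxels and each voxel is labelled linear, planar or volumetric from the eigenvalues of its local covariance (PCA). The voxel size, minimum point count and eigenvalue thresholds are illustrative choices, and the EGI construction and RANSAC steps are not shown.

```python
import numpy as np

def voxel_dimensionality(points, voxel_size=0.2, min_pts=5):
    """Group points into voxels and classify each voxel as linear, planar or
    volumetric from the eigenvalues of its local covariance (PCA)."""
    keys = np.floor(points / voxel_size).astype(int)
    voxels = {}
    for key, p in zip(map(tuple, keys), points):
        voxels.setdefault(key, []).append(p)

    labels = {}
    for key, pts in voxels.items():
        pts = np.asarray(pts)
        if len(pts) < min_pts:
            continue
        evals = np.sort(np.linalg.eigvalsh(np.cov(pts.T)))[::-1]   # l1 >= l2 >= l3
        l1, l2, l3 = evals / evals.sum()
        labels[key] = "linear" if l1 > 0.8 else ("planar" if l3 < 0.05 else "volumetric")
    return labels

# a noisy planar patch should be labelled "planar" in most populated voxels
rng = np.random.default_rng(2)
plane = np.c_[rng.uniform(0, 1, 500), rng.uniform(0, 1, 500), rng.normal(0, 0.002, 500)]
print(set(voxel_dimensionality(plane).values()))
```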

  5. The Purification Method of Matching Points Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    DONG Yang

    2017-02-01

    The traditional purification method for matching points usually uses a small number of the points as initial input. Though it can meet most of the requirements of point constraints, the iterative purification solution easily falls into a local extreme, which results in the loss of correct matching points. To solve this problem, we introduce the principal component analysis method to use the whole point set as initial input. Through stepwise elimination of mismatching points and robust solving, a more accurate global optimal solution can be obtained, which reduces the omission rate of correct matching points and thus reaches a better purification effect. Experimental results show that this method can obtain the global optimal solution under a certain original false matching rate, and can decrease or avoid the omission of correct matching points.

  6. The conserved glycine residues in the transmembrane domain of the Semliki Forest virus fusion protein are not required for assembly and fusion

    International Nuclear Information System (INIS)

    Liao Maofu; Kielian, Margaret

    2005-01-01

    The alphavirus Semliki Forest virus (SFV) infects cells via a low pH-triggered fusion reaction mediated by the viral E1 protein. Both the E1 fusion peptide and transmembrane (TM) domain are essential for membrane fusion, but the functional requirements for the TM domain are poorly understood. Here we explored the role of the five TM domain glycine residues, including the highly conserved glycine pair at E1 residues 415/416. SFV mutants with alanine substitutions for individual or all five glycine residues (5G/A) showed growth kinetics and fusion pH dependence similar to those of wild-type SFV. Mutants with increasing substitution of glycine residues showed an increasingly more stringent requirement for cholesterol during fusion. The 5G/A mutant showed decreased fusion kinetics and extent in fluorescent lipid mixing assays. TM domain glycine residues thus are not required for efficient SFV fusion or assembly but can cause subtle effects on the properties of membrane fusion

  7. The fixed point structure of lattice field theories

    International Nuclear Information System (INIS)

    Baier, R.; Reusch, H.J.; Lang, C.B.

    1989-01-01

    Monte-Carlo renormalization group methods make it possible to analyze lattice-regularized quantum field theories. The properties of the quantized field theory in the continuum may be recovered at a critical point of the lattice model. This requires a study of the phase diagram and the renormalization flow structure of the coupling constants. As an example the authors discuss the results of a recent MCRG investigation of the SU(2) adjoint Higgs model, where they find evidence for the existence of a tricritical point at finite values of the inverse gauge coupling β

  8. Engineering to Control Noise, Loading, and Optimal Operating Points

    International Nuclear Information System (INIS)

    Mitchell R. Swartz

    2000-01-01

    Successful engineering of low-energy nuclear systems requires control of noise, loading, and optimum operating point (OOP) manifolds. The latter result from the biphasic system response of low-energy nuclear reaction (LENR)/cold fusion systems, and their ash production rate, to input electrical power. Knowledge of the optimal operating point manifold can improve the reproducibility and efficacy of these systems in several ways. Improved control of noise, loading, and peak production rates is available through the study, and use, of OOP manifolds. Engineering of systems toward the OOP-manifold drive-point peak may, with inclusion of geometric factors, permit more accurate uniform determinations of the calibrated activity of these materials/systems

  9. Chemical Quality Control on Water Produced from WFI System at Medical Technology Division, Nuclear Malaysia

    International Nuclear Information System (INIS)

    Muhammad Hanaffi Mohd Mokhtar; Norhafizah Othman; Muhamad Syazwan Zulkifli

    2015-01-01

    Water for Injection (WFI) used in the production of Technetium-99m (Tc-99m) is bound by stringent specifications in order to comply with Good Manufacturing Practice (GMP) requirements. The WFI should meet British Pharmacopoeia (BP) and United States Pharmacopoeia (USP) grade specifications. The goal of the study is to test whether the WFI system is running optimally and operating normally after undergoing repairs and maintenance. Physical appearance, pH, total organic carbon and conductivity tests are done in order to evaluate the water quality, which in turn indirectly reflects the condition of the water system. Results have shown that purified water from sampling point 4 and WFI from sampling point 20 indirectly indicate possible problems in the water system. Overall, the WFI system is running normally and optimally except for sampling points 4 and 20, which need further investigation. (author)

  10. Induced Temporal Signatures for Point-Source Detection

    International Nuclear Information System (INIS)

    Stephens, Daniel L.; Runkle, Robert C.; Carlson, Deborah K.; Peurrung, Anthony J.; Seifert, Allen; Wyatt, Cory R.

    2005-01-01

    Detection of radioactive point-sized sources is inherently divided into two regimes encompassing stationary and moving detectors. The two cases differ in their treatment of background radiation and its influence on detection sensitivity. In the stationary detector case the statistical fluctuation of the background determines the minimum detectable quantity. In the moving detector case the detector may be subjected to widely and irregularly varying background radiation, as a result of geographical and environmental variation. This significant systematic variation, in conjunction with the statistical variation of the background, requires a conservative threshold to be selected to yield the same false-positive rate as the stationary detection case. This results in lost detection sensitivity for real sources. This work focuses on a simple and practical modification of the detector geometry that increases point-source recognition via a distinctive temporal signature. A key part of this effort is the integrated development of detector geometries that induce a highly distinctive signature for point sources and of statistical algorithms able to optimize detection of this signature amidst varying background. The identification of temporal signatures for point sources has been demonstrated and compared with the canonical method, showing good results. This work demonstrates that temporal signatures are efficient at increasing point-source discrimination in a moving detector system

  11. Increasing radiographer productivity by an incentive point system.

    Science.gov (United States)

    Williams, B; Chacko, P T

    1983-01-01

    Because of a very low technologist productivity in their Radiology Department, the authors describe a Productive Point System they developed and implemented to solve this personnel problem. After establishing the average time required to perform all exams, point credits (one point for every ten minutes utilized) were assigned to each exam performed, thereby determining an index of production. A Productive Index of 80% was considered realistic and was the equivalent of 192 points for a 40-hour work week. From 1975 to 1978 personal productivity increased from 79% to 113%. This resulted in an average yearly fiscal savings of over $20,000.00 for this three-year period. There was also a significant improvement in exam efficiency and quality, job attitude, personnel morale, and public relations. This program was highly successful because technologist acceptance and cooperation was complete, and this occurred mainly because the system supports the normal occupational goals and expectations of technologists.
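
    The arithmetic behind the point system is straightforward: a 40-hour week contains 2400 minutes, i.e. 240 available points at one point per ten minutes, so the 80% Productive Index target corresponds to 0.80 × 240 = 192 points per week. A tiny sketch of that bookkeeping follows; the earned-points figure is inferred from the reported 113%, not stated in the paper.

```python
def productive_index(points_earned, hours_worked=40, minutes_per_point=10,
                     target_fraction=0.80):
    """Productive Index as described above: one point per ten minutes of
    credited exam time, with 80% of paid time treated as the target level."""
    available_points = hours_worked * 60 / minutes_per_point      # 240 for a 40-h week
    target_points = target_fraction * available_points            # 192 points
    return points_earned / available_points, target_points

index, target = productive_index(points_earned=271)
print(round(index, 2), target)   # ~1.13 (113%) against a 192-point target
```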

  12. Wrox SharePoint 2010 SharePoint911 three-pack

    CERN Document Server

    Klindt, Todd; Mason, Jennifer; Rogers, Laura; Drisgill, Randy; Ross, John; Riemann, Larry; Perran, Amanda; Perran, Shane; Sanford, Jacob J; Stubbs, Paul; Caravajal, Steve

    2012-01-01

    The Wrox SharePoint 2010 SharePoint911 Three-Pack combines the contents of three full e-books written by the experts from SharePoint911.  That's over 1800 pages of hands-on advice from Todd Klindt, Shane Young, Laura Rogers, Randy Drisgill, Jennifer Mason, John Ross, and Larry Riemann, among others. In Beginning SharePoint 2010: Building Business Solutions with SharePoint (ISBN 978-0-470-61789-2) by Amanda Perran, Shane Perran, Jennifer Mason, and Laura Rogers, readers learn the core concepts, terminology, and features of SharePoint 2010. In Professiona

  13. Automatic continuous dew point measurement in combustion gases

    Energy Technology Data Exchange (ETDEWEB)

    Fehler, D.

    1986-08-01

    Low exhaust temperatures serve to minimize energy consumption in combustion systems. This requires accurate, continuous measurement of exhaust condensation. An automatic dew point meter for continuous operation is described. The principle of measurement, the design of the measuring system, and practical aspects of operation are discussed.

  14. Secure firmware updates for point of sale terminals

    CSIR Research Space (South Africa)

    Tsague, HD

    2015-03-01

    of the equipment. In particular, there is an important cost related to the deployment of new software upgrades for the point of sale terminals, since in most cases human intervention is required. In this paper, we present a lightweight protocol for secure firmware...

  15. Quantifying requirements volatility effects

    NARCIS (Netherlands)

    Kulk, G.P.; Verhoef, C.

    2008-01-01

    In an organization operating in the bancassurance sector we identified a low-risk IT subportfolio of 84 IT projects comprising together 16,500 function points, each project varying in size and duration, for which we were able to quantify its requirements volatility. This representative portfolio

  16. The IAEA's incident and emergency centre: the global focal point for nuclear and radiological emergency preparedness and response

    Energy Technology Data Exchange (ETDEWEB)

    Buglova, E.

    2016-08-01

    The continuous use of nuclear power to generate electricity and the continued threat of radioactive materials being used for nefarious reasons remind us of the importance of staying prepared to respond to nuclear or radiological emergencies. Stringent nuclear safety and nuclear security requirements, the training of personnel, operational checks and legal frameworks cannot always prevent radiation-related emergencies. Though these events can range in severity, each has the potential to cause harm to the public, employees, patients, property and the environment. Until the Chernobyl nuclear accident in 1986, there was no international information exchange system. Immediately following that accident, the international community negotiated the so-called Emergency Conventions to ensure that a country suffering an accident with an international transboundary release of radioactive material would issue timely, authenticated information, while States that could field technical support would do so in a coordinated fashion. The Conventions also place specific legal obligations on the International Atomic Energy Agency (IAEA) with regard to emergency preparedness and response. (Author)

  17. Brayton Point coal conversion project (NEPCO)

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, W.F. Jr.

    1982-05-01

    The New England Power Company (NEPCO) recently converted Brayton Point Power Station Units 1, 2, and 3 from oil to coal. It is the largest coal conversion project in the nation to date. Stone and Webster Engineering Corporation (SWEC) was hired as the engineer/constructor for the project. Units 1 and 2 are 250-MW Combustion Engineering boilers, and Unit 3 is a 650-MW Babcock and Wilcox boiler. All three units were originally designed to burn pulverized coal but were converted to oil during the years of low oil prices. Studies performed by NEPCO and SWEC indicated that the areas discussed in the following paragraphs required upgrading before the units could efficiently burn coal and meet Federal and State environmental requirements. All units have been converted and are operating. This paper discusses the design modifications required to burn coal, startup and initial operating problems, and their solutions.

  18. Beginning SharePoint 2010 Administration Windows SharePoint Foundation 2010 and Microsoft SharePoint Server 2010

    CERN Document Server

    Husman, Göran

    2010-01-01

    Complete coverage of the latest advances in SharePoint 2010 administration. SharePoint 2010 comprises an abundance of new features, and this book shows you how to take advantage of all SharePoint 2010's many improvements. Written by a four-time SharePoint MVP, Beginning SharePoint 2010 Administration begins with a comparison of SharePoint 2010 to the previous version and then examines the differences between WSS 4.0 and MSS 2010. Packed with step-by-step instructions, tips and tricks, and real-world examples, this book dives into the basics of how to install, manage, and administrate

  19. Determination of point of incidence for the case of reflection or refraction at spherical surface knowing two points lying on the ray.

    Science.gov (United States)

    Mikš, Antonín; Novák, Pavel

    2017-09-01

    The paper is focused on the problem of determination of the point of incidence of a light ray for the case of reflection or refraction at the spherical optical surface, assuming that two fixed points in space that the sought light ray should go through are given. The requirement is that one of these points lies on the incident ray and the other point on the reflected/refracted ray. Although at first glance it seems to be a simple problem, it will be shown that it has no simple analytical solution. The basic idea of the solution is given, and it is shown that the problem leads to a nonlinear equation in one variable. The roots of the resulting nonlinear equation can be found by numerical methods of mathematical optimization. The proposed methods were implemented in MATLAB, and the proper function of these algorithms was verified on several examples.
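
    To illustrate how such a problem reduces to a single nonlinear equation in one variable, consider the planar reflection case: parameterize the candidate point of incidence on the mirror circle by an angle, require equal angles of incidence and reflection, and pass the resulting scalar equation to a bracketing root finder. This is only an illustrative sketch under assumed geometry (circle centred at the origin, arbitrarily chosen points A and B), not the authors' MATLAB implementation:

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative 2D version: find the point P on a mirror circle of radius R
# (centred at the origin) such that a ray from A reflects at P towards B.
R = 1.0
A = np.array([2.0, 1.0])
B = np.array([2.0, -1.0])

def reflection_residual(theta: float) -> float:
    """Tangential imbalance at P(theta); zero when the law of reflection holds."""
    P = R * np.array([np.cos(theta), np.sin(theta)])
    n = P / R                              # outward unit normal of the circle
    t = np.array([-n[1], n[0]])            # unit tangent
    u = (A - P) / np.linalg.norm(A - P)    # unit vector towards the source point
    v = (B - P) / np.linalg.norm(B - P)    # unit vector towards the second point
    # Equal incidence/reflection angles <=> the tangential components cancel.
    return np.dot(u, t) + np.dot(v, t)

# Bracket a root on the part of the circle facing A and B and solve numerically.
theta_star = brentq(reflection_residual, -np.pi / 2, np.pi / 2)
P_star = R * np.array([np.cos(theta_star), np.sin(theta_star)])
print(theta_star, P_star)   # for this symmetric A, B the solution is theta = 0
```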

  20. AECB staff annual assessment of the Point Lepreau Nuclear Generating Station

    International Nuclear Information System (INIS)

    1997-06-01

    The Atomic Energy Control Board is the independent federal agency that controls all nuclear activities in Canada. A major use of nuclear energy in Canada is electricity production. The AECB assesses every station's performance against legal requirements, including the conditions in the operating licence. Each station is inspected and all aspects of the station's operation and management are reviewed. This report is the AECB staff assessment of reactor safety at the Point Lepreau Generating Station in 1996. Point Lepreau operated safely, but the worsening trends in NB Power's safety performance lead to the conclusion that urgent action is required. NB Power is required to report formally to the AECB every six months on progress with measures to improve safety management. Further licensing action will be taken against NB Power if it fails to make the improvements.

  1. Quantitative functional analysis of Late Glacial projectile points from northern Europe

    DEFF Research Database (Denmark)

    Dev, Satya; Riede, Felix

    2012-01-01

    This paper discusses the function of Late Glacial arch-backed and tanged projectile points from northern Europe in general and southern Scandinavia in particular. Ballistic requirements place clear and fairly well understood constraints on the design of projectile points. We outline the argument...... surely fully serviceable, diverged considerably from the functional optimum predicated by ballistic theory. These observations relate directly to southern Scandinavian Late Glacial culture-history which is marked by a sequence of co-occurrence of arch-backed and large tanged points in the earlier part...

  2. An Updated Point Design for Heavy Ion Fusion

    International Nuclear Information System (INIS)

    Yu, S.S.; Meier, W.R.; Abbott, R.B.; Barnard, J.J.; Brown, T.; Callahan, D.A.; Heitzenroeder, P.; Latkowski, J.F.; Logan, B.G.; Pemberton, S.J.; Peterson, P.F.; Rose, D.V.; Sabbi, G.L.; Sharp, W.M.; Welch, D.R.

    2002-01-01

    An updated, self-consistent point design for a heavy ion fusion (HIF) power plant based on an induction linac driver, indirect-drive targets, and a thick liquid wall chamber has been completed. Conservative parameters were selected to allow each design area to meet its functional requirements in a robust manner, and thus this design is referred to as the Robust Point Design (RPD-2002). This paper provides a top-level summary of the major characteristics and design parameters for the target, driver, final focus magnet layout and shielding, chamber, beam propagation to the target, and overall power plant

  3. Discovering Symmetry in Everyday Environments: A Creative Approach to Teaching Symmetry and Point Groups

    Science.gov (United States)

    Fuchigami, Kei; Schrandt, Matthew; Miessler, Gary L.

    2016-01-01

    A hands-on symmetry project is proposed as an innovative way of teaching point groups to undergraduate chemistry students. Traditionally, courses teaching symmetry require students to identify the point group of a given object. This project asks the reverse: students are instructed to identify an object that matches each point group. Doing so…

  4. Next Generation Microbiology Requirements

    Science.gov (United States)

    Ott, C. M.; Oubre, C. M.; Elliott, T. F.; Castro, V. A.; Pierson, D. L.

    2012-01-01

    technology. During 2011, this study focused on evaluating potable water requirements by assembling a forum of internal and external experts from NASA, other federal agencies, and academia. Key findings from this forum included: (1) Preventive design and operational strategies should be stringent and the primary focus of NASA's mitigation efforts, as they are cost effective and can be attained with conventional technology. (2) Microbial monitoring hardware should be simple and must be able to measure the viability of microorganisms in a sample. Multiple monitoring technologies can be utilized as long as the microorganisms being identified can also be confirmed as viable. (3) Evidence showing alterations in the crew immune function and microbial virulence complicates risk assessments and creates the need for very conservative requirements. (4) One key source of infectious agents will always be the crew, and appropriate preventative measures should be taken preflight. (5) Water systems should be thoroughly disinfected (sterilized if possible) preflight and retain a residual biocide throughout the mission. Future forums will cover requirements for other types of samples, specifically spaceflight food and environmental samples, such as vehicle air and vehicle and cargo surfaces. An interim report on the potable water forum has been delivered to the Human Research Program with a final report on the recommendations for all sample types being delivered in September 2013.

  5. May 2002 Lidar Point Data of Southern California Coastline: Dana Point to Point La Jolla

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains lidar point data from a strip of Southern California coastline (including water, beach, cliffs, and top of cliffs) from Dana Point to Point La...

  6. September 2002 Lidar Point Data of Southern California Coastline: Dana Point to Point La Jolla

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains lidar point data from a strip of Southern California coastline (including water, beach, cliffs, and top of cliffs) from Dana Point to Point La...

  7. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are what these NDE methods are required to detect. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for its probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) while achieving an acceptable value for the probability of false (POF) calls and keeping the flaw sizes in the set as small as possible.
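
    For orientation, the binomial arithmetic behind the classical 29-flaw point-estimate demonstration can be sketched as follows; this is a generic illustration of the binomial reasoning, not the specific optimization performed in the paper:

```python
from math import comb

def prob_pass(true_pod: float, n: int = 29, max_misses: int = 0) -> float:
    """Probability of passing a binomial POD demonstration, i.e. at most
    `max_misses` missed detections out of n same-size flaws."""
    q = 1.0 - true_pod
    return sum(comb(n, k) * q**k * true_pod**(n - k) for k in range(max_misses + 1))

# With the classical 29-of-29 demonstration, a system whose true POD is exactly
# 0.90 passes only about 4.7% of the time, which is why passing the demonstration
# supports roughly 95% confidence that POD exceeds 0.90 at that flaw size.
print(prob_pass(0.90))   # ~0.047
print(prob_pass(0.98))   # ~0.56 for a substantially better system
```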

  8. Two-point model for divertor transport

    International Nuclear Information System (INIS)

    Galambos, J.D.; Peng, Y.K.M.

    1984-04-01

    Plasma transport along divertor field lines was investigated using a two-point model. This treatment requires considerably less effort to find solutions to the transport equations than previously used one-dimensional (1-D) models and is useful for studying general trends. It also can be a valuable tool for benchmarking more sophisticated models. The model was used to investigate the possibility of operating in the so-called high density, low temperature regime

  9. 28 CFR 552.24 - Use of four-point restraints.

    Science.gov (United States)

    2010-07-01

    ... beyond eight hours requires the supervision of qualified health personnel. Mental health and qualified... Section 552.24 Judicial Administration BUREAU OF PRISONS, DEPARTMENT OF JUSTICE INSTITUTIONAL MANAGEMENT...-point restraints, qualified health personnel shall initially assess the inmate to ensure appropriate...

  10. Null-point titration measurements of free magnesium in stored erythrocytes

    International Nuclear Information System (INIS)

    Bock, J.L.; Yusuf, Y.; Puntillo, E.

    1987-01-01

    Free intracellular magnesium concentration (Mg/sub i/) was measured in stored human erythrocytes, using null-point titration with ionophore A23187. For cells stored 31 P NMR spectroscopy, which showed a decrease in Mg/sub i/ with storage. However, the NMR measurements are performed with no pretreatment of the cells, while the null-point method requires an initial washing step, which alters pH/sub i/ and may also alter Mg/sub i/. The titration-measured Mg/sub i/ values are still surprisingly low for long-stored cells, considering that depletion of ATP and 2,3-DPG should release bound Mg. Using the titration-measured Mg/sub i/ values along with measurements of total Mg, ATP, and 2,3-DPG, they estimate that an additional buffer contains about 47% of total Mg in cells stored 21 days. Mg/sub i/ determinations by both 31 P NMR and null-point titration thus indicate that erythrocyte Mg is largely bound to a high-capacity, low-affinity buffer whose relative importance increases during cell storage. Discrepancies between the methods require further investigation

  11. Design and Implementation of Numerical Linear Algebra Algorithms on Fixed Point DSPs

    Directory of Open Access Journals (Sweden)

    Gene Frantz

    2007-01-01

    Full Text Available Numerical linear algebra algorithms use the inherent elegance of matrix formulations and are usually implemented using C/C++ floating point representation. The system implementation is faced with practical constraints because these algorithms usually need to run in real time on fixed point digital signal processors (DSPs to reduce total hardware costs. Converting the simulation model to fixed point arithmetic and then porting it to a target DSP device is a difficult and time-consuming process. In this paper, we analyze the conversion process. We transformed selected linear algebra algorithms from floating point to fixed point arithmetic, and compared real-time requirements and performance between the fixed point DSP and floating point DSP algorithm implementations. We also introduce an advanced code optimization and an implementation by DSP-specific, fixed point C code generation. By using the techniques described in the paper, speed can be increased by a factor of up to 10 compared to floating point emulation on fixed point hardware.
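
    A minimal sketch of the float-to-fixed conversion step discussed above, using generic Q15 arithmetic rather than any DSP-specific code from the paper:

```python
import numpy as np

Q = 15                      # Q15 format: 1 sign bit, 15 fractional bits
SCALE = 1 << Q

def to_q15(x: np.ndarray) -> np.ndarray:
    """Quantize floats in [-1, 1) to Q15 integers, with saturation."""
    y = np.round(np.asarray(x, dtype=np.float64) * SCALE)
    return np.clip(y, -SCALE, SCALE - 1).astype(np.int32)

def q15_mul(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Fixed-point multiply: 32/64-bit intermediate product shifted back to Q15."""
    return ((a.astype(np.int64) * b.astype(np.int64)) >> Q).astype(np.int32)

# Example: a tiny dot product in Q15 compared with the floating point reference.
x = np.array([0.25, -0.5, 0.125])
w = np.array([0.5, 0.5, 0.5])
fx, fw = to_q15(x), to_q15(w)
print(np.dot(x, w))                       # float reference: -0.0625
print(q15_mul(fx, fw).sum() / SCALE)      # fixed-point result:  -0.0625
```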

  12. Dual keel Space Station payload pointing system design and analysis feasibility study

    Science.gov (United States)

    Smagala, Tom; Class, Brian F.; Bauer, Frank H.; Lebair, Deborah A.

    1988-01-01

    A Space Station attached Payload Pointing System (PPS) has been designed and analyzed. The PPS is responsible for maintaining fixed payload pointing in the presence of disturbance applied to the Space Station. The payload considered in this analysis is the Solar Optical Telescope. System performance is evaluated via digital time simulations by applying various disturbance forces to the Space Station. The PPS meets the Space Station articulated pointing requirement for all disturbances except Shuttle docking and some centrifuge cases.

  13. Effect Through Broadcasting System Access Point For Video Transmission

    Directory of Open Access Journals (Sweden)

    Leni Marlina

    2015-08-01

    Full Text Available Most universities already operate wired and wireless networks that are used to access integrated information systems and the Internet. It is therefore relevant to study how broadcasting instructional video through an access point performs in a university setting. Wired campus networks require cables to connect computers and carry data between them, whereas wireless networks connect computers through radio waves. This research assesses the behaviour of a WLAN access point used to broadcast instructional video from a server to clients as a means of learning. The study aims to show how to build a wireless network using an access point, how to set up a computer server with the supporting software for hosting instructional video, and how to establish a system for transmitting video from the server to the client via the access point.

  14. Address Points - COUNTY_ADDRESS_POINTS_IDHS_IN: Address Points Maintained by County Agencies in Indiana (Indiana Department of Homeland Security, Point feature class)

    Data.gov (United States)

    NSGIC State | GIS Inventory — COUNTY_ADDRESS_POINTS_IDHS_IN is an ESRI Geodatabase point feature class that contains address points maintained by county agencies in Indiana, provided by personnel...

  15. Assessing Readiness for En Pointe in Young Ballet Dancers.

    Science.gov (United States)

    Lai, Jeffrey C; Kruse, David W

    2016-01-01

    Children begin ballet lessons as young as age 2 years. The graceful movements of classical ballet require a combination of artistry, flexibility, and strength to perform. During the training and development of a young ballerina, the transition to dancing en pointe ("on the toes") represents a significant milestone and traditionally begins around age 11 or 12 years, assuming the proper training background and dance aspirations. However, current dance medicine literature describes maturity, proper technique, strength, and postural control as the more significant factors in determining pointe readiness. An in-office evaluation of these factors can be performed by the clinician to assist dancers, their families, and their dance instructor(s) in determining pointe readiness. Copyright 2016, SLACK Incorporated.

  16. Oil-points - Designers means to evaluate sustainability of concepts

    DEFF Research Database (Denmark)

    Bey, Niki; Lenau, Torben Anker

    1998-01-01

    Designers have an essential influence on product design and are therefore one target group for environmental evaluation methods. This implies that such evaluation methods have to meet designers' requirements. Evaluation of sustainability of products is often done using formal Life Cycle Assessment....... This is investigated by means of three case studies where environmental impact is estimated using the EDIP method, the Eco-indicator 95 method, and the Oil Point method proposed by the authors. It is found that the results obtained using Oil Points are in acceptable conformity with the results obtained with more...

  17. Preliminary design of high-power wave-guide/transmission system for multimegawatt CW requirements of 100 MeV proton Linac

    International Nuclear Information System (INIS)

    Shrivastava, Purushottam; Wanmode, Y.D.; Hannurkar, P.R.

    2002-01-01

    Development of a 100 MeV CW proton Linac has been planned at CAT. This Linac will need CW rf power in the frequency ranges of 350 MHz and 700 MHz for its RFQ and DTL/CCDTL/SFDTL structures, respectively. The power to the accelerating structures will be produced by either 1 MW CW or 250 kW CW klystrons/inductive output tubes (HOM IOTs). The power needed at each feed point in the structure is at most 250 kW, which will be provided by splitting the power from a 1 MW klystron/klystrode into four channels using a wave-guide system. If 250 kW tubes are used, the power to the structures will be provided directly from each tube. Two types of wave-guide transmission system have been considered, viz. WR 2300 for the 350 MHz rf needs and WR 1500 for the 700 MHz rf needs. The typical wave-guide system has been designed using the 1 MW CW klystron followed by a wave-guide filter, a dual directional coupler, a high-power circulator, and three 3 dB magic TEE power dividers to split the main channel into four equal channels of 250 kW each. Each individual channel has dual directional couplers, flexible wave-guide sections and a high-power ceramic vacuum window. The circulator and each power divider are terminated into their isolated ports by high-power CW loads. Three of the four channels have phase shifters. The present paper describes the technological aspects and the design specifications and considerations for these stringent requirements. (author)

  18. Using a space filling curve approach for the management of dynamic point clouds

    NARCIS (Netherlands)

    Psomadaki, S; van Oosterom, P.J.M.; Tijssen, T.P.M.; Baart, F.

    2016-01-01

    Point cloud usage has increased over the years. The development of low-cost sensors makes it now possible to acquire frequent point cloud measurements on a short time period (day, hour, second). Based on the requirements coming from the coastal monitoring domain, we have developed, implemented and

  19. TUNNEL POINT CLOUD FILTERING METHOD BASED ON ELLIPTIC CYLINDRICAL MODEL

    Directory of Open Access Journals (Sweden)

    N. Zhu

    2016-06-01

    Full Text Available The large number of bolts and screws attached to the subway shield ring plates, along with the many metal stents and pieces of electrical equipment mounted on the tunnel walls, make the laser point cloud data include many non-tunnel-section points (hereinafter referred to as non-points), thereby affecting the accuracy of modeling and deformation monitoring. This paper proposes a filtering method for the point cloud based on an elliptic cylindrical model. The original laser point cloud data is first projected onto a horizontal plane, and a searching algorithm is given to extract the edge points of both sides, which are further used to fit the tunnel central axis. Along the axis the point cloud is segmented regionally and then fitted as a smooth elliptic cylindrical surface by means of iteration. This processing enables the automatic filtering of the inner-wall non-points. Experiments on two groups showed consistent results: the elliptic cylindrical model based method can effectively filter out the non-points and meets the accuracy requirements for subway deformation monitoring. The method provides a new mode for the periodic, all-around deformation monitoring of tunnel sections in routine subway operation and maintenance.
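
    A crude stand-in for the cross-section fitting and filtering step can be sketched as follows: fit a general conic to one projected tunnel section by algebraic least squares and iteratively discard points with large residuals. The thresholding rule and the synthetic data are our own assumptions, not the paper's implementation:

```python
import numpy as np

def fit_conic(xy: np.ndarray) -> np.ndarray:
    """Algebraic least-squares fit of a conic a*x^2 + b*xy + c*y^2 + d*x + e*y + f = 0.
    Returns the unit parameter vector (last right singular vector of the design matrix)."""
    x, y = xy[:, 0], xy[:, 1]
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(D, full_matrices=False)
    return vt[-1]

def filter_section(xy: np.ndarray, n_iter: int = 5, k: float = 2.5) -> np.ndarray:
    """Iteratively fit a conic to one cross-section and keep the points whose
    algebraic residual stays within k standard deviations of the kept set."""
    keep = np.ones(len(xy), dtype=bool)
    x, y = xy[:, 0], xy[:, 1]
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    for _ in range(n_iter):
        p = fit_conic(xy[keep])
        r = np.abs(D @ p)
        keep = r <= k * r[keep].std()
    return keep

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 2.0 * np.pi, 400)
    wall = np.column_stack([2.7 * np.cos(t), 2.5 * np.sin(t)])
    wall += rng.normal(0.0, 0.01, wall.shape)          # scanner noise on the wall
    clutter = rng.normal(0.0, 1.0, size=(40, 2))       # bolts, brackets, cables
    pts = np.vstack([wall, clutter])
    keep = filter_section(pts)
    print(keep[:400].mean(), keep[400:].mean())        # wall mostly kept, clutter mostly dropped
```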

  20. Beginning SharePoint 2010 Building Business Solutions with SharePoint

    CERN Document Server

    Perran, Amanda; Mason, Jennifer; Rogers, Laura

    2010-01-01

    Two SharePoint MVPs provide the ultimate introduction to SharePoint 2010. Beginning SharePoint 2010: Building Team Solutions with SharePoint provides information workers and site managers with extensive knowledge and expert advice, empowering them to become SharePoint champions within their organizations. Provides expansive coverage of SharePoint topics, as well as specialty areas such as forms, excel services, records management, and web content management. Details realistic usage scenarios, and includes practice examples that highlight best practices for configuration and customization. Includes de

  1. End points for adjuvant therapy trials: has the time come to accept disease-free survival as a surrogate end point for overall survival?

    Science.gov (United States)

    Gill, Sharlene; Sargent, Daniel

    2006-06-01

    The intent of adjuvant therapy is to eradicate micro-metastatic residual disease following curative resection with the goal of preventing or delaying recurrence. The time-honored standard for demonstrating efficacy of new adjuvant therapies is an improvement in overall survival (OS). This typically requires phase III trials of large sample size with lengthy follow-up. With the intent of reducing the cost and time of completing such trials, there is considerable interest in developing alternative or surrogate end points. A surrogate end point may be employed as a substitute to directly assess the effects of an intervention on an already accepted clinical end point such as mortality. When used judiciously, surrogate end points can accelerate the evaluation of new therapies, resulting in the more timely dissemination of effective therapies to patients. The current review provides a perspective on the suitability and validity of disease-free survival (DFS) as an alternative end point for OS. Criteria for establishing surrogacy and the advantages and limitations associated with the use of DFS as a primary end point in adjuvant clinical trials and as the basis for approval of new adjuvant therapies are discussed.

  2. Interactive Trunk Extraction from Forest Point Cloud

    Directory of Open Access Journals (Sweden)

    T. Mizoguchi

    2014-06-01

    Full Text Available For forest management or monitoring, it is necessary to constantly measure several parameters of each tree, such as height, diameter at breast height, and trunk volume. Terrestrial laser scanners have been used for this purpose instead of human workers to reduce the time and cost of the measurement. In order to use point clouds captured by a terrestrial laser scanner in the above applications, an important step is to extract all trees or their trunks separately. For this purpose, we propose an interactive system in which a user can intuitively and efficiently extract each trunk by simple editing of the distance image created from the point cloud. We demonstrate the effectiveness of our proposed system through various experiments.

  3. Programming languages and compiler design for realistic quantum hardware

    Science.gov (United States)

    Chong, Frederic T.; Franklin, Diana; Martonosi, Margaret

    2017-09-01

    Quantum computing sits at an important inflection point. For years, high-level algorithms for quantum computers have shown considerable promise, and recent advances in quantum device fabrication offer hope of utility. A gap still exists, however, between the hardware size and reliability requirements of quantum computing algorithms and the physical machines foreseen within the next ten years. To bridge this gap, quantum computers require appropriate software to translate and optimize applications (toolflows) and abstraction layers. Given the stringent resource constraints in quantum computing, information passed between layers of software and implementations will differ markedly from in classical computing. Quantum toolflows must expose more physical details between layers, so the challenge is to find abstractions that expose key details while hiding enough complexity.

  4. High heat load synchrotron optics

    International Nuclear Information System (INIS)

    Mills, D.M.

    1993-01-01

    Third generation synchrotron radiation sources currently being constructed worldwide will produce x-ray beams of unparalleled power and power density. These high heat fluxes coupled with the stringent dimensional requirements of the x-ray optical components pose a prodigious challenge to designers of x-ray optical elements, specifically x-ray mirrors and crystal monochromators. Although certain established techniques for the cooling of high heat flux components can be directly applied to this problem, the thermal management of high heat load x-ray optical components has several unusual aspects that may ultimately lead to unique solutions. This manuscript attempts to summarize the various approaches currently being applied to this undertaking and to point out the areas of research that require further development

  5. Programming languages and compiler design for realistic quantum hardware.

    Science.gov (United States)

    Chong, Frederic T; Franklin, Diana; Martonosi, Margaret

    2017-09-13

    Quantum computing sits at an important inflection point. For years, high-level algorithms for quantum computers have shown considerable promise, and recent advances in quantum device fabrication offer hope of utility. A gap still exists, however, between the hardware size and reliability requirements of quantum computing algorithms and the physical machines foreseen within the next ten years. To bridge this gap, quantum computers require appropriate software to translate and optimize applications (toolflows) and abstraction layers. Given the stringent resource constraints in quantum computing, information passed between layers of software and implementations will differ markedly from in classical computing. Quantum toolflows must expose more physical details between layers, so the challenge is to find abstractions that expose key details while hiding enough complexity.

  6. Real-time multi-GNSS single-frequency precise point positioning

    NARCIS (Netherlands)

    de Bakker, P.F.; Tiberius, C.C.J.M.

    2017-01-01

    Precise Point Positioning (PPP) is a popular Global Positioning System (GPS) processing strategy, thanks to its high precision without requiring additional GPS infrastructure. Single-Frequency PPP (SF-PPP) takes this one step further by no longer relying on expensive dual-frequency GPS receivers,

  7. Optimization of Control Points Number at Coordinate Measurements based on the Monte-Carlo Method

    Science.gov (United States)

    Korolev, A. A.; Kochetkov, A. V.; Zakharov, O. V.

    2018-01-01

    Improving the quality of products increases the requirements on the accuracy of the dimensions and shape of workpiece surfaces. This, in turn, raises the requirements for the accuracy and productivity of measuring the workpieces. Coordinate measuring machines are currently the most effective measuring tools for solving such problems. The article proposes a method for optimizing the number of control points using Monte Carlo simulation. Based on the measurement of a small sample from batches of workpieces, statistical modeling is performed, which allows one to obtain interval estimates of the measurement error. This approach is demonstrated by examples of applications for flatness, cylindricity and sphericity. Four options of uniform and non-uniform arrangement of control points are considered and compared. It is revealed that when the number of control points decreases, the arithmetic mean decreases, the standard deviation of the measurement error increases, and the probability of a measurement α-error increases. In general, it has been established that the number of control points can be reduced severalfold while maintaining the required measurement accuracy.
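
    The core of such a Monte Carlo assessment can be sketched generically: simulate repeated measurements of a surface with a known form deviation plus random probing noise, and observe how the estimated flatness error and its spread change with the number of control points. The surface model and noise level below are assumptions for illustration only, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulated_flatness(n_points: int, n_trials: int = 2000,
                       bow: float = 0.005, noise_sd: float = 0.001):
    """Monte Carlo estimate of the measured flatness (peak-to-valley of residuals
    from a least-squares reference plane) for a slightly bowed unit plate probed
    at n_points random locations; units are arbitrary (e.g. mm)."""
    pv = np.empty(n_trials)
    for i in range(n_trials):
        x = rng.uniform(-1.0, 1.0, n_points)
        y = rng.uniform(-1.0, 1.0, n_points)
        z = bow * (x**2 + y**2) + rng.normal(0.0, noise_sd, n_points)
        A = np.column_stack([x, y, np.ones(n_points)])
        coef, *_ = np.linalg.lstsq(A, z, rcond=None)   # reference plane fit
        resid = z - A @ coef
        pv[i] = resid.max() - resid.min()
    return pv.mean(), pv.std()

# Fewer control points tend to give a lower mean flatness estimate and a larger
# spread, mirroring the trend reported in the abstract.
for n in (5, 9, 17, 33):
    mean, sd = simulated_flatness(n)
    print(f"{n:3d} control points: flatness {mean:.4f} +/- {sd:.4f}")
```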

  8. Reevaluation of steam generator level trip set point

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Yoon Sub; Soh, Dong Sub; Kim, Sung Oh; Jung, Se Won; Sung, Kang Sik; Lee, Joon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-06-01

    The reactor trip on low steam generator water level accounts for a substantial portion of reactor scrams in a nuclear plant, and the feasibility of modifying the steam generator water level trip system of YGN 1/2 was evaluated in this study. The study revealed that removing the reactor trip function from the SG water level trip system is not possible for plant safety reasons, but that relaxing the trip set point by 9% is feasible. The set point relaxation requires drilling new holes for level measurement in the operating steam generators. Characteristics of the negative neutron flux rate trip and reactor trip were also reviewed as additional work. Since the purpose of the trip system modification, reducing the reactor scram frequency, is not to satisfy legal requirements but to improve plant performance, and since the modification has both positive and negative aspects, the decision on actual modification needs to be made based on the results of this study and on the policy of the plant owner. 37 figs, 6 tabs, 14 refs. (Author).

  9. Impact of applying the more stringent validation criteria of the revised European Society of Hypertension International Protocol 2010 on earlier validation studies.

    Science.gov (United States)

    Stergiou, George S; Karpettas, Nikos; Atkins, Neil; O'Brien, Eoin

    2011-04-01

    Since 2002, when the European Society of Hypertension International Protocol (ESH-IP) was published, it has become the preferred protocol for validating blood pressure monitors worldwide. In 2010, a revised version of the ESH-IP with more stringent criteria was published. This study assesses the impact of applying the revised ESH-IP criteria. A systematic literature review of ESH-IP studies reported between 2002 and 2010 was conducted. The impact of applying the ESH-IP 2010 criteria retrospectively on the data reported in these studies was investigated. The performance of the oscillometric devices in the last decade was also investigated on the basis of the ESH-IP criteria. Among 119 published studies, 112 with sufficient data were analyzed. According to ESH-IP 2002, the test device failed in 19 studies, whereas applying the ESH-IP 2010 criteria caused failure in 28 additional studies, increasing the failure rate from 17 to 42%. Of these 28 studies, in 20 (71%) the test device failed at part 1 (accuracy per measurement) and in 22 (79%) at part 2 (accuracy per subject). Most of the failures involved the '5 mmHg or less' criterion. In the last decade there has been a consistent trend toward improved performance of oscillometric devices assessed on the basis of the ESH-IP criteria. This retrospective analysis shows that the stricter revised ESH-IP 2010 criteria will noticeably increase the failure rate of devices being validated. Oscillometric devices are becoming more accurate, and the revised ESH-IP, by acknowledging this trend, will allow more accurate devices to enter the market.

  10. Tipping Point

    Medline Plus

    Full Text Available A CPSC OnSafety "60 Seconds of Safety" video on the tip-over hazards of televisions and furniture.

  11. Design requirements of communication architecture of SMART safety system

    International Nuclear Information System (INIS)

    Park, H. Y.; Kim, D. H.; Sin, Y. C.; Lee, J. Y.

    2001-01-01

    To develop the communication network architecture of the SMART safety system, evaluation elements for reliability and performance factors were extracted from commercial networks and classified by required level according to importance. Predictable determinacy, a status-based and fixed architecture, separation and isolation from other systems, high reliability, and verification and validation are introduced as the essential requirements of a safety system communication network. Based on these requirements, optical cable, a star topology, synchronous transmission, point-to-point physical links, connection-oriented logical links, and MAC (medium access control) with fixed allocation are selected as the design elements. The proposed architecture will be applied as the basic communication network architecture of the SMART safety system.

  12. Beam alignment based on two-dimensional power spectral density of a near-field image.

    Science.gov (United States)

    Wang, Shenzhen; Yuan, Qiang; Zeng, Fa; Zhang, Xin; Zhao, Junpu; Li, Kehong; Zhang, Xiaolu; Xue, Qiao; Yang, Ying; Dai, Wanjun; Zhou, Wei; Wang, Yuanchen; Zheng, Kuixing; Su, Jingqin; Hu, Dongxia; Zhu, Qihua

    2017-10-30

    Beam alignment is crucial to high-power laser facilities and is used to adjust the laser beams quickly and accurately to meet stringent requirements of pointing and centering. In this paper, a novel alignment method is presented, which employs data processing of the two-dimensional power spectral density (2D-PSD) for a near-field image and resolves the beam pointing error relative to the spatial filter pinhole directly. Combined with a near-field fiducial mark, this achieves the beam alignment operation. It is experimentally demonstrated that this scheme realizes a far-field alignment precision of approximately 3% of the pinhole size. This scheme adopts only one near-field camera to construct the alignment system, which provides a simple, efficient, and low-cost way to align lasers.
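
    For reference, the two-dimensional power spectral density of a near-field image is simply the squared magnitude of its 2D Fourier transform; a minimal sketch is given below. The pointing/centering extraction described in the paper is not reproduced, and the synthetic beam profile is an assumption:

```python
import numpy as np

def psd2d(image: np.ndarray) -> np.ndarray:
    """Two-dimensional power spectral density of a near-field image."""
    img = image - image.mean()                       # remove the DC pedestal
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    return np.abs(spectrum) ** 2

# Synthetic near-field frame: a soft-edged square beam plus camera noise.
y, x = np.mgrid[-128:128, -128:128]
beam = np.exp(-((np.abs(x) / 80.0) ** 8 + (np.abs(y) / 80.0) ** 8))
frame = beam + 0.01 * np.random.default_rng(1).standard_normal(beam.shape)

P = psd2d(frame)
print(P.shape, float(P.max()))
```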

  13. TouchGrid: Touchpad pointing by recursively mapping taps to smaller display regions

    DEFF Research Database (Denmark)

    Hertzum, Morten; Hornbæk, Kasper

    2005-01-01

    Touchpad devices are widely used but lacking in pointing efficiency. The TouchGrid, an instance of what we term cell cursors, replaces moving the cursor through dragging the finger on a touchpad with tapping in different regions of the touchpad. The touchpad regions are recursively mapped...... to smaller display regions and thereby enable high-precision pointing without requiring high tapping precision. In an experiment, six subjects used the TouchGrid and a standard touchpad across different numbers of targets, distances to targets, and target widths. Whereas standard touchpad operation follows...... Fitts’ law, target selection time with the TouchGrid is a linear function of the required number of taps. The TouchGrid was significantly faster for small targets and for tasks requiring one tap, and marginally faster for two-tap tasks. Error rates tended to be higher with the TouchGrid than...

  14. Assessment of economic impact of offshore and coastal discharge requirements on present and future operations in the Gulf of Mexico. Final report

    International Nuclear Information System (INIS)

    Lindsey, R.

    1996-06-01

    The high potential costs of compliance associated with new effluent guidelines for offshore and coastal oil and gas operations could significantly affect the economics of finding, developing, and producing oil and gas in the Gulf of Mexico. This report characterizes the potential economic impacts of alternative treatment and discharge regulations for produced water on reserves and production in Gulf of Mexico coastal, territorial and outer continental shelf (OCS) waters, quantifying the impacts of both recent regulatory changes and possible more stringent requirements. The treatment technologies capable of meeting these requirements are characterized in terms of cost, performance, and applicability to coastal and offshore situations. As part of this analysis, an extensive database was constructed that includes oil and gas production forecasts by field, data on existing platforms, and the current treatment methods in place for produced water treatment and disposal on offshore facilities. This work provides the first comprehensive evaluation of the impacts of alternative regulatory requirements for produced water management and disposal in coastal and offshore areas of the Gulf of Mexico

  15. Practical challenges related to point of care testing

    Directory of Open Access Journals (Sweden)

    Julie L.V. Shaw

    2016-04-01

    Full Text Available Point of care testing (POCT) refers to laboratory testing that occurs near the patient, often at the patient bedside. POCT can be advantageous in situations requiring rapid turnaround time of test results for clinical decision making. There are many challenges associated with POCT, mainly related to quality assurance. POCT is performed by clinical staff rather than laboratory-trained individuals, which can lead to errors resulting from a lack of understanding of the importance of quality control and quality assurance practices. POCT is usually more expensive than testing performed in the central laboratory and requires a significant amount of support from the laboratory to ensure quality testing and meet accreditation requirements. Here, specific challenges related to POCT compliance with accreditation standards are discussed along with strategies that can be used to overcome these challenges. These areas include: documentation of POCT orders, charting of POCT results, as well as training and certification of individuals performing POCT. Factors to consider when implementing connectivity between POCT instruments and the electronic medical record are also discussed in detail and include: uni-directional versus bidirectional communication, linking patient demographic information with POCT software, the importance of positive patient identification and considering where to chart POCT results in the electronic medical record. Keywords: Point of care Testing, Laboratory accreditation, Medical directive, Results documentation, Electronic Medical Record, Transcription error, Connectivity, Positive patient identification

  16. History Matching Through a Smooth Formulation of Multiple-Point Statistics

    DEFF Research Database (Denmark)

    Melnikova, Yulia; Zunino, Andrea; Lange, Katrine

    2014-01-01

    We propose a smooth formulation of multiple-point statistics that enables us to solve inverse problems using gradient-based optimization techniques. We introduce a differentiable function that quantifies the mismatch between multiple-point statistics of a training image and of a given model. We show that, by minimizing this function, any continuous image can be gradually transformed into an image that honors the multiple-point statistics of the discrete training image. The solution to an inverse problem is then found by minimizing the sum of two mismatches: the mismatch with data and the mismatch with multiple-point statistics. As a result, in the framework of the Bayesian approach, such a solution belongs to a high posterior region. The methodology, while applicable to any inverse problem with a training-image-based prior, is especially beneficial for problems which require expensive...

  17. The Homogeneous Interior-Point Algorithm: Nonsymmetric Cones, Warmstarting, and Applications

    DEFF Research Database (Denmark)

    Skajaa, Anders

    algorithms for these problems is still limited. The goal of this thesis is to investigate and shed light on two computational aspects of homogeneous interior-point algorithms for convex conic optimization: The first part studies the possibility of devising a homogeneous interior-point method aimed at solving...... problems involving constraints that require nonsymmetric cones in their formulation. The second part studies the possibility of warmstarting the homogeneous interior-point algorithm for conic problems. The main outcome of the first part is the introduction of a completely new homogeneous interior......-point algorithm designed to solve nonsymmetric convex conic optimization problems. The algorithm is presented in detail and then analyzed. We prove its convergence and complexity. From a theoretical viewpoint, it is fully competitive with other algorithms and from a practical viewpoint, we show that it holds lots...

  18. The scalar-photon 3-point vertex in massless quenched scalar QED

    International Nuclear Information System (INIS)

    Concha-Sánchez, Y; Gutiérrez-Guerrero, L X; Fernández-Rangel, L A

    2016-01-01

    Nonperturbative studies of Schwinger-Dyson equations (SDEs) require their infinite, coupled tower to be truncated in order to reduce them to a practically solvable set. In this connection, a physically acceptable ansatz for the three-point vertex is the favored choice. Scalar quantum electrodynamics (sQED) provides a simple and neat platform to address this problem. The most general form of the scalar-photon three-point vertex can be expressed in terms of only two independent form factors, longitudinal and transverse. Ball and Chiu have demonstrated that the longitudinal vertex is fixed by requiring the Ward-Fradkin-Green-Takahashi identity (WFGTI), while the transverse vertex remains undetermined. In massless quenched sQED, we propose the transverse part of the nonperturbative scalar-photon vertex. (paper)
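
    For reference, the construction referred to above can be written out as follows; this is the standard scalar Ball-Chiu decomposition under commonly used conventions (which may differ from the paper's), with q = k - p and Δ the scalar propagator:

```latex
% Hedged sketch of the standard construction (assumed conventions, not the paper's):
\begin{align}
  q_\mu \, \Gamma^\mu(k,p) &= \Delta^{-1}(k) - \Delta^{-1}(p)
      && \text{(WFGTI)} \\
  \Gamma_L^\mu(k,p) &= (k+p)^\mu \,
      \frac{\Delta^{-1}(k) - \Delta^{-1}(p)}{k^2 - p^2}
      && \text{(longitudinal, Ball--Chiu type)}
\end{align}
% At tree level, \Delta^{-1}(p) = p^2 - m^2 and \Gamma_L^\mu reduces to the bare
% vertex (k+p)^\mu; the transverse part, satisfying q_\mu \Gamma_T^\mu = 0, is
% left undetermined by the identity.
```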

  19. Sustainable optimization of wooden window profiles to attain the requirements of EnEV 2012; Nachhaltige Optimierung von Holzfensterprofilen zur Erreichung der Anforderungen der EnEV 2012

    Energy Technology Data Exchange (ETDEWEB)

    Bliemetsrieder, Benno; Sack, Norbert

    2011-07-01

    Ongoing and future specifications of the Energy Savings Ordinance (EnEV), rising energy costs and additional incentives provided by government sponsored measures constantly call for construction techniques that are more energy-efficient. Since this results in the requirements on individual construction products becoming increasingly stringent, both for new structures and in the segment of energy-related refurbishment of buildings, elements of windows and exterior doors, too, must keep pace with this development, and there must be an improvement in the heat transfer coefficients (U values) of these constructions. In future, the increasing stringency will require considerable improvement in the heat transfer coefficients of window frames (Uf value), in addition to improvements in the glazing. Therefore, the aim of the planned research project was to formulate concepts for optimising the thermal protection of timber window profiles as well as suggestions on how to meet the increasing requirements while taking all window-relevant specifications into account. (orig.)

  20. Analysis of protective and cytotoxic immune responses in vivo against metabolically inactivated and untreated cells of a mutagenized tumor line (requirements for tumor immunogenicity)

    International Nuclear Information System (INIS)

    Wehrmaker, A.; Lehmann, V.; Droege, W.

    1986-01-01

    The immunogenicity of a mutagenized subline (ESb-D) of the weakly immunogenic T-cell lymphoma L 5178 Y ESb has been characterized. The injection of 10(6) ESb-D cells ip did not establish lethal tumors in untreated DBA/2 mice but established tumors in sublethally irradiated mice. Injection of ESb-D cells into otherwise untreated DBA/2 mice established also a state of protective immunity against the subsequent injection of otherwise lethal doses of ESb tumor cells. Protection was only obtained after injection of intact but not UV-irradiated or mitomycin-C-treated ESb-D cells. A direct T-cell-mediated cytotoxic activity was also demonstrable in the spleen cells of DBA/2 mice after injection of ESb-D cells but not ESb cells. The cytotoxic activity was variant specific for ESb-D target cells, and it was induced only with intact but not UV-irradiated or mitomycin C-treated ESb-D cells. This suggested that the induction of protective and cytotoxic immunity may require the persistence of the antigen or unusually high antigen doses. The in vivo priming for a secondary in vitro cytotoxic response, in contrast, was achieved with intact and also with mitomycin C-treated ESb-D cells but again not with UV-irradiated ESb-D cells. This indicated that the metabolic activity was a minimal requirement for the in vivo immunogenicity of the ESb-D tumor line. The secondary cytotoxic activity was demonstrable on ESb-D and ESb target cells and could be restimulated in vitro about equally well with ESb-D and ESb cells. But the in vivo priming was again only obtained with ESb-D cells and not with ESb cells. These experiments thus demonstrated that the requirements for immunogenicity are more stringent in vivo than in vitro, and more stringent for the induction of direct cytotoxic and protective immunity in vivo than for the in vivo priming for secondary in vitro responses

  1. Enhancement of precision and reduction of measuring points in tomographic reconstructions

    International Nuclear Information System (INIS)

    Lustfeld, H.; Hirschfeld, J.A.; Reissel, M.; Steffen, B.

    2011-01-01

    Accurate external measurements are required in tomographic problems to obtain a reasonable knowledge of the internal structures. Crucial is the distribution of the external measuring points. We suggest a procedure how to systematically optimize this distribution viz. to increase the precision (i.e. to shrink error bars) of the reconstruction by detecting the important and by eliminating the irrelevant measuring points. In a realistic numerical example we apply our scheme to magnetotomography of fuel cells. The result is striking: Starting from a smooth distribution of measuring points on a surface of a cuboid around the fuel cell, the number of measuring points can systematically be reduced by more than 90%. At the same time the precision increases by a factor of nearly 3.

  2. 40 CFR 141.170 - General requirements.

    Science.gov (United States)

    2010-07-01

    ... recontamination by surface water runoff and a point downstream before or at the first customer for filtered...) Compliance with the profiling and benchmark requirements under the provisions of § 141.172. (b) A public...

  3. Distributed maximum power point tracking in wind micro-grids

    Directory of Open Access Journals (Sweden)

    Carlos Andrés Ramos-Paja

    2012-06-01

    Full Text Available With the aim of reducing the hardware requirements in micro-grids based on wind generators, a distributed maximum power point tracking algorithm is proposed. Such a solution reduces the number of current sensors and processing devices needed to maximize the power extracted from the micro-grid, reducing the application cost. The analysis of the optimal operating points of the wind generator was performed experimentally, which in addition provides realistic model parameters. Finally, the proposed solution was validated by means of detailed simulations performed in the power electronics software PSIM, contrasting the achieved performance with traditional solutions.
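
    A conventional perturb-and-observe (hill-climbing) tracker illustrates the kind of per-generator logic that a distributed MPPT scheme replicates at each turbine; the sketch below is generic and is not the algorithm proposed in the paper (the read_power/set_speed interfaces and the toy power curve are assumptions):

```python
def perturb_and_observe(read_power, set_speed, speed0: float,
                        step: float = 0.05, n_steps: int = 200) -> float:
    """Generic hill-climbing MPPT: nudge the operating point (here a rotor-speed
    reference) and keep moving in the direction that increases extracted power.
    `read_power` and `set_speed` are assumed converter interfaces."""
    speed = speed0
    set_speed(speed)
    last_p = read_power()
    direction = 1.0
    for _ in range(n_steps):
        speed += direction * step
        set_speed(speed)
        p = read_power()
        if p < last_p:              # wrong way: reverse the perturbation
            direction = -direction
        last_p = p
    return speed

# Toy usage with a made-up power curve that peaks at speed = 1.3.
state = {"speed": 1.0}
power_curve = lambda w: max(0.0, 1.0 - (w - 1.3) ** 2)
final_speed = perturb_and_observe(lambda: power_curve(state["speed"]),
                                  lambda w: state.update(speed=w), speed0=1.0)
print(round(final_speed, 2))        # settles into oscillation near the optimum
```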

  4. Diagnostics

    DEFF Research Database (Denmark)

    Donné, A.J.H.; Costley, A.E.; Barnsley, R.

    2007-01-01

    of the measurements—time and spatial resolutions, etc—will in some cases be more stringent. Many of the measurements will be used in the real time control of the plasma driving a requirement for very high reliability in the systems (diagnostics) that provide the measurements. The implementation of diagnostic systems...... on ITER is a substantial challenge. Because of the harsh environment (high levels of neutron and gamma fluxes, neutron heating, particle bombardment) diagnostic system selection and design has to cope with a range of phenomena not previously encountered in diagnostic design. Extensive design and R......&D is needed to prepare the systems. In some cases the environmental difficulties are so severe that new diagnostic techniques are required. The starting point in the development of diagnostics for ITER is to define the measurement requirements and develop their justification. It is necessary to include all...

  5. Fuzzy Controller Design Using FPGA for Photovoltaic Maximum Power Point Tracking

    OpenAIRE

    Basil M Hamed; Mohammed S. El-Moghany

    2012-01-01

    The photovoltaic cell has an optimum operating point at which maximum power can be obtained. To obtain maximum power from a photovoltaic array, a photovoltaic power system usually requires a Maximum Power Point Tracking (MPPT) controller. This paper presents a small-power photovoltaic control system for MPPT based on fuzzy control, designed and implemented with FPGA technology. The system is composed of a photovoltaic module, a buck converter and the fuzzy logic controller implemented on the FPGA for controlling the on/off time of the MOSF...

  6. Point-to-point people with purpose—Exploring the possibility of a commercial traveler market for point-to-point suborbital space transportation

    Science.gov (United States)

    Webber, Derek

    2013-12-01

    An argument was made at the First Arcachon Conference on Private Human Access to Space in 2008 [1] that some systematic market research should be conducted into potential market segments for point-to-point suborbital space transportation (PtP), in order to understand whether a commercial market exists which might augment possible government use for such a vehicle. The cargo market potential was subsequently addressed via desk research, and the results, which resulted in a pessimistic business case outlook, were presented in [2]. The same desk research approach is now used in this paper to address the potential business and wealthy individual passenger traveler market segment ("point-to-point people with purpose"). The results, with the assumed ticket pricing, are not encouraging.

  7. A threshold auto-adjustment algorithm of feature points extraction based on grid

    Science.gov (United States)

    Yao, Zili; Li, Jun; Dong, Gaojie

    2018-02-01

    When dealing with high-resolution digital images, detection of feature points is usually the first important step. The set of valid feature points depends on the threshold. If the threshold is too low, too many feature points will be detected and they may aggregate in texture-rich regions, which not only slows down feature description but also burdens the subsequent processing; if the threshold is set too high, feature points will be missing in poorly textured areas. To solve these problems, this paper proposes a grid-based threshold auto-adjustment method for feature extraction. The image is divided into a number of grid cells, and a threshold is set in every local cell for extracting the feature points. When the number of feature points does not meet the requirement, the threshold is adjusted automatically to change the final number of feature points. The experimental results show that the feature points produced by our method are more uniform and representative, which avoids the aggregation of feature points and greatly reduces the complexity of subsequent work.
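
    A rough sketch of the grid-with-threshold-adjustment idea, using OpenCV's FAST detector as the underlying extractor; the halving rule and the per-cell cap below are our own simplifications rather than the paper's exact scheme:

```python
import cv2
import numpy as np

def grid_features(gray: np.ndarray, rows: int = 8, cols: int = 8,
                  target: int = 20, t0: int = 40, t_min: int = 5):
    """Detect FAST keypoints per grid cell, lowering the threshold in cells that
    yield fewer than `target` keypoints, and capping cells that are too crowded."""
    h, w = gray.shape
    keypoints = []
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            cell = gray[y0:y1, x0:x1]
            t, kps = t0, []
            while t >= t_min:
                detector = cv2.FastFeatureDetector_create(threshold=t)
                kps = detector.detect(cell, None)
                if len(kps) >= target:
                    break
                t //= 2                       # too few points: relax the threshold
            for kp in kps[:2 * target]:       # cap crowded cells
                kp.pt = (kp.pt[0] + x0, kp.pt[1] + y0)   # back to image coordinates
                keypoints.append(kp)
    return keypoints

# Usage (hypothetical file name):
# img = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
# kps = grid_features(img)
```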

  8. Performance testing of 3D point cloud software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems have been widely used in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available on the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as commercial suites in working set and commit size tests.

  9. Performance testing of 3D point cloud software

    Directory of Open Access Journals (Sweden)

    M. Varela-González

    2013-10-01

    Full Text Available LiDAR systems have been widely used in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available on the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as commercial suites in working set and commit size tests.

  10. Establishment of the Co-C Eutectic Fixed-Point Cell for Thermocouple Calibrations at NIMT

    Science.gov (United States)

    Ongrai, O.; Elliott, C. J.

    2017-08-01

    In 2015, NIMT first established a Co-C eutectic temperature reference (fixed-point) cell measurement capability for thermocouple calibration to support the requirements of Thailand's heavy industries and secondary laboratories. The Co-C eutectic fixed-point cell is a facility transferred from NPL, where the design was developed through European and UK national measurement system projects. In this paper, we describe the establishment of a Co-C eutectic fixed-point cell for thermocouple calibration at NIMT. This paper demonstrates achievement of the required furnace uniformity, the Co-C plateau realization and the comparison data between NIMT and NPL Co-C cells by using the same standard Pt/Pd thermocouple, demonstrating traceability. The NIMT measurement capability for noble metal type thermocouples at the new Co-C eutectic fixed point (1324.06°C) is estimated to be within ± 0.60 K (k=2). This meets the needs of Thailand's high-temperature thermocouple users—for which previously there has been no traceable calibration facility.

  11. MODIL cryocooler producibility demonstration project results

    International Nuclear Information System (INIS)

    Cruz, G.E.; Franks, R.M.

    1993-01-01

    The production of large quantities of spacecraft needed by SDIO will require a cultural change in design and production practices. Low-rate production and the need for exceedingly high reliability have driven the industry to custom-designed, hand-crafted, and exhaustively tested satellites. These factors have militated against employing the design and manufacturing cost reduction methods commonly used in tactical missile production. Additional challenges to achieving production efficiencies are presented by the SDI spacecraft mission requirements. IR sensor systems, for example, comprise subassemblies and components that require the design, manufacture, and maintenance of ultra-precision tolerances over challenging operational lifetimes. These IR sensors demand the use of reliable, closed-loop, cryogenic refrigerators or active cryocoolers to meet stringent system acquisition and pointing requirements. The authors summarize some spacecraft cryocooler requirements and discuss observations regarding industry's current production capabilities of cryocoolers. The results of the Lawrence Livermore National Laboratory (LLNL) Spacecraft Fabrication and Test (SF and T) MODIL's Phase I producibility demonstration project are presented

  12. Hardware-accelerated Point Generation and Rendering of Point-based Impostors

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas

    2005-01-01

    This paper presents a novel scheme for generating points from triangle models. The method is fast and lends itself well to implementation using graphics hardware. The triangle to point conversion is done by rendering the models, and the rendering may be performed procedurally or by a black box API. I describe the technique in detail and discuss how the generated point sets can easily be used as impostors for the original triangle models used to create the points. Since the points reside solely in GPU memory, these impostors are fairly efficient. Source code is available online.

  13. Limits in point to point resolution of MOS based pixels detector arrays

    Science.gov (United States)

    Fourches, N.; Desforge, D.; Kebbiri, M.; Kumar, V.; Serruys, Y.; Gutierrez, G.; Leprêtre, F.; Jomard, F.

    2018-01-01

    In high energy physics, point-to-point resolution is a key prerequisite for particle detector pixel arrays. Current and future experiments require the development of inner detectors able to resolve the tracks of particles down to the micron range. Present-day technologies, although not fully implemented in actual detectors, can reach a 5-μm limit, this limit being based on statistical measurements, with a pixel pitch in the 10 μm range. This paper is devoted to the evaluation of the building blocks for use in pixel arrays enabling accurate tracking of charged particles. Based on simulations, we make here a quantitative evaluation of the physical and technological limits on pixel size. Attempts to design small pixels based on SOI technology will be briefly recalled here. A design based on CMOS-compatible technologies that allows a reduction of the pixel size below the micrometer is introduced here. Its physical principle relies on a buried carrier-localizing collecting gate. The fabrication process needed by this pixel design can be based on existing process steps used in silicon microelectronics. The pixel characteristics will be discussed as well as the design of pixel arrays. The existing bottlenecks and how to overcome them will be discussed in the light of recent ion implantation and material characterization experiments.

  14. Pointing Verification Method for Spaceborne Lidars

    Directory of Open Access Journals (Sweden)

    Axel Amediek

    2017-01-01

    Full Text Available High precision acquisition of atmospheric parameters from the air or space by means of lidar requires accurate knowledge of laser pointing. Discrepancies between the assumed and actual pointing can introduce large errors due to the Doppler effect or a wrongly assumed air pressure at ground level. In this paper, a method for precisely quantifying these discrepancies for airborne and spaceborne lidar systems is presented. The method is based on the comparison of ground elevations derived from the lidar ranging data with high-resolution topography data obtained from a digital elevation model and allows for the derivation of the lateral and longitudinal deviation of the laser beam propagation direction. The applicability of the technique is demonstrated by using experimental data from an airborne lidar system, confirming that geo-referencing of the lidar ground spot trace with an uncertainty of less than 10 m with respect to the used digital elevation model (DEM) can be obtained.
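
    The elevation-comparison idea can be sketched as a simple grid search: shift the geolocated ground returns by trial along- and across-track offsets and keep the offset that minimises the disagreement with the reference DEM. The sketch below assumes a `dem_elevation` lookup function and metric coordinates; it is an illustration, not the authors' implementation.

    ```python
    # Grid search over along-/across-track shifts of lidar ground returns against a DEM.
    import numpy as np

    def pointing_offset(ground_xy, ground_z, dem_elevation, track_dir, search_m=50.0, step_m=1.0):
        """Return the (along, across)-track shift [m] minimising RMSE against the DEM."""
        track_dir = np.asarray(track_dir, dtype=float)
        track_dir = track_dir / np.linalg.norm(track_dir)
        across_dir = np.array([-track_dir[1], track_dir[0]])
        offsets = np.arange(-search_m, search_m + step_m, step_m)
        best_shift, best_rmse = None, np.inf
        for da in offsets:
            for dc in offsets:
                shifted = ground_xy + da * track_dir + dc * across_dir
                rmse = np.sqrt(np.mean((ground_z - dem_elevation(shifted)) ** 2))
                if rmse < best_rmse:
                    best_shift, best_rmse = (da, dc), rmse
        return best_shift, best_rmse
    ```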

  15. Point-Cloud Compression for Vehicle-Based Mobile Mapping Systems Using Portable Network Graphics

    Science.gov (United States)

    Kohira, K.; Masuda, H.

    2017-09-01

    A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network if point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.
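
    A minimal sketch of the image-mapping idea follows: each return is placed in a pixel indexed by scan line and angular step, the range is quantised to 16 bits, and the image is written as a lossless PNG. The quantisation step and array layout are assumptions for illustration, not the authors' exact scheme.

    ```python
    # Pack per-return ranges into a 2D 16-bit image and let PNG's lossless DEFLATE compress it.
    import numpy as np
    from PIL import Image

    def ranges_to_png(line_idx, angle_idx, ranges, path, range_res=0.005):
        """line_idx/angle_idx: integer pixel indices per return; ranges in metres."""
        h, w = line_idx.max() + 1, angle_idx.max() + 1
        img = np.zeros((h, w), dtype=np.uint16)            # 0 means "no return"
        img[line_idx, angle_idx] = np.clip(ranges / range_res, 1, 65535).astype(np.uint16)
        Image.fromarray(img).save(path)                    # Pillow writes uint16 as 16-bit PNG

    def png_to_ranges(path, range_res=0.005):
        img = np.asarray(Image.open(path), dtype=np.uint16)
        mask = img > 0
        return mask, img.astype(np.float64) * range_res    # decoded ranges in metres
    ```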

  16. POINT-CLOUD COMPRESSION FOR VEHICLE-BASED MOBILE MAPPING SYSTEMS USING PORTABLE NETWORK GRAPHICS

    Directory of Open Access Journals (Sweden)

    K. Kohira

    2017-09-01

    Full Text Available A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network if point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.

  17. Hinkley Point 'C' power station public inquiry: outline statement of case

    International Nuclear Information System (INIS)

    1988-05-01

    This outline statement relates to the public inquiry to be held into the planning application by the Central Electricity Generating Board (CEGB) to construct a 1200 MW Pressurized Water Reactor (PWR) power station at Hinkley Point (Hinkley Point 'C') in the United Kingdom, adjacent to an existing nuclear power station. The inquiry will consider economic, safety, environmental and planning matters relevant to the application and the implications for agriculture and local amenities of the re-aligning of two 400 kV overhead transmission lines. The outline statement contains submissions on: policy context and approach; the requirement for Hinkley Point 'C'; design and safety; local issues. (UK)

  18. Constraints from conformal symmetry on the three point scalar correlator in inflation

    International Nuclear Information System (INIS)

    Kundu, Nilay; Shukla, Ashish; Trivedi, Sandip P.

    2015-01-01

    Using symmetry considerations, we derive Ward identities which relate the three point function of scalar perturbations produced during inflation to the scalar four point function, in a particular limit. The derivation assumes approximate conformal invariance, and the conditions for the slow roll approximation, but is otherwise model independent. The Ward identities allow us to deduce that the three point function must be suppressed in general, being of the same order of magnitude as in the slow roll model. They also fix the three point function in terms of the four point function, up to one constant which we argue is generically suppressed. Our approach is based on analyzing the wave function of the universe, and the Ward identities arise by imposing the requirements of spatial and time reparametrization invariance on it.

  19. StreamMap: Smooth Dynamic Visualization of High-Density Streaming Points.

    Science.gov (United States)

    Li, Chenhui; Baciu, George; Han, Yu

    2018-03-01

    Interactive visualization of streaming points for real-time scatterplots and linear blending of correlation patterns is increasingly becoming the dominant mode of visual analytics for both big data and streaming data from active sensors and broadcasting media. To better visualize and interact with inter-stream patterns, it is generally necessary to smooth out gaps or distortions in the streaming data. Previous approaches either animate the points directly or present a sampled static heat-map. We propose a new approach, called StreamMap, to smoothly blend high-density streaming points and create a visual flow that emphasizes the density pattern distributions. In essence, we present three new contributions for the visualization of high-density streaming points. The first contribution is a density-based method called super kernel density estimation that aggregates streaming points using an adaptive kernel to solve the overlapping problem. The second contribution is a robust density morphing algorithm that generates several smooth intermediate frames for a given pair of frames. The third contribution is a trend representation design that can help convey the flow directions of the streaming points. The experimental results on three datasets demonstrate the effectiveness of StreamMap when dynamic visualization and visual analysis of trend patterns on streaming points are required.
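
    A plain, fixed-bandwidth density aggregation conveys the flavour of the first two contributions; the adaptive "super kernel density estimation" and the robust morphing algorithm of the paper are more involved than this sketch, which uses assumed grid and bandwidth parameters.

    ```python
    # Grid-based kernel density aggregation for a batch of streaming 2D points,
    # plus a naive linear blend between two density frames.
    import numpy as np

    def density_frame(points, extent, shape=(256, 256), bandwidth=5.0):
        """Accumulate 2D points into a histogram grid and smooth it with a separable Gaussian."""
        xmin, xmax, ymin, ymax = extent
        hist, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                    bins=shape, range=[[xmin, xmax], [ymin, ymax]])
        radius = int(3 * bandwidth)
        x = np.arange(-radius, radius + 1)
        kernel = np.exp(-0.5 * (x / bandwidth) ** 2)
        kernel /= kernel.sum()
        smoothed = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 0, hist)
        smoothed = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, smoothed)
        return smoothed

    def morph(frame_a, frame_b, t):
        """Linear blend between two density frames (the paper uses a robust density morph)."""
        return (1 - t) * frame_a + t * frame_b
    ```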

  20. Gran method for end point anticipation in monosegmented flow titration

    Directory of Open Access Journals (Sweden)

    Aquino Emerson V

    2004-01-01

    Full Text Available An automatic potentiometric monosegmented flow titration procedure based on the Gran linearisation approach has been developed. The controlling program can estimate the end point of the titration after the addition of three or four aliquots of titrant. Alternatively, the end point can be determined by the second derivative procedure. In this case, additional volumes of titrant are added until the vicinity of the end point, and three points before and after the stoichiometric point are used for end point calculation. The performance of the system was assessed by the determination of chloride in isotonic beverages and parenteral solutions. The system employs a tubular Ag2S/AgCl indicator electrode. A typical titration, performed according to the IUPAC definition, requires only 60 mL of sample and about the same volume of titrant (AgNO3 solution). A complete titration can be carried out in 1 - 5 min. The accuracy and precision (relative standard deviation of ten replicates) are 2% and 1% for the Gran and 1% and 0.5% for the Gran/derivative end point determination procedures, respectively. The proposed system reduces the time to perform a titration, ensuring low sample and reagent consumption, and full automatic sampling and titrant addition in a calibration-free titration protocol.
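
    The Gran anticipation step can be illustrated with a few lines of code: linearise three or four (volume, potential) readings with the Gran function and extrapolate the fitted line to the volume axis. The sign convention, electrode slope and the synthetic readings below are assumptions for illustration only, not the paper's calibration.

    ```python
    # Gran end-point anticipation from a few pre-equivalence readings.
    import numpy as np

    def gran_end_point(v_titrant, e_mV, v_sample, slope_mV=59.16):
        """Estimate the equivalence volume by extrapolating the Gran function to zero."""
        v = np.asarray(v_titrant, dtype=float)
        e = np.asarray(e_mV, dtype=float)
        g = (v_sample + v) * 10 ** (e / slope_mV)   # Gran function, decreases towards zero
        a, b = np.polyfit(v, g, 1)                  # fit g ~ a*v + b
        return -b / a                               # volume at which the fitted line crosses zero

    # Synthetic example with three aliquots, as in the abstract (values are made up):
    print(gran_end_point([0.2, 0.4, 0.6], [150.0, 130.0, 105.0], v_sample=1.0))
    ```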

  1. Detecting corner points from digital curves

    International Nuclear Information System (INIS)

    Sarfraz, M.

    2011-01-01

    Corners in digital images give important clues for shape representation, recognition, and analysis. Since dominant information regarding shape is usually available at the corners, they provide important features for various real-life applications in disciplines like computer vision, pattern recognition, and computer graphics. Corners are robust features in the sense that they provide important information regarding objects under translation, rotation and scale change. They are also important from the viewpoint of understanding human perception of objects. They play a crucial role in decomposing or describing digital curves. They are also used in scale space theory, image representation, stereo vision, motion tracking, image matching, building mosaics and font designing systems. If the corner points are identified properly, a shape can be represented in an efficient and compact way with sufficient accuracy. Corner detection schemes, based on their applications, can be broadly divided into two categories: binary (suitable for binary images) and gray level (suitable for gray level images). Corner detection approaches for binary images usually involve segmenting the image into regions and extracting boundaries from those regions that contain them. The techniques for gray level images can be categorized into two classes: (a) template based and (b) gradient based. The template-based techniques utilize correlation between a sub-image and a template of a given angle. A corner point is selected by finding the maximum of the correlation output. Gradient-based techniques require computing the curvature of an edge that passes through a neighborhood in a gray level image. Many corner detection algorithms have been proposed in the literature, which can be broadly divided into two parts: one detects corner points from grayscale images and the other relates to boundary-based corner detection. This contribution mainly deals with techniques adopted for the latter approach.
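
    For the boundary-based approach, a classic k-cosine style measure gives a compact illustration: estimate the turning angle at each boundary point from chords of length k and keep local maxima above a threshold. The parameters below are illustrative and not taken from any particular detector surveyed here.

    ```python
    # Simple curvature-based corner detector for an ordered, closed digital curve.
    import numpy as np

    def k_cosine_corners(curve, k=5, angle_thresh_deg=40.0):
        """curve: (N, 2) ordered boundary points. Returns indices of detected corner points."""
        curve = np.asarray(curve, dtype=float)
        n = len(curve)
        a = curve[(np.arange(n) - k) % n] - curve          # backward chord at each point
        b = curve[(np.arange(n) + k) % n] - curve          # forward chord at each point
        cosang = np.einsum("ij,ij->i", a, b) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
        angles = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        sharpness = 180.0 - angles                          # ~0 on straight runs, large at corners
        corners = [i for i in range(n)
                   if sharpness[i] > angle_thresh_deg
                   and sharpness[i] == sharpness[np.arange(i - k, i + k + 1) % n].max()]
        return corners
    ```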

  2. Fort Peck-Wolf Point transmission line project, Montana

    International Nuclear Information System (INIS)

    1992-01-01

    The primary objective of the project is to replace the existing 36-mile Fort Peck-Wolf Point transmission line, which has reached the end of its useful service life. Presently, the overall condition of this existing section of the 47-year-old line is poor. Frequent repairs have been required because of the absence of overhead ground wires. The continued maintenance of the line will become more expensive and customer interruptions will persist because of the damage due to lightning. The expense of replacing shell-rotted poles, and the concern for the safety of the maintenance personnel because of hazards caused by severe shell rot, are also of primary importance. The operational and maintenance problems, coupled with power system simulation studies, demonstrate the need for improvements in the Wolf Point area to serve area loads. Western's Wolf Point Substation is an important point of interconnection for the power output from the Fort Peck Dam to area loads as far away as Williston, North Dakota. The proposed transmission line replacement would assure that there will continue to be reliable transmission capacity available to serve area electrical loads, as well as provide a reliable second high-voltage transmission path from the Fort Peck generation to back up a loss of the Fort Peck-Wolf Point 115-kV Line No. 1.

  3. The MCM-binding protein ETG1 aids sister chromatid cohesion required for postreplicative homologous recombination repair.

    Directory of Open Access Journals (Sweden)

    Naoki Takahashi

    2010-01-01

    Full Text Available The DNA replication process represents a source of DNA stress that causes potentially spontaneous genome damage. This effect might be strengthened by mutations in crucial replication factors, requiring the activation of DNA damage checkpoints to enable DNA repair before anaphase onset. Here, we demonstrate that depletion of the evolutionarily conserved minichromosome maintenance helicase-binding protein ETG1 of Arabidopsis thaliana resulted in a stringent late G2 cell cycle arrest. This arrest correlated with a partial loss of sister chromatid cohesion. The lack-of-cohesion phenotype was intensified in plants without functional CTF18, a replication fork factor needed for cohesion establishment. The synergistic effect of the etg1 and ctf18 mutants on sister chromatid cohesion strengthened the impact on plant growth of the replication stress caused by ETG1 deficiency because of inefficient DNA repair. We conclude that the ETG1 replication factor is required for efficient cohesion and that cohesion establishment is essential for proper development of plants suffering from endogenous DNA stress. Cohesion defects observed upon knockdown of its human counterpart suggest an equally important developmental role for the orthologous mammalian ETG1 protein.

  4. The Extraction of Vegetation Points from LiDAR Using 3D Fractal Dimension Analyses

    Directory of Open Access Journals (Sweden)

    Haiquan Yang

    2015-08-01

    Full Text Available Light Detection and Ranging (LiDAR), a high-precision technique used for acquiring three-dimensional (3D) surface information, is widely used to study surface vegetation information. Moreover, the extraction of a vegetation point set from the LiDAR point cloud is a basic starting-point for vegetation information analysis, and an important part of its further processing. To extract the vegetation point set completely and to describe the different spatial morphological characteristics of various features in a LiDAR point cloud, we have used 3D fractal dimensions. We discovered that every feature has its own distinctive 3D fractal dimension interval. Based on the 3D fractal dimensions of tall trees, we propose a new method for the extraction of vegetation using airborne LiDAR. According to this method, target features can be distinguished based on their morphological characteristics. The non-ground points acquired by filtering are processed by region growing segmentation and the morphological characteristics are evaluated by 3D fractal dimensions to determine the features required for the determination of the point set for tall trees. Avon, New York, USA was selected as the study area to test the method and the result proves the method’s efficiency. Thus, this approach is feasible. Additionally, the method uses the 3D coordinate properties of the LiDAR point cloud and does not require additional information, such as return intensity, giving it a larger scope of application.
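
    A simple box-counting estimate conveys how a 3D fractal dimension can be attached to a segmented point region; the estimator, scale range and thresholds actually used in the paper may differ from this sketch.

    ```python
    # Box-counting estimate of the fractal dimension of a 3D point set.
    import numpy as np

    def box_counting_dimension(points, n_scales=6):
        """points: (N, 3) array. Returns the slope of log(occupied boxes) vs log(1/box size)."""
        pts = points - points.min(axis=0)
        extent = pts.max()
        sizes = extent / (2 ** np.arange(1, n_scales + 1))
        counts = []
        for s in sizes:
            idx = np.floor(pts / s).astype(int)
            counts.append(len(np.unique(idx, axis=0)))     # occupied boxes at this scale
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        return slope

    # A filled volume of points should give a value near 3, a planar roof patch near 2:
    rng = np.random.default_rng(0)
    print(box_counting_dimension(rng.random((20000, 3)), n_scales=4))
    ```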

  5. Asymmetrical floating point array processors, their application to exploration and exploitation

    Energy Technology Data Exchange (ETDEWEB)

    Geriepy, B L

    1983-01-01

    An asymmetrical floating point array processor is a special-purpose scientific computer which operates under asymmetrical control of a host computer. Although an array processor can receive fixed point input and produce fixed point output, its primary mode of operation is floating point. The first generation of array processors was oriented towards time series information. The next generation of array processors has proved much more versatile, and their applicability ranges from petroleum reservoir simulation to speech synthesis. Array processors are becoming commonplace in mining, the primary usage being construction of grids, by usual methods or by kriging. The Australian mining community is among the world's leaders in regard to computer-assisted exploration and exploitation systems. Part of this leadership role must be providing guidance to computer vendors in regard to current and future requirements.

  6. Microprocessor-controlled step-down maximum-power-point tracker for photovoltaic systems

    Science.gov (United States)

    Mazmuder, R. K.; Haidar, S.

    1992-12-01

    An efficient maximum power point tracker (MPPT) has been developed and can be used with a photovoltaic (PV) array and a load which requires a lower voltage than the PV array voltage to operate. The MPPT makes the PV array operate at the maximum power point (MPP) under all insolation and temperature conditions, which ensures that the maximum amount of available PV power is delivered to the load. The performance of the MPPT has been studied under different insolation levels.
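
    The abstract does not name the control law; a generic perturb-and-observe loop, shown below only as an illustration, is one common way such a microprocessor-controlled tracker can hold the array near its maximum power point. The `read_power` and `set_duty` callbacks are placeholders for hardware access.

    ```python
    # Generic perturb-and-observe hill-climbing loop for a switching converter MPPT.
    def perturb_and_observe(read_power, set_duty, d0=0.5, step=0.01, iterations=1000):
        """Perturb the duty cycle, observe array power, and reverse direction when power drops."""
        duty, direction, prev_p = d0, +1, read_power()
        for _ in range(iterations):
            duty = min(max(duty + direction * step, 0.0), 1.0)
            set_duty(duty)
            p = read_power()
            if p < prev_p:            # last perturbation reduced power: reverse it
                direction = -direction
            prev_p = p
        return duty
    ```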

  7. Estimating ISABELLE shielding requirements

    International Nuclear Information System (INIS)

    Stevens, A.J.; Thorndike, A.M.

    1976-01-01

    Estimates were made of the shielding thicknesses required at various points around the ISABELLE ring. Both hadron and muon requirements are considered. Radiation levels at the outside of the shield and at the BNL site boundary are kept at or below 1000 mrem per year and 5 mrem/year respectively. Muon requirements are based on the Wang formula for pion spectra, and the hadron requirements on the hadron cascade program CYLKAZ of Ranft. A muon shield thickness of 77 meters of sand is indicated outside the ring in one area, and hadron shields equivalent to from 2.7 to 5.6 meters in thickness of sand above the ring. The suggested safety allowance would increase these values to 86 meters and 4.0 to 7.2 meters respectively. There are many uncertainties in such estimates, but these last figures are considered to be rather conservative

  8. An examination of the concept of driving point receptance

    Science.gov (United States)

    Sheng, X.; He, Y.; Zhong, T.

    2018-04-01

    In the field of vibration, driving point receptance is a well-established and widely applied concept. However, as demonstrated in this paper, when a driving point receptance is calculated using the finite element (FE) method with solid elements, it does not converge as the FE mesh becomes finer, suggesting that there is a singularity. Hence, the concept of driving point receptance deserves a rigorous examination. In this paper, it is firstly shown that, for a point harmonic force applied on the surface of an elastic half-space, the Boussinesq formula can be applied to calculate the displacement amplitude of the surface if the response point is sufficiently close to the load. Secondly, by applying the Betti reciprocal theorem, it is shown that the displacement of an elastic body near a point harmonic force can be decomposed into two parts, with the first one being the displacement of an elastic half-space. This decomposition is useful, since it provides a solid basis for the introduction of a contact spring between a wheel and a rail in interaction. However, according to the Boussinesq formula, this decomposition also leads to the conclusion that a driving point receptance is infinite (singular), and would be undefinable. Nevertheless, driving point receptances have been calculated using different methods. Since the singularity identified in this paper was not appreciated, no account was given to the singularity in these calculations. Thus, the validity of these calculation methods must be examined. This constitutes the third part of the paper. As the final development of the paper, the above decomposition is utilised to define and determine driving point receptances required for dealing with wheel/rail interactions.
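
    For reference, the classical static Boussinesq result for a normal point load on an elastic half-space already shows the 1/r behaviour behind the singularity discussed above; the paper works with the harmonic, frequency-dependent counterpart, so this is quoted only as background.

    ```latex
    % Static Boussinesq surface displacement under a normal point load P on an
    % elastic half-space (Young's modulus E, Poisson's ratio \nu), at radius r:
    \[
      u_z(r) \;=\; \frac{P\,(1-\nu^{2})}{\pi E\, r},
    \]
    % so the displacement directly under an ideal point force (r -> 0) diverges,
    % and with it the driving point receptance computed from such a force.
    ```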

  9. On Multi-Point Liouville Field Theory

    International Nuclear Information System (INIS)

    Zarrinkamar, S.; Rajabi, A. A.; Hassanabadi, H.

    2013-01-01

    In many cases, the classical or semi-classical Liouville field theory appears in the form of Fuchsian or Riemann differential equations whose solutions cannot be simply found, or at least require a comprehensive knowledge of the analytical techniques of differential equations of mathematical physics. Here, instead of other cumbersome methodologies such as working with the Heun functions, we use the quasi-exact ansatz approach and thereby solve the resulting so-called two- and three-point differential equations in a very simple manner. We apply the approach to two recent papers in the field. (author)

  10. A smooth landscape: ending saddle point inflation requires features to be shallow

    International Nuclear Information System (INIS)

    Battefeld, Diana; Battefeld, Thorsten

    2013-01-01

    We consider inflation driven near a saddle point in a higher dimensional field space, which is the most likely type of slow roll inflation on the string theoretical landscape; anthropic arguments need to be invoked in order to find a sufficiently flat region. To give all inflatons large masses after inflation and yield a small but positive cosmological constant, the trajectory in field space needs to terminate in a hole on the inflationary plateau, introducing a curved end-of-inflation hypersurface. We compute non-Gaussianities (bi- and tri-spectrum) caused by this curved hyper-surface and find a negative, potentially large, local non-linearity parameter. To be consistent with current observational bounds, the hole needs to be shallow, i.e. considerably wider than deep in natural units. To avoid singling out our vacuum as special (i.e. more special than a positive cosmological constant entails), we deduce that all features on field space should be similarly shallow, severely limiting the type of landscapes one may use for inflationary model building. We justify the use of a truncated Fourier series with random coefficients, which are suppressed the higher the frequency, to model such a smooth landscape by a random potential, as is often done in the literature without a good a priori reason

  11. 77 FR 9522 - Requirements for Consumer Registration of Durable Infant or Toddler Products

    Science.gov (United States)

    2012-02-17

    ... type of matrix barcode that allow[s] storage of information, including links that direct consumers to a... Required Font Size (Sec. 1130.6(b)(2)) As originally published, Sec. 1130.6(c) requires that registration forms use 12-point and 10-point type. Manufacturers and testing labs reported confusion concerning the...

  12. Engineering management at feasibility study stage of nuclear power plant under EPC mode

    International Nuclear Information System (INIS)

    Wang Zhiqiang

    2015-01-01

    After the investment reform by the State Council in 2004, NDRC carries out an approval system for enterprises investing in nuclear power plants. The feasibility study stage is a critical stage on the mainline of nuclear power project approval, which intersects with the license application and engineering design. The owners of nuclear power plants face stringent requirements in engineering management. From the owners' management point of view under the EPC mode, this paper sorts out the preliminary project process for nuclear power plants, focusing on management in the feasibility study stage. License application and engineering design management in the feasibility study stage are also discussed. (author)

  13. On the sound field requirements in the hearing protector standard ISO 4869-1

    DEFF Research Database (Denmark)

    Jensen, N. S.; Poulsen, Torben

    1999-01-01

    The sound field requirements in the ISO 4869-1 standard for hearing protector attenuation measurements comprise two parts: 1) a sound level difference requirement for positions around the head of the listener (i.e. at positions 15 cm from a reference point: up-down, front-back and left-right) and 2) a directivity requirement for the sound incidence at the reference point, measured with a directional microphone, to ensure an approximately diffuse sound field. The level difference requirement (1) is not difficult to fulfil but the directivity requirement (2) may lead to contradicting results if the measurement...

  14. Fixed point structure of quenched, planar quantum electrodynamics

    International Nuclear Information System (INIS)

    Love, S.T.

    1986-07-01

    Gauge theories exhibiting a hierarchy of fermion mass scales may contain a pseudo-Nambu-Goldstone boson of spontaneously broken scale invariance. The relation between scale and chiral symmetry breaking is studied analytically in quenched, planar quantum electrodynamics in four dimensions. The model possesses a novel nonperturbative ultraviolet fixed point governing its strong coupling phase which requires the mixing of four fermion operators. 12 refs

  15. Design of point-of-care (POC) microfluidic medical diagnostic devices

    Science.gov (United States)

    Leary, James F.

    2018-02-01

    Design of inexpensive and portable hand-held microfluidic flow/image cytometry devices for initial medical diagnostics at the point of first patient contact by emergency medical personnel in the field requires careful attention to power and weight, to allow for realistic portability as a hand-held, point-of-care medical diagnostics device. True portability also requires small micro-pumps for high-throughput capability. Weight and power requirements dictate the use of super-bright LEDs and very small silicon photodiodes or nanophotonic sensors that can be powered by batteries. Signal-to-noise characteristics can be greatly improved by appropriately pulsing the LED excitation sources and sampling and subtracting noise in between excitation pulses. The requirements for basic computing, imaging, GPS and basic telecommunications can be simultaneously met by use of smartphone technologies, which become part of the overall device. Software for a user-interface system, limited real-time computing, real-time imaging, and offline data analysis can be accomplished through multi-platform software development systems that are well suited to a variety of currently available cellphone technologies which already contain all of these capabilities. Microfluidic cytometry requires judicious use of small sample volumes and appropriate statistical sampling by microfluidic cytometry or imaging to achieve the statistical significance needed for real-time medical decisions for patients at the physician's office or real-time decision making in the field. One or two drops of blood obtained by pin-prick should be able to provide statistically meaningful results for use in making real-time medical decisions without the need for blood fractionation, which is not realistic in the field.

  16. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
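
    The core of the baseline comparison is simple enough to sketch: given matching feature points identified in both epochs, compare the lengths of the segments joining every pair; no registration of the two scans is needed. The data layout below is an assumption made for illustration.

    ```python
    # Compare baseline (point-pair) lengths between two epochs without registering the scans.
    import numpy as np
    from itertools import combinations

    def baseline_changes(points_epoch1, points_epoch2, ids):
        """points_*: dict id -> (x, y, z) in each scan's own frame. Returns length change per baseline."""
        changes = {}
        for a, b in combinations(ids, 2):
            l1 = np.linalg.norm(np.subtract(points_epoch1[a], points_epoch1[b]))
            l2 = np.linalg.norm(np.subtract(points_epoch2[a], points_epoch2[b]))
            changes[(a, b)] = l2 - l1      # positive: the two points moved apart between epochs
        return changes
    ```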

  17. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    Directory of Open Access Journals (Sweden)

    Yueqian Shen

    2016-12-01

    Full Text Available A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.

  18. Managing distance and covariate information with point-based clustering

    Directory of Open Access Journals (Sweden)

    Peter A. Whigham

    2016-09-01

    Full Text Available Abstract. Background: Geographic perspectives of disease and the human condition often involve point-based observations and questions of clustering or dispersion within a spatial context. These problems involve a finite set of point observations and are constrained by a larger, but finite, set of locations where the observations could occur. Developing a rigorous method for pattern analysis in this context requires handling spatial covariates, a method for constrained finite spatial clustering, and addressing bias in geographic distance measures. An approach, based on Ripley’s K and applied to the problem of clustering with deliberate self-harm (DSH), is presented. Methods: Point-based Monte-Carlo simulation of Ripley’s K, accounting for socio-economic deprivation and sources of distance measurement bias, was developed to estimate clustering of DSH at a range of spatial scales. A rotated Minkowski L1 distance metric allowed variation in physical distance and clustering to be assessed. Self-harm data was derived from an audit of 2 years’ emergency hospital presentations (n = 136) in a New Zealand town (population ~50,000). The study area was defined by residential (housing) land parcels representing a finite set of possible point addresses. Results: Area-based deprivation was spatially correlated. Accounting for deprivation and distance bias showed evidence for clustering of DSH for spatial scales up to 500 m with a one-sided 95 % CI, suggesting that social contagion may be present for this urban cohort. Conclusions: Many problems involve finite locations in geographic space that require estimates of distance-based clustering at many scales. A Monte-Carlo approach to Ripley’s K, incorporating covariates and models for distance bias, is crucial when assessing health-related clustering. The case study showed that social network structure defined at the neighbourhood level may account for aspects of neighbourhood clustering of DSH. Accounting for
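
    A stripped-down Monte-Carlo envelope for Ripley's K over a finite set of candidate addresses, shown below, illustrates the general approach; the covariate weighting, edge correction and rotated Minkowski L1 metric described above are omitted from this sketch.

    ```python
    # Unnormalised Ripley's K with a constrained Monte-Carlo envelope drawn from candidate locations.
    import numpy as np

    def ripley_k(points, radii):
        """points: (N, 2). Returns, per radius, the mean number of other points within that radius."""
        n = len(points)
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        return np.array([(d < r).sum() / n for r in radii])   # no area or edge-correction terms

    def k_envelope(case_xy, candidate_xy, radii, n_sim=99, seed=0):
        """Compare observed K with K of cases resampled from the finite set of candidate addresses."""
        rng = np.random.default_rng(seed)
        k_obs = ripley_k(case_xy, radii)
        sims = np.array([
            ripley_k(candidate_xy[rng.choice(len(candidate_xy), len(case_xy), replace=False)], radii)
            for _ in range(n_sim)
        ])
        return k_obs, np.percentile(sims, 2.5, axis=0), np.percentile(sims, 97.5, axis=0)
    ```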

  19. Applicability of bacterial cellulose as an alternative to paper points in endodontic treatment.

    Science.gov (United States)

    Yoshino, Aya; Tabuchi, Mari; Uo, Motohiro; Tatsumi, Hiroto; Hideshima, Katsumi; Kondo, Seiji; Sekine, Joji

    2013-04-01

    Dental root canal treatment is required when dental caries progress to infection of the dental pulp. A major goal of this treatment is to provide complete decontamination of the dental root canal system. However, the morphology of dental root canal systems is complex, and many human dental roots have inaccessible areas. In addition, dental reinfection is fairly common. In conventional treatment, a cotton pellet and paper point made from plant cellulose is used to dry and sterilize the dental root canal. Such sterilization requires a treatment material with high absorbency to remove any residue, the ability to improve the efficacy of intracanal medication and high biocompatibility. Bacterial cellulose (BC) is produced by certain strains of bacteria. In this study, we developed BC in a pointed form and evaluated its applicability as a novel material for dental canal treatment with regard to solution absorption, expansion, tensile strength, drug release and biocompatibility. We found that BC has excellent material and biological characteristics compared with conventional materials, such as paper points (plant cellulose). BC showed noticeably higher absorption and expansion than paper points, and maintained a high tensile strength even when wet. The cumulative release of a model drug was significantly greater from BC than from paper points, and BC showed greater compatibility than paper points. Taken together, BC has great potential for use in dental root canal treatment. Copyright © 2012 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  20. 40 CFR 1054.625 - What requirements apply under the Transition Program for Equipment Manufacturers?

    Science.gov (United States)

    2010-07-01

    ... manufacturers to produce equipment with Class II engines that are subject to less stringent exhaust emission... equipment with Class II engines exempted under this section. You may use the exemptions in this section only... by determining your U.S.-directed production volume of equipment with Class II engines from January 1...

  1. Study on Huizhou architecture of point cloud registration based on optimized ICP algorithm

    Science.gov (United States)

    Zhang, Runmei; Wu, Yulu; Zhang, Guangbin; Zhou, Wei; Tao, Yuqian

    2018-03-01

    In view of the fact that current point cloud registration software has high hardware requirements, a heavy workload and multiple interactive definitions, and that the source code of software with better processing results is not open, a two-step registration method based on a normal vector distribution feature and a coarse-feature-based iterative closest point (ICP) algorithm is proposed in this paper. This method combines the fast point feature histogram (FPFH) algorithm, defines the adjacency region of the point cloud and the calculation model of the distribution of normal vectors, sets up a local coordinate system for each key point, and obtains the transformation matrix to finish rough registration; the rough registration results of the two stations are then accurately registered using the ICP algorithm. Experimental results show that, compared with the traditional ICP algorithm, the method used in this paper has obvious time and precision advantages for large point clouds.
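
    The fine-registration stage can be illustrated with a generic rigid ICP loop (nearest neighbours plus an SVD-based transform); the FPFH/normal-vector coarse alignment described above is not reproduced in this sketch.

    ```python
    # Generic rigid ICP refinement: nearest-neighbour correspondences + Kabsch (SVD) transform.
    import numpy as np
    from scipy.spatial import cKDTree

    def icp(source, target, iterations=30):
        """Refine alignment of `source` (N,3) onto `target` (M,3); returns R, t and the aligned source."""
        src = np.asarray(source, dtype=float).copy()
        target = np.asarray(target, dtype=float)
        tree = cKDTree(target)
        R_total, t_total = np.eye(3), np.zeros(3)
        for _ in range(iterations):
            _, idx = tree.query(src)                 # closest target point for each source point
            corr = target[idx]
            mu_s, mu_t = src.mean(axis=0), corr.mean(axis=0)
            H = (src - mu_s).T @ (corr - mu_t)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:                 # enforce a proper rotation (no reflection)
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = mu_t - R @ mu_s
            src = src @ R.T + t
            R_total, t_total = R @ R_total, R @ t_total + t
        return R_total, t_total, src
    ```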

  2. The Effect of Honors Courses on Grade Point Averages

    Science.gov (United States)

    Spisak, Art L.; Squires, Suzanne Carter

    2016-01-01

    High-ability entering college students give three main reasons for not choosing to become part of honors programs and colleges; they and/or their parents believe that honors classes at the university level require more work than non-honors courses, are more stressful, and will adversely affect their self-image and grade point average (GPA) (Hill;…

  3. Testing of the BepiColombo Antenna Pointing Mechanism

    Science.gov (United States)

    Campo, Pablo; Barrio, Aingeru; Martin, Fernando

    2015-09-01

    BepiColombo is an ESA mission to Mercury; its planetary orbiter (MPO) has two antenna pointing mechanisms. The high gain antenna (HGA) pointing mechanism steers and points a large reflector which is integrated at system level by TAS-I Rome. The medium gain antenna (MGA) APM points a 1.5 m boom with a horn antenna. Both radiating elements are exposed to sun fluxes as high as 10 solar constants without protections. A previous paper [1] described the design and development process to solve the challenges of performing in this harsh environment. The current paper is focused on the testing process of the qualification units. Testing the performance of the antenna pointing mechanisms in their specific environmental conditions has required special set-ups and techniques. The process has provided valuable feedback on the design and the testing methods, which has been included in the PFM design and tests. Some of the technologies and components were developed on dedicated items prior to the EQM, but once integrated, test behaviour showed relevant differences. Some of the major concerns for the APM testing are: creating, during thermal vacuum testing, the qualification temperature map with gradients along the APM, from 200 °C to 70 °C; testing in those conditions the radio frequency and pointing performances, also adding high RF power to check the power handling and self-heating of the rotary joint; testing in life up to 12000 equivalent APM revolutions, that is 14.3 million motor revolutions, in different thermal conditions; measuring low thermal distortion of the mechanical chain while at the same time being insulated from the external environment and interfaces (55 arcsec pointing error); performing deployment of large items while guaranteeing low humidity, below 5%, to protect the dry lubrication; and verifying stability with a representative inertia of a large boom or reflector (20 kg·m2).

  4. The resilience of an operating point for a fusion power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ward, David, E-mail: david.ward@ccfe.ac.uk; Kemp, Richard

    2015-10-15

    Highlights: • The need to control a power plant changes our view of the optimum design. • The need for control can be reduced by finding resilient design points. • It is important to include resilience and control in selecting design points. • Including these additional constraints reduces flexibility in choice of operating points. - Abstract: The operating point for fusion power plant design concepts is often determined by simultaneously satisfying the requirements of all of the main plant systems and finding an optimum solution, for instance the one with the lowest capital cost or cost of electricity. This static assessment takes no account of the sensitivity of that operating point to variations in key parameters and therefore includes no information about how difficult to adjust and control the chosen operating point may be. Control of the operation point is a large subject with much work still to be done, and is expected to play an increasing role in the future in choosing the optimum design point. Here we present results of two analyses: one relates to the ability to load follow, that is, to vary the power production in the light of varying demands for power from the electricity network; the other investigates in simple terms what choices we can make to improve the resilience of static operating points.

  5. 3-D OBJECT RECOGNITION FROM POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    W. Smith

    2012-09-01

    Full Text Available The market for real-time 3-D mapping includes not only traditional geospatial applications but also navigation of unmanned autonomous vehicles (UAVs). Massively parallel processes such as graphics processing unit (GPU) computing make real-time 3-D object recognition and mapping achievable. Geospatial technologies such as digital photogrammetry and GIS offer advanced capabilities to produce 2-D and 3-D static maps using UAV data. The goal is to develop real-time UAV navigation through increased automation. It is challenging for a computer to identify a 3-D object such as a car, a tree or a house, yet automatic 3-D object recognition is essential to increasing the productivity of geospatial data such as 3-D city site models. In the past three decades, researchers have used radiometric properties to identify objects in digital imagery with limited success, because these properties vary considerably from image to image. Consequently, our team has developed software that recognizes certain types of 3-D objects within 3-D point clouds. Although our software is developed for modeling, simulation and visualization, it has the potential to be valuable in robotics and UAV applications. The locations and shapes of 3-D objects such as buildings and trees are easily recognizable by a human from a brief glance at a representation of a point cloud such as terrain-shaded relief. The algorithms to extract these objects have been developed and require only the point cloud and minimal human inputs such as a set of limits on building size and a request to turn on a squaring option. The algorithms use both digital surface model (DSM) and digital elevation model (DEM), so software has also been developed to derive the latter from the former. The process continues through the following steps: identify and group 3-D object points into regions; separate buildings and houses from trees; trace region boundaries; regularize and simplify boundary polygons; construct complex roofs.

  6. 3-D Object Recognition from Point Cloud Data

    Science.gov (United States)

    Smith, W.; Walker, A. S.; Zhang, B.

    2011-09-01

    The market for real-time 3-D mapping includes not only traditional geospatial applications but also navigation of unmanned autonomous vehicles (UAVs). Massively parallel processes such as graphics processing unit (GPU) computing make real-time 3-D object recognition and mapping achievable. Geospatial technologies such as digital photogrammetry and GIS offer advanced capabilities to produce 2-D and 3-D static maps using UAV data. The goal is to develop real-time UAV navigation through increased automation. It is challenging for a computer to identify a 3-D object such as a car, a tree or a house, yet automatic 3-D object recognition is essential to increasing the productivity of geospatial data such as 3-D city site models. In the past three decades, researchers have used radiometric properties to identify objects in digital imagery with limited success, because these properties vary considerably from image to image. Consequently, our team has developed software that recognizes certain types of 3-D objects within 3-D point clouds. Although our software is developed for modeling, simulation and visualization, it has the potential to be valuable in robotics and UAV applications. The locations and shapes of 3-D objects such as buildings and trees are easily recognizable by a human from a brief glance at a representation of a point cloud such as terrain-shaded relief. The algorithms to extract these objects have been developed and require only the point cloud and minimal human inputs such as a set of limits on building size and a request to turn on a squaring option. The algorithms use both digital surface model (DSM) and digital elevation model (DEM), so software has also been developed to derive the latter from the former. The process continues through the following steps: identify and group 3-D object points into regions; separate buildings and houses from trees; trace region boundaries; regularize and simplify boundary polygons; construct complex roofs. Several case

  7. Clinical evaluation of the FreeStyle Precision Pro system.

    Science.gov (United States)

    Brazg, Ronald; Hughes, Kristen; Martin, Pamela; Coard, Julie; Toffaletti, John; McDonnell, Elizabeth; Taylor, Elizabeth; Farrell, Lausanne; Patel, Mona; Ward, Jeanne; Chen, Ting; Alva, Shridhara; Ng, Ronald

    2013-06-05

    A new version of the international standard (ISO 15197) and the CLSI guideline (POCT12), with more stringent accuracy criteria, are near publication. We evaluated the glucose test performance of the FreeStyle Precision Pro system, a new blood glucose monitoring system (BGMS) designed to enhance accuracy for point-of-care testing (POCT). Precision, interference and system accuracy with 503 blood samples from capillary, venous and arterial sources were evaluated in a multicenter study. Study results were analyzed and presented in accordance with the specifications and recommendations of the final draft ISO 15197 and the new POCT12. The FreeStyle Precision Pro system demonstrated acceptable precision and met the tighter accuracy requirements, providing a means for enhancing accuracy for point-of-care blood glucose monitoring. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. 40 CFR 425.06 - Monitoring requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Monitoring requirements. 425.06 Section 425.06 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS LEATHER TANNING AND FINISHING POINT SOURCE CATEGORY General Provisions § 425.06...

  9. Controllable resonant tunnelling through single-point potentials: A point triode

    International Nuclear Information System (INIS)

    Zolotaryuk, A.V.; Zolotaryuk, Yaroslav

    2015-01-01

    A zero-thickness limit of three-layer heterostructures under two bias voltages applied externally, where one of which is supposed to be a gate parameter, is studied. As a result, an effect of controllable resonant tunnelling of electrons through single-point potentials is shown to exist. Therefore the limiting structure may be termed a “point triode” and considered in the theory of point interactions as a new object. The simple limiting analytical expressions adequately describe the resonant behaviour in the transistor with realistic parameter values and thus one can conclude that the zero-range limit of multi-layer structures may be used in fabricating nanodevices. The difference between the resonant tunnelling across single-point potentials and the Fabry–Pérot interference effect is also emphasized. - Highlights: • The zero-thickness limit of three-layer heterostructures is described in terms of point interactions. • The effect of resonant tunnelling through these single-point potentials is established. • The resonant tunnelling is shown to be controlled by a gate voltage

  10. Point defect engineering strategies to retard phosphorous diffusion in germanium

    KAUST Repository

    Tahini, H. A.; Chroneos, Alexander I.; Grimes, Robin W.; Schwingenschlö gl, Udo; Bracht, Hartmut A.

    2013-01-01

    The diffusion of phosphorous in germanium is very fast, requiring point defect engineering strategies to retard it in support of technological application. Density functional theory calculations, corroborated with hybrid density functional calculations, are used to investigate the influence of the isovalent codopants tin and hafnium on the migration of phosphorous via the vacancy-mediated diffusion process. The migration energy barriers for phosphorous are increased significantly in the presence of oversized isovalent codopants. Therefore, it is proposed that tin and in particular hafnium codoping are efficient point defect engineering strategies to retard phosphorous migration. © the Owner Societies 2013.

  11. Tipping Point

    Medline Plus

    Full Text Available ... 24 hours a day. For young children whose home is a playground, it’s the best way to ...

  12. Extreme values, regular variation and point processes

    CERN Document Server

    Resnick, Sidney I

    1987-01-01

    Extreme Values, Regular Variation and Point Processes is a readable and efficient account of the fundamental mathematical and stochastic process techniques needed to study the behavior of extreme values of phenomena based on independent and identically distributed random variables and vectors. It presents a coherent treatment of the distributional and sample path fundamental properties of extremes and records. It emphasizes the core primacy of three topics necessary for understanding extremes: the analytical theory of regularly varying functions; the probabilistic theory of point processes and random measures; and the link to asymptotic distribution approximations provided by the theory of weak convergence of probability measures in metric spaces. The book is self-contained and requires an introductory measure-theoretic course in probability as a prerequisite. Almost all sections have an extensive list of exercises which extend developments in the text, offer alternate approaches, test mastery and provide for enjoyment.

  13. Point-of-Purchase Food Marketing and Policy Solutions

    OpenAIRE

    Soo, Jackie

    2016-01-01

    Background: Food marketing has been implicated as a driver of obesity. However, few studies have examined point-of-purchase marketing in supermarkets and restaurants, or marketing in lower-income countries. Furthermore, policy solutions to counteract marketing and provide consumers with objective nutritional information require evidence of efficacy. Paper 1. We documented child-oriented marketing practices, product claims, and health-evoking images on 106 cereals sold in Guatemala City, Gu...

  14. Interesting Interest Points

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Dahl, Anders Lindbjerg; Pedersen, Kim Steenstrup

    2012-01-01

    Not all interest points are equally interesting. The most valuable interest points lead to optimal performance of the computer vision method in which they are employed. But a measure of this kind will be dependent on the chosen vision application. We propose a more general performance measure based on spatial invariance of interest points under changing acquisition parameters by measuring the spatial recall rate. The scope of this paper is to investigate the performance of a number of existing well-established interest point detection methods. Automatic performance evaluation of interest points is hard ... position. The LED illumination provides the option for artificially relighting the scene from a range of light directions. This data set has given us the ability to systematically evaluate the performance of a number of interest point detectors. The highlights of the conclusions are that the fixed scale ...

  15. Material-Point-Method Analysis of Collapsing Slopes

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    To understand the dynamic evolution of landslides and predict their physical extent, a computational model is required that is capable of analysing complex material behaviour as well as large strains and deformations. Here, a model is presented based on the so-called generalised-interpolation material-point method. ... A deformed material description is introduced, based on time integration of the deformation gradient and utilising Gauss quadrature over the volume associated with each material point. The method has been implemented in a Fortran code and employed for the analysis of a landslide that took place during ...

  16. Point to point multispectral light projection applied to cultural heritage

    Science.gov (United States)

    Vázquez, D.; Alvarez, A.; Canabal, H.; Garcia, A.; Mayorga, S.; Muro, C.; Galan, T.

    2017-09-01

    The use of new light sources based on LED technology should allow the development of systems that combine conservation and exhibition requirements and allow these art goods to be made available to the next generations according to sustainability principles. The goal of this work is to develop light systems and sources with an optimized spectral distribution for each specific point of the art piece. This optimization process implies maximizing the color fidelity of the reproduction and at the same time minimizing the photochemical damage. Perceived color under these sources will be similar (metameric) to that required by the restoration team in charge of the conservation and exhibition of the goods of art. Depending on the fragility of the exposed art objects (i.e. the spectral responsivity of the material), the irradiance must be kept under a critical level. Therefore, it is necessary to develop a mathematical model that simulates with enough accuracy both the visual effect of the illumination and the photochemical impact of the radiation. [Figure: spectral reflectance of a reference painting.] The mathematical model is based on a merit function that optimizes the individual intensities of the LED light sources, taking into account the damage function of the material and the color space coordinates. Moreover, the algorithm uses weights for damage and color fidelity in order to adapt the model to a specific museal application. In this work we show a sample of this technology applied to a picture by Sorolla (1863-1923), an important Spanish painter, titled "Woman walking at the beach".
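
    The merit-function idea can be sketched as a small constrained least-squares problem: choose non-negative LED channel intensities that approximate a target spectrum while penalising damage-weighted irradiance. The weighting scheme below is illustrative only, not the authors' merit function.

    ```python
    # Non-negative least squares over LED channel intensities with a damage-weighted penalty.
    import numpy as np
    from scipy.optimize import nnls

    def led_weights(led_spectra, target_spectrum, damage_function, damage_weight=0.5):
        """led_spectra: (n_wavelengths, n_leds); returns one non-negative intensity per LED channel."""
        # Penalise each LED in proportion to the photochemically damaging radiation it emits.
        penalty = damage_weight * (damage_function[:, None] * led_spectra).sum(axis=0)
        A = np.vstack([led_spectra, np.diag(penalty)])       # spectral match rows + penalty rows
        b = np.concatenate([target_spectrum, np.zeros(led_spectra.shape[1])])
        weights, _ = nnls(A, b)
        return weights
    ```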

  17. Photovoltaics at Point Pelee Park

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    Case study of an Ontario Hydro-installed photovoltaic system at Point Pelee Park, a bird sanctuary located on Lake Erie, is described. The system consists of a 1080 W photovoltaic array used to supply electricity to one of the washrooms. The cost for installing the system was $30,000 which was considerably cheaper than the $100,000 estimate for an underground power line. The independent system is the only source of energy for the washroom, therefore it was necessary to reduce the total electrical demand required by the facility. Electricity was used for the water pump, chlorinator and lighting. Motion sensors were installed to further reduce electrical demand. Washroom heaters were converted to propane. 2 figs.

  18. High precision target center determination from a point cloud

    Directory of Open Access Journals (Sweden)

    K. Kregar

    2013-10-01

    Full Text Available Many applications of terrestrial laser scanners (TLS) require the determination of a specific point from a point cloud. In this paper, a procedure for high-precision planar target center acquisition from a point cloud is presented. The process is based on an image matching algorithm, but before we can deal with a raster image to fit a target on it, we need to properly determine the best-fitting plane and project the points onto it. The main emphasis of this paper is on the precision estimation and its propagation through the whole procedure, which allows us to obtain a precision assessment of the final results (target center coordinates). The theoretic precision estimates obtained through the procedure were rather high, so we compared them with the empiric precision estimates obtained as standard deviations of the results of 60 independently scanned targets. A χ2-test confirmed that the theoretic precisions are overestimated. The problem most probably lies in the overestimated precisions of the plane parameters due to the vast redundancy of points. However, the empirical precisions also confirmed that the proposed procedure can ensure a submillimeter precision level. The algorithm can automatically detect grossly erroneous results to some extent. It can operate when the incidence angles of the laser beam are as high as 80°, which is a desirable property if one is going to use planar targets as tie points in scan registration. The proposed algorithm will also contribute to improving TLS calibration procedures.
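
    The plane-fitting and projection step mentioned above can be sketched with a standard least-squares plane fit via SVD, assuming the target points are already segmented from the cloud; this is a generic technique, not the authors' full adjustment with propagated precisions.

      # Fit a least-squares plane to 3-D points and project them onto it.
      import numpy as np

      def fit_plane(points):
          """Return (centroid, unit normal) of the best-fitting plane."""
          centroid = points.mean(axis=0)
          # The right singular vector with the smallest singular value is the normal.
          _, _, vt = np.linalg.svd(points - centroid)
          normal = vt[-1]
          return centroid, normal / np.linalg.norm(normal)

      def project_onto_plane(points, centroid, normal):
          """Orthogonally project points onto the fitted plane."""
          d = (points - centroid) @ normal
          return points - np.outer(d, normal)

      rng = np.random.default_rng(0)
      pts = rng.normal(size=(500, 3)) * [10.0, 10.0, 0.05]   # roughly planar cloud
      c, n = fit_plane(pts)
      flat = project_onto_plane(pts, c, n)
      print("max residual after projection:", np.abs((flat - c) @ n).max())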

  19. Zero-point oscillations, zero-point fluctuations, and fluctuations of zero-point oscillations

    International Nuclear Information System (INIS)

    Khalili, Farit Ya

    2003-01-01

    Several physical effects and methodological issues relating to the ground state of an oscillator are considered. Even in the simplest case of an ideal lossless harmonic oscillator, its ground state exhibits properties that are unusual from the classical point of view. In particular, the mean value of the product of two non-negative observables, kinetic and potential energies, is negative in the ground state. It is shown that semiclassical and rigorous quantum approaches yield substantially different results for the ground state energy fluctuations of an oscillator with finite losses. The dependence of zero-point fluctuations on the boundary conditions is considered. Using this dependence, it is possible to transmit information without emitting electromagnetic quanta. Fluctuations of electromagnetic pressure of zero-point oscillations are analyzed, and the corresponding mechanical friction is considered. This friction can be viewed as the most fundamental mechanism limiting the quality factor of mechanical oscillators. Observation of these effects exceeds the possibilities of contemporary experimental physics but almost undoubtedly will be possible in the near future. (methodological notes)

  20. Point-point and point-line moving-window correlation spectroscopy and its applications

    Science.gov (United States)

    Zhou, Qun; Sun, Suqin; Zhan, Daqi; Yu, Zhiwu

    2008-07-01

    In this paper, we present a new extension of generalized two-dimensional (2D) correlation spectroscopy. Two new algorithms, namely point-point (P-P) correlation and point-line (P-L) correlation, have been introduced to do the moving-window 2D correlation (MW2D) analysis. The new method has been applied to a spectral model consisting of two different processes. The results indicate that P-P correlation spectroscopy can unveil the details and re-constitute the entire process, whilst the P-L correlation can provide the general features of the processes concerned. Phase transition behavior of dimyristoylphosphotidylethanolamine (DMPE) has been studied using MW2D correlation spectroscopy. The newly proposed method verifies that the phase transition temperature is 56 °C, the same as the result obtained from a differential scanning calorimeter. To illustrate the new method further, a lysine and lactose mixture has been studied under thermal perturbation. Using the P-P MW2D, the Maillard reaction of the mixture was clearly monitored, which is very difficult using a conventional display of FTIR spectra.
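
    The general form of a point-point moving-window correlation can be sketched as follows. The synthetic bands, the sigmoidal transition and the window handling are illustrative only; the sketch shows mean-centring within each window and correlating the dynamic intensities at two chosen spectral points, not the exact formulation of the paper.

      # Point-point moving-window 2D correlation between two spectral positions
      # (synthetic data; simplified illustration of the general scheme).
      import numpy as np

      temps = np.linspace(30, 80, 51)                 # perturbation variable (e.g. temperature)
      # Synthetic intensities at two wavenumber points changing around 56 degrees.
      band1 = 1.0 / (1.0 + np.exp(-(temps - 56.0)))   # sigmoidal increase
      band2 = 1.0 - band1                             # complementary decrease

      def pp_mw2d(y1, y2, window=7):
          """Synchronous point-point correlation within a sliding window."""
          half = window // 2
          out = np.full(y1.size, np.nan)
          for i in range(half, y1.size - half):
              w1 = y1[i - half:i + half + 1]
              w2 = y2[i - half:i + half + 1]
              d1, d2 = w1 - w1.mean(), w2 - w2.mean()  # dynamic (mean-centred) spectra
              out[i] = np.dot(d1, d2) / (window - 1)
          return out

      corr = pp_mw2d(band1, band2)
      print("strongest (anti)correlation near:", temps[np.nanargmax(np.abs(corr))])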

  1. Tipping Point

    Medline Plus

    Full Text Available The Tipping Point, by CPSC Blogger, September 22, 2009 (a video from the CPSC "60 Seconds of Safety" series).

  2. Mission Concept for the Single Aperture Far-Infrared (SAFIR) Observatory

    Science.gov (United States)

    Benford, Dominic J.; Amato, Michael J.; Mather, John C.; Moseley, S. Harvey, Jr.

    2004-01-01

    We have developed a preliminary but comprehensive mission concept for SAFIR, as a 10 m-class far-infrared and submillimeter observatory that would begin development later in this decade to meet the needs outlined above. Its cold operating temperature would enable extremely sensitive observations at wavelengths of roughly 40 microns and longer. This would provide a point source sensitivity improvement of several orders of magnitude over that of the Spitzer Space Telescope (previously SIRTF) or the Herschel Space Observatory. Additionally, it would have an angular resolution 12 times finer than that of Spitzer and three times finer than Herschel. This sensitivity and angular resolution are necessary to perform imaging and spectroscopic studies of individual galaxies in the early universe. We have considered many aspects of the SAFIR mission, including the telescope technology (optical design, materials, and packaging), detector needs and technologies, cooling method and required technology developments, attitude and pointing, power systems, launch vehicle, and mission operations. The most challenging requirements for this mission are the operating temperature and aperture size of the telescope, and the development of detector arrays. SAFIR can take advantage of much of the technology under development for JWST, but with much less stringent requirements on optical accuracy.

  3. A Limit on the Warm Dark Matter Particle Mass from the Redshifted 21 cm Absorption Line

    Science.gov (United States)

    Safarzadeh, Mohammadtaher; Scannapieco, Evan; Babul, Arif

    2018-06-01

    The recent Experiment to Detect the Global Epoch of Reionization Signature (EDGES) collaboration detection of an absorption signal at a central frequency of ν = 78 ± 1 MHz points to the presence of a significant Lyα background by a redshift of z = 18. The timing of this signal constrains the dark matter particle mass (m_χ) in the warm dark matter (WDM) cosmological model. WDM delays the formation of small-scale structures, and therefore a stringent lower limit can be placed on m_χ based on the presence of a sufficiently strong Lyα background due to star formation at z = 18. Our results show that coupling the spin temperature to the gas through Lyα pumping requires a minimum mass of m_χ > 3 keV if atomic cooling halos dominate the star formation rate at z = 18, and m_χ > 2 keV if H_2 cooling halos also form stars efficiently at this redshift. These limits match or exceed the most stringent limits cited to date in the literature, even in the face of the many uncertainties regarding star formation at high redshift.

  4. Experimental study of precisely selected evaporation chains in the decay of excited 25Mg

    Science.gov (United States)

    Camaiani, A.; Casini, G.; Morelli, L.; Barlini, S.; Piantelli, S.; Baiocco, G.; Bini, M.; Bruno, M.; Buccola, A.; Cinausero, M.; Cicerchia, M.; D'Agostino, M.; Degelier, M.; Fabris, D.; Frosin, C.; Gramegna, F.; Gulminelli, F.; Mantovani, G.; Marchi, T.; Olmi, A.; Ottanelli, P.; Pasquali, G.; Pastore, G.; Valdré, S.; Verde, G.

    2018-04-01

    The reaction 12C+13C at 95 MeV bombarding energy is studied using the Garfield + Ring Counter apparatus located at the INFN Laboratori Nazionali di Legnaro. In this paper we want to investigate the de-excitation of 25Mg, aiming both at a new stringent test of the statistical description of nuclear decay and at a direct comparison with the decay of the system 24Mg formed through 12C+12C reactions previously studied. Thanks to the large acceptance of the detector and to its good fragment identification capabilities, we could apply stringent selections on fusion-evaporation events, requiring their completeness in charge. The main decay features of the evaporation residues and of the emitted light particles are overall well described by a pure statistical model; however, as for the case of the previously studied 24Mg, we observed some deviations in the branching ratios, in particular for those chains involving only the evaporation of α particles. From this point of view the behavior of the 24Mg and 25Mg decay cases appears to be rather similar. An attempt to obtain a full mass balance even without neutron detection is also discussed.

  5. Role of the pharmacist in delivering point-of-care therapy for ...

    African Journals Online (AJOL)

    The wide variation in biological effect, narrow therapeutic range and pharmacokinetic and pharmacodynamic characteristics of warfarin require monitoring of the international normalised ratio (INR). Point-of-care results that are readily accessible for interpretation allow the pharmacist to make dose adjustments ...

  6. Projecting India's energy requirements for policy formulation

    International Nuclear Information System (INIS)

    Parikh, Kirit S.; Karandikar, Vivek; Rana, Ashish; Dani, Prasanna

    2009-01-01

    Energy policy has to have a long-term perspective. To formulate it, one needs to know the contours of energy requirements and options. Different approaches have been followed in the literature, each with their own problems. A top-down econometric approach provides little guidance on policies, while a bottom-up approach requires too much knowledge and too many assumptions. Using a top-down econometric approach for aggregate overall benchmarking and a detailed activity analysis model, the Integrated Energy System Model, for a few large sectors provides a unique combination for easing the difficulties of policy formulation. The model is described in this paper. Eleven alternate scenarios are built, designed to map out extreme points of feasible options. Results show that even after employing all domestic energy resources to their full potential, there will be a continued rise of fossil fuel use, continued importance of coal, and continued rise of import dependence. Energy efficiency emerges as a major option with a potential to reduce energy requirement by as much as 17%. Scenario results point towards pushing for development of alternative sources. (author)

  7. Fixed Points

    Indian Academy of Sciences (India)

    Fixed Points - From Russia with Love - A Primer of Fixed Point Theory. A. K. Vijaykumar. Book Review. Resonance – Journal of Science Education, Volume 5, Issue 5, May 2000, pp. 101-102.

  8. Demerit points systems.

    NARCIS (Netherlands)

    2006-01-01

    In 2012, 21 of the 27 EU Member States had some form of demerit points system. In theory, demerit points systems contribute to road safety through three mechanisms: 1) prevention of unsafe behaviour through the risk of receiving penalty points, 2) selection and suspension of the most frequent

  9. Safety and licensing of MHTGR [Modular High Temperature Gas Cooled Reactor

    International Nuclear Information System (INIS)

    Silady, F.A.; Millunzi, A.C.; Kelley, A.P. Jr.; Cunliffe, J.

    1987-07-01

    The Modular High Temperature Gas Cooled Reactor (MHTGR) design meets stringent top-level regulatory and user safety requirements that require that the normal and off-normal operation of the plant not disturb the public's day-to-day activities. Quantitative, top-level regulatory criteria have been specified from US NRC and EPA sources to guide the design. The user/utility group has further specified that these criteria be met at the plant boundary. The focus of the safety approach has then been centered on retaining the radionuclide inventory within the fuel by removing core heat, controlling chemical attack, and by controlling heat generation. The MHTGR is shown to passively meet the stringent requirements with margin. No operator action is required and the plant is insensitive to operator error

  10. Goals, requirements and prerequisites for teleradiology

    International Nuclear Information System (INIS)

    Walz, M.; Wein, B.; Lehmann, K.J.; Bolte, R.; Kilbinger, M.; Loose, R.; Guenther, R.W.; Georgi, M.

    1997-01-01

    Specific radiological requirements have to be considered for the realization of telemedicine. In this article the goals and requirements for an extensive introduction of teleradiology will be defined from the radiological user's point of view. Necessary medical, legal and professional prerequisites for teleradiology are presented. Essential requirements, such as data security, maintenance of personal rights and standardization, must be realized. Application-specific requirements, e.g. quality and extent of teleradiological functions, as well as technological alternatives, are discussed. Each project must be carefully planned in relation to one's own needs, extent of functions and system selection. Topics such as acknowledgement of electronic documentation, reimbursement of teleradiology and liability must be clarified. Legal advice and the observance of quality guidelines are recommended. (orig.)

  11. Challenges and Opportunities of Centrifugal Microfluidics for Extreme Point-of-Care Testing

    Directory of Open Access Journals (Sweden)

    Issac J. Michael

    2016-02-01

    Full Text Available The advantages offered by centrifugal microfluidic systems have encouraged their rapid adoption in the fields of in vitro diagnostics, clinical chemistry, immunoassays, and nucleic acid tests. Centrifugal microfluidic devices are currently used in both clinical and point-of-care settings. Recent studies have shown that this new diagnostic platform could be potentially used in extreme point-of-care settings like remote villages in the Indian subcontinent and in Africa. Several technological inventions have decentralized diagnostics in developing countries; however, very few microfluidic technologies have been successful in meeting the demand. By identifying the fine differences between point-of-care testing and extreme point-of-care infrastructure, this review captures the evolving diagnostic needs of developing countries, pairing infrastructural challenges with the technological hurdles to healthcare delivery in extreme point-of-care settings. In particular, the requirements for making centrifugal diagnostic devices viable in developing countries are discussed based on a detailed analysis of the demands in different clinical settings, including the distinctive needs of extreme point-of-care settings.

  12. Methods for registration laser scanner point clouds in forest stands

    International Nuclear Information System (INIS)

    Bienert, A.; Pech, K.; Maas, H.-G.

    2011-01-01

    Laser scanning is a fast and efficient 3-D measurement technique to capture surface points describing the geometry of a complex object in an accurate and reliable way. Besides airborne laser scanning, terrestrial laser scanning finds growing interest for forestry applications. These two different recording platforms show large differences in resolution, recording area and scan viewing direction. Using both datasets for a combined point cloud analysis may yield advantages because of their largely complementary information. In this paper, methods will be presented to automatically register airborne and terrestrial laser scanner point clouds of a forest stand. In a first step, tree detection is performed in both datasets in an automatic manner. In a second step, corresponding tree positions are determined using RANSAC. Finally, the geometric transformation is performed, divided into a coarse and a fine registration. After the coarse registration, the fine registration is done in an iterative manner (ICP) using the point clouds themselves. The methods are tested and validated with a dataset of a forest stand. The presented registration results provide accuracies which fulfill the forestry requirements.
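
    The coarse registration step, estimating a rigid transformation from corresponding tree positions, can be sketched with the standard least-squares (Kabsch/Procrustes) solution shown below. The tree coordinates are synthetic, and the RANSAC correspondence search and subsequent ICP refinement of the paper are not reproduced here.

      # Estimate a rigid transform (R, t) from matched tree positions (Kabsch).
      import numpy as np

      def rigid_transform(src, dst):
          """Least-squares R, t such that R @ src_i + t is close to dst_i."""
          cs, cd = src.mean(axis=0), dst.mean(axis=0)
          h = (src - cs).T @ (dst - cd)
          u, _, vt = np.linalg.svd(h)
          r = vt.T @ u.T
          if np.linalg.det(r) < 0:            # guard against reflections
              vt[-1] *= -1
              r = vt.T @ u.T
          t = cd - r @ cs
          return r, t

      rng = np.random.default_rng(1)
      trees_tls = rng.uniform(0, 50, size=(12, 3))          # detected tree positions (terrestrial)
      angle = np.deg2rad(20)
      r_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                         [np.sin(angle),  np.cos(angle), 0],
                         [0, 0, 1]])
      trees_als = trees_tls @ r_true.T + np.array([5.0, -3.0, 0.4])  # airborne frame
      r_est, t_est = rigid_transform(trees_tls, trees_als)
      print("translation error:", np.linalg.norm(t_est - [5.0, -3.0, 0.4]))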

  13. 40 CFR 63.2535 - What compliance options do I have if part of my plant is subject to both this subpart and another...

    Science.gov (United States)

    2010-07-01

    ... stringent control requirements (e.g., design, operation, and inspection requirements for waste management...., organic chemicals subject to § 63.2435(b)(1), pharmaceutical products subject to § 63.1250, or pesticide...

  14. Some properties of point processes in statistical optics

    International Nuclear Information System (INIS)

    Picinbono, B.; Bendjaballah, C.

    2010-01-01

    The analysis of the statistical properties of the point process (PP) of photon detection times can be used to determine whether or not an optical field is classical, in the sense that its statistical description does not require the methods of quantum optics. This determination is, however, more difficult than ordinarily admitted and the first aim of this paper is to illustrate this point by using some results of the PP theory. For example, it is well known that the analysis of the photodetection of classical fields exhibits the so-called bunching effect. But this property alone cannot be used to decide the nature of a given optical field. Indeed, we have presented examples of point processes for which a bunching effect appears and yet they cannot be obtained from a classical field. These examples are illustrated by computer simulations. Similarly, it is often admitted that for fields with very low light intensity the bunching or antibunching can be described by using the statistical properties of the distance between successive events of the point process, which simplifies the experimental procedure. We have shown that, while this property is valid for classical PPs, it has no reason to be true for nonclassical PPs, and we have presented some examples of this situation also illustrated by computer simulations.
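
    A simulation of the kind mentioned above can be sketched in a few lines: a homogeneous Poisson process compared with a classical doubly stochastic (intensity-modulated) process, whose inter-event statistics exhibit bunching. The intensity model below is a generic illustration only, not the nonclassical counterexamples discussed in the paper.

      # Compare inter-event statistics of a Poisson and an intensity-modulated
      # (doubly stochastic) photon detection process; the latter exhibits bunching.
      import numpy as np

      rng = np.random.default_rng(2)
      T, rate = 10_000.0, 1.0

      # Homogeneous Poisson process: uniformly placed events.
      n = rng.poisson(rate * T)
      poisson_times = np.sort(rng.uniform(0, T, n))

      # Doubly stochastic process: piecewise-constant random intensity, via thinning.
      grid = np.arange(0, T, 1.0)
      intensity = rng.exponential(rate, size=grid.size)       # fluctuating intensity
      lam_max = intensity.max()
      cand = np.sort(rng.uniform(0, T, rng.poisson(lam_max * T)))
      keep = rng.uniform(0, lam_max, cand.size) < intensity[np.searchsorted(grid, cand) - 1]
      bunched_times = cand[keep]

      for name, times in [("poisson", poisson_times), ("modulated", bunched_times)]:
          dt = np.diff(times)
          # For a Poisson process var/mean^2 of intervals is 1; values > 1 indicate bunching.
          print(name, "normalised interval variance:", dt.var() / dt.mean() ** 2)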

  15. Variation and Evolution of the Meiotic Requirement for Crossing Over in Mammals.

    Science.gov (United States)

    Dumont, Beth L

    2017-01-01

    The segregation of homologous chromosomes at the first meiotic division is dependent on the presence of at least one well-positioned crossover per chromosome. In some mammalian species, however, the genomic distribution of crossovers is consistent with a more stringent baseline requirement of one crossover per chromosome arm. Given that the meiotic requirement for crossing over defines the minimum frequency of recombination necessary for the production of viable gametes, determining the chromosomal scale of this constraint is essential for defining crossover profiles predisposed to aneuploidy and understanding the parameters that shape patterns of recombination rate evolution across species. Here, I use cytogenetic methods for in situ imaging of crossovers in karyotypically diverse house mice (Mus musculus domesticus) and voles (genus Microtus) to test how chromosome number and configuration constrain the distribution of crossovers in a genome. I show that the global distribution of crossovers in house mice is thresholded by a minimum of one crossover per chromosome arm, whereas the crossover landscape in voles is defined by a more relaxed requirement of one crossover per chromosome. I extend these findings in an evolutionary metaanalysis of published recombination and karyotype data for 112 mammalian species and demonstrate that the physical scale of the genomic crossover distribution has undergone multiple independent shifts from one crossover per chromosome arm to one per chromosome during mammalian evolution. Together, these results indicate that the chromosomal scale constraint on crossover rates is itself a trait that evolves among species, a finding that casts light on an important source of crossover rate variation in mammals. Copyright © 2017 by the Genetics Society of America.

  16. Publication point indicators

    DEFF Research Database (Denmark)

    Elleby, Anita; Ingwersen, Peter

    2010-01-01

    The paper presents comparative analyses of two publication point systems, the Norwegian one and the in-house system from the interdisciplinary Danish Institute of International Studies (DIIS), used as a case in the study for publications published in 2006, and compares central citation-based indicators with novel publication point indicators (PPIs) that are formalized and exemplified: the Cumulated Publication Point Indicator (CPPI), which graphically illustrates the cumulated gain of obtained vs. ideal points, both seen as vectors, and the normalized Cumulated Publication Point Index (nCPPI), which represents the cumulated gain of publication success as index values, either graphically... Two diachronic citation windows are applied: 2006-07 and 2006-08. Web of Science (WoS) as well as Google Scholar (GS) are applied to observe the cite delay and citedness for the different document types published by DIIS...

  17. On the meaning of sink capture efficiency and sink strength for point defects

    International Nuclear Information System (INIS)

    Mansur, L.K.; Wolfer, W.G.

    1982-01-01

    The concepts of sink capture efficiency and sink strength for point defects are central to the theory of point defect reactions in materials undergoing irradiation. Two fundamentally different definitions of the capture efficiency are in current use. The essential difference can be stated simply. The conventional meaning denotes a measure of the loss rate of point defects to sinks per unit mean point defect concentration. A second definition of capture efficiency, introduced recently, gives a measure of the point defect loss rate without normalization to the mean point defect concentration. The relationship between the two capture efficiencies is here derived. By stating the relationship we hope to eliminate confusion caused by comparisons of the two types of capture efficiencies at face value and to provide a method of obtaining one from the other. Internally consistent usage of either of the capture efficiencies leads to the same results for the calculation of measurable quantities, as is required physically. (orig.)

  18. Factors influencing superimposition error of 3D cephalometric landmarks by plane orientation method using 4 reference points: 4 point superimposition error regression model.

    Science.gov (United States)

    Hwang, Jae Joon; Kim, Kee-Deog; Park, Hyok; Park, Chang Seo; Jeong, Ho-Gul

    2014-01-01

    Superimposition has been used as a method to evaluate the changes of orthodontic or orthopedic treatment in the dental field. With the introduction of cone beam CT (CBCT), evaluating 3-dimensional changes after treatment became possible by superimposition. 4-point plane orientation is one of the simplest ways to achieve superimposition of 3-dimensional images. To find the factors influencing the superimposition error of cephalometric landmarks by the 4-point plane orientation method, and to evaluate the reproducibility of cephalometric landmarks for analyzing superimposition error, 20 patients were analyzed who had a normal skeletal and occlusal relationship and took CBCT for diagnosis of temporomandibular disorder. The nasion, sella turcica, basion and the midpoint between the left and the right most posterior points of the lesser wing of the sphenoidal bone were used to define a three-dimensional (3D) anatomical reference co-ordinate system. Another 15 reference cephalometric points were also determined three times in the same image. The reorientation error of each landmark could be explained substantially (23%) by a linear regression model, which consists of 3 factors describing the position of each landmark towards the reference axes and the locating error. The 4-point plane orientation system may produce an amount of reorientation error that varies according to the perpendicular distance between the landmark and the x-axis; the reorientation error also increases as the locating error and the shift of reference axes viewed from each landmark increase. Therefore, in order to reduce the reorientation error, the accuracy of all landmarks, including the reference points, is important. Construction of the regression model using reference points of greater precision is required for the clinical application of this model.
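
    A regression model of the kind described can be sketched with an ordinary least-squares fit. The predictor names below (distance to the reference axis, locating error, axis shift) follow the abstract, but the data and coefficients are synthetic and purely illustrative, not the authors' fitted model.

      # Ordinary least-squares model of landmark reorientation error
      # from three positional factors (synthetic, illustrative data).
      import numpy as np

      rng = np.random.default_rng(3)
      n = 300
      dist_to_axis = rng.uniform(5, 80, n)      # perpendicular distance to reference axis (mm)
      locating_err = rng.uniform(0.1, 1.0, n)   # landmark locating error (mm)
      axis_shift = rng.uniform(0.0, 0.5, n)     # shift of reference axes seen from landmark

      # Hypothetical generating model: error grows with all three factors plus noise.
      reorient_err = (0.01 * dist_to_axis + 0.8 * locating_err
                      + 1.2 * axis_shift + rng.normal(0, 0.3, n))

      X = np.column_stack([np.ones(n), dist_to_axis, locating_err, axis_shift])
      beta, *_ = np.linalg.lstsq(X, reorient_err, rcond=None)
      pred = X @ beta
      r2 = 1 - np.sum((reorient_err - pred) ** 2) / np.sum((reorient_err - reorient_err.mean()) ** 2)
      print("coefficients:", np.round(beta, 3), " R^2:", round(r2, 2))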

  19. 40 CFR 437.4 - Monitoring requirements.

    Science.gov (United States)

    2010-07-01

    ... wastewater resulting from the treatment of metal-bearing waste, oily waste, or organic-bearing waste must... STANDARDS THE CENTRALIZED WASTE TREATMENT POINT SOURCE CATEGORY § 437.4 Monitoring requirements. (a) Permit... compliance for each subpart after treatment and before mixing of the waste with wastes of any other subpart...

  20. Legal requirements governing proxy voting in Denmark

    DEFF Research Database (Denmark)

    Werlauff, Erik

    2008-01-01

    The requirements in Danish company law concerning proxy voting in companies whose shares have been accepted for listing on a regulated market have been successively tightened in recent years, and corporate governance principles have also led to the introduction of several requirements concerning proxy holders. A thorough knowledge of these requirements is important not only for the listed companies but also for their advisers and investors in Denmark and abroad. This article considers these requirements as well as the additional requirements which will derive from Directive 2007/36 on the exercise of shareholders' rights in listed companies, which must be implemented by 3 August 2009. It is pointed out that companies may with advantage provide in their articles of association for both the existing and the forthcoming requirements at this early stage.

  1. A three-point Taylor algorithm for three-point boundary value problems

    NARCIS (Netherlands)

    J.L. López; E. Pérez Sinusía; N.M. Temme (Nico)

    2011-01-01

    We consider second-order linear differential equations $\varphi(x)y''+f(x)y'+g(x)y=h(x)$ in the interval $(-1,1)$ with Dirichlet, Neumann or mixed Dirichlet-Neumann boundary conditions given at three points of the interval: the two extreme points $x=\pm 1$ and an interior point...

  2. Characterizing costs and benefits of uncertain future regulatory requirements on the U.S. natural gas industry

    International Nuclear Information System (INIS)

    Godec, M.L.; Smith, G.E.; Fitzgibbon, T.

    1995-01-01

    Environmental regulatory requirements at both the state and federal level are constantly changing, making it difficult for industry and R&D program managers to project future compliance requirements and costs. Even if a company is trying to keep abreast of various proposed regulatory initiatives, the number of possible combinations of initiatives that could occur in the future seems virtually limitless. Uncertainty associated with potential future environmental compliance requirements makes the identification and evaluation of future investment and R&D opportunities exceedingly difficult, and makes the process of systematic strategic planning increasingly complex. This paper describes a methodology for accounting for uncertain future environmental compliance costs in a systematic, comprehensive manner. Through analysis of proposed initiatives for making future environmental requirements more stringent, forecasting the likelihood of occurrence and potential timing of each initiative, and estimating potential future compliance costs associated with each initiative, a thorough process for incorporating regulatory uncertainty into strategic planning and project evaluation is described. This approach can be used for evaluating R&D opportunities to determine where development of new technologies or assessment of risks posed by industry operations may have the greatest impact on future industry costs of compliance. This approach could also be used to account for the uncertainty of future environmental costs in corporate strategic planning or for factoring future compliance costs into project evaluation. This approach could also be enhanced through use in conjunction with other modeling and forecasting systems that could consider a broad range of impacts, including impacts on gas production, industry activity levels, and tax revenues

  3. DNA and Protein Requirements for Substrate Conformational Changes Necessary for Human Flap Endonuclease-1-catalyzed Reaction*

    Science.gov (United States)

    Algasaier, Sana I.; Exell, Jack C.; Bennet, Ian A.; Thompson, Mark J.; Gotham, Victoria J. B.; Shaw, Steven J.; Craggs, Timothy D.; Finger, L. David; Grasby, Jane A.

    2016-01-01

    Human flap endonuclease-1 (hFEN1) catalyzes the essential removal of single-stranded flaps arising at DNA junctions during replication and repair processes. hFEN1 biological function must be precisely controlled, and consequently, the protein relies on a combination of protein and substrate conformational changes as a prerequisite for reaction. These include substrate bending at the duplex-duplex junction and transfer of unpaired reacting duplex end into the active site. When present, 5′-flaps are thought to thread under the helical cap, limiting reaction to flaps with free 5′-termini in vivo. Here we monitored DNA bending by FRET and DNA unpairing using 2-aminopurine exciton pair CD to determine the DNA and protein requirements for these substrate conformational changes. Binding of DNA to hFEN1 in a bent conformation occurred independently of 5′-flap accommodation and did not require active site metal ions or the presence of conserved active site residues. More stringent requirements exist for transfer of the substrate to the active site. Placement of the scissile phosphate diester in the active site required the presence of divalent metal ions, a free 5′-flap (if present), a Watson-Crick base pair at the terminus of the reacting duplex, and the intact secondary structure of the enzyme helical cap. Optimal positioning of the scissile phosphate additionally required active site conserved residues Tyr40, Asp181, and Arg100 and a reacting duplex 5′-phosphate. These studies suggest a FEN1 reaction mechanism where junctions are bound and 5′-flaps are threaded (when present), and finally the substrate is transferred onto active site metals initiating cleavage. PMID:26884332

  4. Generation of the Data Required by AGNPS

    Institute of Scientific and Technical Information of China (English)

    于苏俊

    2003-01-01

    Remote sensing techniques and geographic information systems offer a good means of collecting and manipulating the data required to assess conservation practices. A method for automatic generation of most of the data required by the agricultural non-point source (AGNPS) erosion model is put forward, drawing on three sources: (1) files with contour lines from topographic maps, (2) soil mapping units from soil surveys, and (3) land cover from Landsat TM image classifications.

  5. Georeferenced Point Clouds: A Survey of Features and Point Cloud Management

    Directory of Open Access Journals (Sweden)

    Johannes Otepka

    2013-10-01

    Full Text Available This paper presents a survey of georeferenced point clouds. Concentration is, on the one hand, put on features, which originate in the measurement process themselves, and features derived by processing the point cloud. On the other hand, approaches for the processing of georeferenced point clouds are reviewed. This includes the data structures, but also spatial processing concepts. We suggest a categorization of features into levels that reflect the amount of processing. Point clouds are found across many disciplines, which is reflected in the versatility of the literature suggesting specific features.

  6. A primal-dual interior point method for large-scale free material optimization

    DEFF Research Database (Denmark)

    Weldeyesus, Alemseged Gebrehiwot; Stolpe, Mathias

    2015-01-01

    Free Material Optimization (FMO) is a branch of structural optimization in which the design variable is the elastic material tensor that is allowed to vary over the design domain. The requirements are that the material tensor is symmetric positive semidefinite with bounded trace. The resulting optimization problem is a nonlinear semidefinite program with many small matrix inequalities, for which a special-purpose optimization method should be developed. The objective of this article is to propose an efficient primal-dual interior point method for FMO that can robustly and accurately solve large-scale problems. The number of iterations the interior point method requires is modest and increases only marginally with problem size. The computed optimal solutions obtain a higher precision than other available special-purpose methods for FMO. The efficiency and robustness of the method is demonstrated by numerical experiments on a set...
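
    As a generic illustration of the interior-point idea (not the FMO-specific primal-dual method for nonlinear semidefinite programs proposed in the article), the sketch below solves a tiny inequality-constrained linear program with a log-barrier Newton method.

      # Log-barrier interior-point sketch for  min c^T x  s.t.  G x <= h  (tiny LP).
      import numpy as np

      def barrier_lp(c, G, h, x0, t0=1.0, mu=10.0, tol=1e-6, newton_iters=50):
          x, t = x0.astype(float), t0
          m = G.shape[0]
          while m / t > tol:                       # duality-gap bound for the barrier method
              for _ in range(newton_iters):        # centring: Newton on t*c^T x - sum log(h - Gx)
                  s = h - G @ x                    # slacks (must stay strictly positive)
                  grad = t * c + G.T @ (1.0 / s)
                  hess = G.T @ np.diag(1.0 / s ** 2) @ G
                  dx = -np.linalg.solve(hess, grad)
                  step = 1.0                       # backtrack to keep the iterate feasible
                  while np.any(h - G @ (x + step * dx) <= 0):
                      step *= 0.5
                  if np.linalg.norm(step * dx) < 1e-12:
                      break
                  x = x + step * dx
              t *= mu
          return x

      # min -x1 - 2*x2  s.t.  x1 + x2 <= 4, x1 <= 3, x2 <= 2, x >= 0  (optimum at (2, 2)).
      c = np.array([-1.0, -2.0])
      G = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
      h = np.array([4.0, 3.0, 2.0, 0.0, 0.0])
      print(np.round(barrier_lp(c, G, h, x0=np.array([0.5, 0.5])), 3))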

  7. Effect of fuel fabrication parameters on performance- designer's point of view

    International Nuclear Information System (INIS)

    Prasad, P.N.; Ravi, M.; Soni, R.; Bajaj, S.S.; Bhardwaj, S.A.

    2004-01-01

    The fuel bundle performance in reactor depends upon the material properties, dimensions of the different components and their inter-compatibility. This paper brings out the fuel parameters required to be optimised to achieve better fuel reliability, operational flexibility, safety and economics from the designer point of view

  8. Requirements for the retrofitting and extension of the maximum voltage power grid from the point of view of environmental protection and cultivated landscape work; Anforderungen an den Um- und Ausbau des Hoechstspannungsstromnetzes. Aus der Sicht von Naturschutz und Kulturlandschaftspflege

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-06-15

    The project on the requirements for the retrofitting and extension of the maximum voltage power grid from the point of view of environmental protection and cultivated landscape work includes contributions on the following topics: the development of the European transmission grid, the grid extension law, restrictions for the power grid and its infrastructure, requirements for the regulations concerning the realization of the transnational grid extension, inclusion of the public - public acceptance - communication, requirements concerning the environmental compensation law, overhead lines - underground cables - health hazards, ecological effects of overhead lines and underground cables, infrastructural projects, power supply in the future, and structural relief by photovoltaics.

  9. DETECTION OF SLOPE MOVEMENT BY COMPARING POINT CLOUDS CREATED BY SFM SOFTWARE

    Directory of Open Access Journals (Sweden)

    K. Oda

    2016-06-01

    Full Text Available This paper proposes a method for detecting movement between point clouds created by SfM software, without setting any onsite georeferenced points. SfM software, like Smart3DCapture, PhotoScan, and Pix4D, is convenient for non-professional operators of photogrammetry, because these systems simply require a sequence of photos as input and output point clouds with a colour index which corresponds to the colour of the original image pixel where the point is projected. SfM software can execute aerial triangulation and create dense point clouds fully automatically. This is useful when monitoring the motion of unstable slopes, or loose rocks in slopes along roads or railroads. Most existing methods, however, use mesh-based DSMs for comparing point clouds before/after movement and cannot be applied in cases where part of the slope forms overhangs. In some cases the movement is smaller than the precision of the ground control points, so registering the two point clouds with GCPs is not appropriate. The change detection method in this paper adopts the CCICP (Classification and Combined ICP) algorithm for registering point clouds before/after movement. The CCICP algorithm is a type of ICP (Iterative Closest Point) which minimizes point-to-plane and point-to-point distances simultaneously, and also rejects incorrect correspondences based on point classification by PCA (Principal Component Analysis). A precision test shows that the CCICP method can register two point clouds to the order of 1 pixel in the original images. Ground control points set on site are useful for the initial alignment of the two point clouds. If there are no GCPs on the site of the slope, initial alignment is achieved by measuring feature points as ground control points in the point cloud before movement, and creating the point cloud after movement with these ground control points. When the motion is a rigid transformation, as in the case of a loose rock moving in a slope, motion including rotation can be analysed by executing CCICP for a
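
    For reference, a plain point-to-point ICP iteration (nearest-neighbour correspondences followed by a least-squares rigid update) is sketched below with synthetic data; the CCICP variant of the paper adds PCA-based point classification and a combined point-to-plane term, which are not reproduced here.

      # Plain point-to-point ICP between two point clouds (not the paper's CCICP).
      import numpy as np
      from scipy.spatial import cKDTree

      def best_rigid(src, dst):
          """Least-squares rigid transform mapping src onto dst (Kabsch)."""
          cs, cd = src.mean(axis=0), dst.mean(axis=0)
          u, _, vt = np.linalg.svd((src - cs).T @ (dst - cd))
          r = vt.T @ u.T
          if np.linalg.det(r) < 0:
              vt[-1] *= -1
              r = vt.T @ u.T
          return r, cd - r @ cs

      def icp(source, target, iters=30):
          tree = cKDTree(target)
          src = source.copy()
          r_tot, t_tot = np.eye(3), np.zeros(3)
          for _ in range(iters):
              _, idx = tree.query(src)                 # nearest-neighbour correspondences
              r, t = best_rigid(src, target[idx])
              src = src @ r.T + t                      # apply incremental transform
              r_tot, t_tot = r @ r_tot, r @ t_tot + t  # accumulate global transform
          return r_tot, t_tot

      rng = np.random.default_rng(4)
      cloud = rng.uniform(0, 10, size=(2000, 3))
      moved = cloud + np.array([0.2, -0.1, 0.05])      # small rigid motion (pure translation)
      r_est, t_est = icp(cloud, moved)
      print("estimated translation:", np.round(t_est, 3))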

  10. Improved Power Decoding of One-Point Hermitian Codes

    DEFF Research Database (Denmark)

    Puchinger, Sven; Bouw, Irene; Rosenkilde, Johan Sebastian Heesemann

    2017-01-01

    We propose a new partial decoding algorithm for one-point Hermitian codes that can decode up to the same number of errors as the Guruswami–Sudan decoder. Simulations suggest that it has a similar failure probability as the latter one. The algorithm is based on a recent generalization of the power decoding algorithm for Reed–Solomon codes and does not require an expensive root-finding step. In addition, it promises improvements for decoding interleaved Hermitian codes....

  11. Influence of the burning point on the dew point in a diesel engine

    Energy Technology Data Exchange (ETDEWEB)

    Teetz, C.

    1982-06-01

    A computation of the influence of the ignition point on the dew point in a cylinder of a diesel engine is presented. The cylinder-pressure diagrams are shown. The results of the computation are given. A later ignition point diminishes the area with cylinder wall temperatures below the dew point. The risk posed by cylinder wall temperatures below the dew point is illustrated.

  12. Point specificity in acupuncture

    Directory of Open Access Journals (Sweden)

    Choi Emma M

    2012-02-01

    Full Text Available Abstract The existence of point specificity in acupuncture is controversial, because many acupuncture studies using this principle to select control points have found that sham acupoints have similar effects to those of verum acupoints. Furthermore, the results of pain-related studies based on visual analogue scales have not supported the concept of point specificity. In contrast, hemodynamic, functional magnetic resonance imaging and neurophysiological studies evaluating the responses to stimulation of multiple points on the body surface have shown that point-specific actions are present. This review article focuses on clinical and laboratory studies supporting the existence of point specificity in acupuncture and also addresses studies that do not support this concept. Further research is needed to elucidate the point-specific actions of acupuncture.

  13. 33 CFR 161.18 - Reporting requirements.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Reporting requirements. 161.18 Section 161.18 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... call. H HOTEL Date, time and point of entry system Entry time expressed as in (B) and into the entry...

  14. Pressure transmitters: Addressing post-Fukushima regulations and requirements with Bibloc technology by Rolls-Royce

    International Nuclear Information System (INIS)

    Fabbro, Herve; Desgeorge, Romain; Chowanek, Michel

    2013-06-01

    Nuclear power stations are designed to withstand substantial seismic activity and as such represent some of the most robust buildings in the world. However, the Fukushima nuclear incident highlighted the potential vulnerability of nuclear power plants when multiple natural events of historic proportions happen simultaneously. Following the incident, the worldwide nuclear industry quite rightly called for an immediate review and a targeted reassessment of the safety margins of nuclear reactors. Several recommendations have been given by international safety authorities, including a significant toughening of the already stringent regulations and requirements with respect to earthquakes, extreme temperatures, pressure and radiation resistance. In the event of an accident, a quick response is imperative, and to act efficiently, a correct knowledge of the situation as well as an accurate estimation of its severity are required. Thus, it is essential to be able to rely on the most reliable sensors possible, in particular for the 50 to 100 classified pressure transmitters. Equipment used in nuclear plants all over the world, such as pressure transmitters, is implemented following one of two different types of design: - The Monobloc design, where almost all the equipment or system is installed very close to the reactor, within the reactor building. - The Bibloc design, where the most sensitive parts (in particular the electronics) are removed from the harsh environment present in the vicinity of the reactor to be placed outside of the reactor building. The paper will present the advantages of the Bibloc technology and will show how this technology meets the 'post-Fukushima' requirements. (authors)

  15. Triana Safehold: A New Gyroless, Sun-Pointing Attitude Controller

    Science.gov (United States)

    Chen, J.; Morgenstern, Wendy; Garrick, Joseph

    2001-01-01

    Triana is a single-string spacecraft to be placed in a halo orbit about the sun-earth L1 Lagrangian point. The Attitude Control Subsystem (ACS) hardware includes four reaction wheels, ten thrusters, six coarse sun sensors, a star tracker, and a three-axis Inertial Measuring Unit (IMU). The ACS Safehold design features a gyroless sun-pointing control scheme using only sun sensors and wheels. With this minimum hardware approach, Safehold increases mission reliability in the event of a gyroscope anomaly. In place of the gyroscope rate measurements, Triana Safehold uses wheel tachometers to help provide a scaled estimation of the spacecraft body rate about the sun vector. Since Triana nominally performs momentum management every three months, its accumulated system momentum can reach a significant fraction of the wheel capacity. It is therefore a requirement for Safehold to maintain a sun-pointing attitude even when the spacecraft system momentum is reasonably large. The tachometer sun-line rate estimation enables the controller to bring the spacecraft close to its desired sun-pointing attitude even with reasonably high system momentum and wheel drags. This paper presents the design rationale behind this gyroless controller, stability analysis, and some time-domain simulation results showing performances with various initial conditions. Finally, suggestions for future improvements are briefly discussed.

  16. Four-point probe measurements of a direct current potential drop on layered conductive cylinders

    International Nuclear Information System (INIS)

    Lu, Yi; Bowler, John R

    2012-01-01

    We have determined the steady state electric field due to direct current flowing via point contacts at the cylindrical surface of a uniformly layered conductive rod of finite length. The solution allows one to use four-point probe potential drop measurements to estimate the conductivity or thickness of the layer assuming that the other parameters are known. The electrical potential in the rod has a zero radial derivative at its surface except at the injection and extraction points. This means that the required solution can be expressed in terms of a Green’s function satisfying a Neumann boundary condition. Four-point measurements have been made to demonstrate the validity of theoretical results. (paper)
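
    For orientation, the simpler textbook case of a four-point probe on a homogeneous, semi-infinite medium can be written down directly by superposing the potentials of the current injection and extraction points; the layered-cylinder Green's function of the paper reduces the measurement to the same kind of potential-difference expression. The numbers below are arbitrary placeholders.

      # Four-point probe on a homogeneous semi-infinite medium: superpose the
      # potentials of the current injection/extraction points and recover rho.
      import numpy as np

      def surface_potential(x, rho, current, x_src):
          """Potential at surface position x due to a point current at x_src."""
          return rho * current / (2.0 * np.pi * np.abs(x - x_src))

      rho_true, I, s = 1.7e-8, 1e-3, 1.0e-3         # resistivity (ohm m), current (A), spacing (m)
      pins = np.array([0.0, 1.0, 2.0, 3.0]) * s     # collinear, equally spaced probes

      v = lambda x: (surface_potential(x, rho_true, +I, pins[0])
                     + surface_potential(x, rho_true, -I, pins[3]))
      delta_v = v(pins[1]) - v(pins[2])
      rho_est = 2.0 * np.pi * s * delta_v / I       # textbook four-point formula
      print("recovered resistivity:", rho_est, "vs", rho_true)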

  17. Four-point probe measurements of a direct current potential drop on layered conductive cylinders

    Science.gov (United States)

    Lu, Yi; Bowler, John R.

    2012-11-01

    We have determined the steady state electric field due to direct current flowing via point contacts at the cylindrical surface of a uniformly layered conductive rod of finite length. The solution allows one to use four-point probe potential drop measurements to estimate the conductivity or thickness of the layer assuming that the other parameters are known. The electrical potential in the rod has a zero radial derivative at its surface except at the injection and extraction points. This means that the required solution can be expressed in terms of a Green’s function satisfying a Neumann boundary condition. Four-point measurements have been made to demonstrate the validity of theoretical results.

  18. Bio-fuels for diesel engines: Experience in Italy and Europe

    International Nuclear Information System (INIS)

    Rocchietta, C.

    1992-01-01

    With the aim of meeting stringent European Communities air pollution regulations, reducing the necessity of petroleum imports and creating new markets for agricultural products, Italy's Ferruzzi-Montedison Group is developing diesel engine fuels derived from vegetable oils. The innovative feature of these fuels, from the environmental protection stand-point, is that they don't contain any sulfur, the main cause of acid rain. This paper provides brief notes of the key chemical-physical properties of these diesel fuels, whose application doesn't require any modifications to diesel engines, and assesses the relative production technologies and commercialization prospects. Reference is made to the results of recent performance tests conducted on buses and taxis

  19. Thermal management issues in a PEMFC stack - A brief review of current status

    International Nuclear Information System (INIS)

    Kandlikar, Satish G.; Lu Zijie

    2009-01-01

    Understanding the thermal effects is critical in optimizing the performance and durability of proton exchange membrane fuel cells (PEMFCs). A PEMFC produces a similar amount of waste heat to its electric power output and tolerates only a small deviation in temperature from its design point. The balance between the heat production and its removal determines the operating temperature of a PEMFC. These stringent thermal requirements present a significant heat transfer challenge. In this work, the fundamental heat transfer mechanisms at PEMFC component level (including polymer electrolyte, catalyst layers, gas diffusion media and bipolar plates) are briefly reviewed. The current status of PEMFC cooling technology is also reviewed and research needs are identified

  20. Ultrasound Picture Archiving And Communication Systems

    Science.gov (United States)

    Koestner, Ken; Hottinger, C. F.

    1982-01-01

    The ideal ultrasonic image communication and storage system must be flexible in order to optimize speed and minimize storage requirements. Various ultrasonic imaging modalities are quite different in data volume and speed requirements. Static imaging, for example B-Scanning, involves acquisition of a large amount of data that is averaged or accumulated in a desired manner. The image is then frozen in image memory before transfer and storage. Images are commonly a 512 x 512 point array, each point 6 bits deep. Transfer of such an image over a serial line at 9600 baud would require about three minutes. Faster transfer times are possible; for example, we have developed a parallel image transfer system using direct memory access (DMA) that reduces the time to 16 seconds. Data in this format requires 256K bytes for storage. Data compression can be utilized to reduce these requirements. Real-time imaging has much more stringent requirements for speed and storage. The amount of actual data per frame in real-time imaging is reduced due to physical limitations on ultrasound. For example, 100 scan lines (480 points long, 6 bits deep) can be acquired during a frame at a 30 per second rate. Transmitting and saving this data at a real-time rate requires a transfer rate of 8.6 Megabaud. A real-time archiving system would be complicated by the necessity of specialized hardware to interpolate between scan lines and perform desirable greyscale manipulation on recall. Image archiving for cardiology and radiology would require data transfer at this high rate to preserve temporal (cardiology) and spatial (radiology) information.
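
    The transfer-time and bandwidth figures quoted above can be checked with a few lines of arithmetic, assuming (as the text implies) one transmitted bit per baud and no protocol overhead.

      # Check the static-image transfer time and the real-time data rate quoted above
      # (assumes one bit per baud and no protocol overhead).
      image_bits = 512 * 512 * 6                  # B-scan image: 512 x 512 points, 6 bits deep
      print("serial transfer at 9600 baud:", image_bits / 9600 / 60, "minutes")   # ~2.7 min

      frame_bits = 100 * 480 * 6                  # 100 scan lines, 480 points, 6 bits deep
      print("real-time rate at 30 frames/s:", frame_bits * 30 / 1e6, "Mbit/s")    # ~8.6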

  1. 41 CFR 101-25.101-3 - Supply through consolidated purchase for direct delivery to use points.

    Science.gov (United States)

    2010-07-01

    ... consolidated purchase for direct delivery to use points. 101-25.101-3 Section 101-25.101-3 Public Contracts and... purchase for direct delivery to use points. The following criteria shall govern in determining whether an... following factors requires consolidated purchasing of such items for direct delivery to use points— (1...

  2. Nomogram for Determining Shield Thickness for Point and Line Sources of Gamma Rays

    Energy Technology Data Exchange (ETDEWEB)

    Joenemalm, C; Malen, K

    1966-10-15

    A set of nomograms is given for the determination of the required shield thickness against gamma radiation. The sources handled are point and infinite line sources with shields of Pb, Fe, magnetite concrete (ρ = 3.6), ordinary concrete (ρ = 2.3) or water. The gamma energy range covered is 0.5 - 10 MeV. The nomograms are directly applicable for source and dose points on the surfaces of the shield. They can easily be extended to source and dose points in other positions by applying a geometrical correction. Also included are data for calculation of the source strength for the most common materials and for fission product sources.
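
    The quantity such nomograms encode can also be computed directly for a point source: the unshielded flux at the dose point is attenuated by exp(-mu*t), increased by a buildup factor, and the required thickness is the value of t at which the flux falls to the target. The sketch below solves this with an assumed attenuation coefficient and a crude linear buildup approximation; both are illustrative placeholders, not the data behind the nomograms.

      # Required shield thickness for a point isotropic gamma source (illustrative).
      import numpy as np
      from scipy.optimize import brentq

      S = 1.0e10          # source strength, photons/s
      r = 100.0           # source-to-dose-point distance, cm
      mu = 0.65           # assumed linear attenuation coefficient of the shield, 1/cm
      phi_target = 1.0e2  # acceptable flux at the dose point, photons/cm^2/s

      def flux(t):
          buildup = 1.0 + mu * t                       # crude buildup approximation
          return S / (4.0 * np.pi * r ** 2) * buildup * np.exp(-mu * t)

      thickness = brentq(lambda t: flux(t) - phi_target, 0.0, 200.0)
      print("required thickness:", round(thickness, 1), "cm")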

  3. Nomogram for Determining Shield Thickness for Point and Line Sources of Gamma Rays

    International Nuclear Information System (INIS)

    Joenemalm, C.; Malen, K

    1966-10-01

    A set of nomograms is given for the determination of the required shield thickness against gamma radiation. The sources handled are point and infinite line sources with shields of Pb, Fe, magnetite concrete (ρ = 3.6), ordinary concrete (ρ = 2.3) or water. The gamma energy range covered is 0.5 - 10 MeV. The nomograms are directly applicable for source and dose points on the surfaces of the shield. They can easily be extended to source and dose points in other positions by applying a geometrical correction. Also included are data for calculation of the source strength for the most common materials and for fission product sources.

  4. Critical Point Cancellation in 3D Vector Fields: Robustness and Discussion.

    Science.gov (United States)

    Skraba, Primoz; Rosen, Paul; Wang, Bei; Chen, Guoning; Bhatia, Harsh; Pascucci, Valerio

    2016-02-29

    Vector field topology has been successfully applied to represent the structure of steady vector fields. Critical points, one of the essential components of vector field topology, play an important role in describing the complexity of the extracted structure. Simplifying vector fields via critical point cancellation has practical merit for interpreting the behaviors of complex vector fields such as turbulence. However, there is no effective technique that allows direct cancellation of critical points in 3D. This work fills this gap and introduces the first framework to directly cancel pairs or groups of 3D critical points in a hierarchical manner with a guaranteed minimum amount of perturbation based on their robustness, a quantitative measure of their stability. In addition, our framework does not require the extraction of the entire 3D topology, which contains non-trivial separation structures, and thus is computationally effective. Furthermore, our algorithm can remove critical points in any subregion of the domain whose degree is zero and handle complex boundary configurations, making it capable of addressing challenging scenarios that may not be resolved otherwise. We apply our method to synthetic and simulation datasets to demonstrate its effectiveness.

  5. Blister pouches for effective reagent storage and release for low-cost point-of-care diagnostic applications

    CSIR Research Space (South Africa)

    Smith, S

    2016-02-01

    Full Text Available Lab-on-a-chip devices are often applied to point-of-care diagnostic solutions as they are low-cost, compact, disposable, and require only small sample volumes. For such devices, various reagents are required for sample preparation and analysis and...

  6. AN IMPROVEMENT ON GEOMETRY-BASED METHODS FOR GENERATION OF NETWORK PATHS FROM POINTS

    Directory of Open Access Journals (Sweden)

    Z. Akbari

    2014-10-01

    Full Text Available Determining the network path is important for different purposes such as the determination of road traffic, the average speed of vehicles, and other network analyses. One of the required inputs is information about the network path. Nevertheless, the data collected by positioning systems often consist of discrete points. Conversion of these points to a network path has become a challenge for which different researchers have presented many solutions. This study aims at investigating geometry-based methods to estimate network paths from the obtained points and at improving an existing point-to-curve method. To this end, some geometry-based methods have been studied, and an improved method has been proposed by applying conditions to the best of these methods after describing and illustrating their weaknesses.
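
    The core geometric operation in a point-to-curve method, projecting each measured position onto the nearest segment of a candidate road polyline, can be sketched as follows; the road and GPS coordinates are synthetic, and the additional conditions proposed in the paper are not reproduced.

      # Snap GPS points to the nearest segment of a road polyline (point-to-curve core).
      import numpy as np

      def snap_to_polyline(point, polyline):
          """Return the closest point on the polyline and its distance to `point`."""
          best, best_d = None, np.inf
          for a, b in zip(polyline[:-1], polyline[1:]):
              ab = b - a
              u = np.clip(np.dot(point - a, ab) / np.dot(ab, ab), 0.0, 1.0)
              proj = a + u * ab                       # projection clamped to the segment
              d = np.linalg.norm(point - proj)
              if d < best_d:
                  best, best_d = proj, d
          return best, best_d

      road = np.array([[0.0, 0.0], [50.0, 5.0], [100.0, 0.0], [150.0, 20.0]])
      gps = np.array([[10.0, 3.0], [70.0, 6.0], [120.0, 9.0]])
      for p in gps:
          snapped, dist = snap_to_polyline(p, road)
          print(p, "->", np.round(snapped, 1), " offset:", round(dist, 2))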

  7. Dynamics of Multibody Systems Near Lagrangian Points

    Science.gov (United States)

    Wong, Brian

    This thesis examines the dynamics of a physically connected multi-spacecraft system in the vicinity of the Lagrangian points of a Circular Restricted Three-Body System. The spacecraft system is arranged in a wheel-spoke configuration with smaller and less massive satellites connected to a central hub using truss/beams or tether connectors. The kinematics of the system is first defined, and the kinetic, gravitational potential energy and elastic potential energy of the system are derived. The Assumed Modes Method is used to discretize the continuous variables of the system, and a general set of ordinary differential equations describing the dynamics of the connectors and the central hub are obtained using the Lagrangian method. The flexible body dynamics of the tethered and truss connected systems are examined using numerical simulations. The results show that these systems experienced only small elastic deflections when they are naturally librating or rotating at moderate angular velocities, and these deflections have relatively small effect on the attitude dynamics of the systems. Based on these results, it is determined that the connectors can be modeled as rigid when only the attitude dynamics of the system is of interest. The equations of motion of rigid satellites stationed at the Lagrangian points are linearized, and the stability conditions of the satellite are obtained from the linear equations. The required conditions are shown to be similar to those of geocentric satellites. Study of the linear equations also revealed the resonant conditions of rigid Lagrangian point satellites, when a librational natural frequency of the satellite matches the frequency of its station-keeping orbit leading to large attitude motions. For tethered satellites, the linear analysis shows that the tethers are in stable equilibrium when they lie along a line joining the two primary celestial bodies of the Three-Body System. Numerical simulations are used to study the long term

  8. Critical-point nuclei

    International Nuclear Information System (INIS)

    Clark, R.M.

    2004-01-01

    It has been suggested that a change of nuclear shape may be described in terms of a phase transition and that specific nuclei may lie close to the critical point of the transition. Analytical descriptions of such critical-point nuclei have been introduced recently and they are described briefly. The results of extensive searches for possible examples of critical-point behavior are presented. Alternative pictures, such as describing bands in the candidate nuclei using simple ΔK = 0 and ΔK = 2 rotational-coupling models, are discussed, and the limitations of the different approaches highlighted. A possible critical-point description of the transition from a vibrational to rotational pairing phase is suggested

  9. New definitions of pointing stability - ac and dc effects. [constant and time-dependent pointing error effects on image sensor performance

    Science.gov (United States)

    Lucke, Robert L.; Sirlin, Samuel W.; San Martin, A. M.

    1992-01-01

    For most imaging sensors, a constant (dc) pointing error is unimportant (unless large), but time-dependent (ac) errors degrade performance by either distorting or smearing the image. When properly quantified, the separation of the root-mean-square effects of random line-of-sight motions into dc and ac components can be used to obtain the minimum necessary line-of-sight stability specifications. The relation between stability requirements and sensor resolution is discussed, with a view to improving communication between the data analyst and the control systems engineer.
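
    As a rough illustration of the dc/ac separation described above (a sketch assuming the line-of-sight error is available as a sampled time series; the numbers are synthetic):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic line-of-sight error samples (microradians): a constant bias plus random jitter.
    los_error = 5.0 + 1.5 * rng.standard_normal(10_000)

    dc = los_error.mean()                       # constant (dc) pointing error: shifts the image
    ac_rms = los_error.std()                    # time-dependent (ac) error: smears or distorts it
    total_rms = np.sqrt(np.mean(los_error**2))  # total rms satisfies total^2 = dc^2 + ac_rms^2

    print(f"dc = {dc:.2f}, ac rms = {ac_rms:.2f}, total rms = {total_rms:.2f}")
    ```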

  10. Time-of-flight data acquisition unit (DAU) for neutron scattering experiments. Specification of the requirements and design concept. Version 3.1

    International Nuclear Information System (INIS)

    Herdam, G.; Klessmann, H.; Wawer, W.; Adebayo, J.; David, G.; Szatmari, F.

    1989-12-01

    This specification describes the requirements for the Data Acquisition Unit (DAU) and defines the design concept for the functional units involved. The Data Acquisition Unit will be used in the following neutron scattering experiments: Time-of-Flight Spectrometer NEAT, Time-of-Flight Spectrometer SPAN. In addition, the data of the SPAN spectrometer in Spin Echo experiments will be accumulated. The Data Acquisition Unit can be characterised by the following requirements: Time-of-flight measurement with high time resolution (125 ns), sorting the time-of-flight in up to 4096 time channels (channel width ≥ 1 μs), selection of different time channel widths for peak and background, on-line time-of-flight correction for neutron flight paths of different lengths, sorting the detector position information in up to 4096 position channels, accumulation of two-dimensional spectra in a 32 Mbyte RAM memory (4 K time channels*4 K position channels*16 bits). Because of the stringent timing requirements the functional units of the DAU are hardware controlled via tables. The DAU is part of a process control system which has access to the functional units via the VMEbus in order to initialise, to load tables and control information, and to read status information and spectra. (orig.) With 18 figs
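
    A minimal software sketch of the sorting described above (channel width, flight paths and event data are invented for illustration; the real DAU performs this in hardware via lookup tables):

    ```python
    import numpy as np

    N_CHANNELS = 4096
    CHANNEL_WIDTH_US = 1.0          # assumed channel width (>= 1 microsecond)
    NOMINAL_PATH_M = 2.5            # assumed nominal neutron flight path

    def sort_event(tof_us, detector_path_m, spectrum):
        """Correct the time of flight to the nominal path length and bin it into a channel."""
        corrected = tof_us * NOMINAL_PATH_M / detector_path_m
        channel = int(corrected / CHANNEL_WIDTH_US)
        if 0 <= channel < N_CHANNELS:
            spectrum[channel] += 1

    spectrum = np.zeros(N_CHANNELS, dtype=np.uint32)
    # Hypothetical events: (time of flight in microseconds, detector flight path in metres)
    for tof, path in [(812.4, 2.48), (815.1, 2.55), (101.3, 2.50)]:
        sort_event(tof, path, spectrum)
    print(spectrum.nonzero()[0])
    ```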

  11. Mined Geologic Disposal System Requirements Document

    International Nuclear Information System (INIS)

    1994-03-01

    This Mined Geologic Disposal System Requirements Document (MGDS-RD) describes the functions to be performed by, and the requirements for, a Mined Geologic Disposal System (MGDS) for the permanent disposal of spent nuclear fuel (SNF) (including SNF loaded in multi-purpose canisters (MPCs)) and commercial and defense high-level radioactive waste (HLW) in support of the Civilian Radioactive Waste Management System (CRWMS). The purpose of the MGDS-RD is to define the program-level requirements for the design of the Repository, the Exploratory Studies Facility (ESF), and Surface Based Testing Facilities (SBTF). These requirements include design, operation, and decommissioning requirements to the extent they impact on the physical development of the MGDS. The document also presents an overall description of the MGDS, its functions (derived using the functional analysis documented by the Physical System Requirements (PSR) documents as a starting point), its segments as described in Section 3.1.3, and the requirements allocated to the segments. In addition, the program-level interfaces of the MGDS are identified. As such, the MGDS-RD provides the technical baseline for the design of the MGDS

  12. A tri-reference point theory of decision making under risk.

    Science.gov (United States)

    Wang, X T; Johnson, Joseph G

    2012-11-01

The tri-reference point (TRP) theory takes into account minimum requirements (MR), the status quo (SQ), and goals (G) in decision making under risk. The 3 reference points demarcate risky outcomes and risk perception into 4 functional regions: success (expected value of x ≥ G), gain (SQ ≤ x < G), loss (MR ≤ x < SQ), and failure (x < MR), with the reference points weighted in the order MR > G > SQ. We present TRP assumptions and value functions and a mathematical formalization of the theory. We conducted empirical tests of crucial TRP predictions using both explicit and implicit reference points. We show that decision makers consider both G and MR and give greater weight to MR than G, indicating failure aversion (i.e., the disutility of a failure is greater than the utility of a success in the same task) in addition to loss aversion (i.e., the disutility of a loss is greater than the utility of the same amount of gain). Captured by a double-S shaped value function with 3 inflection points, risk preferences switched between risk seeking and risk aversion when the distribution of a gamble straddled a different reference point. The existence of MR (not G) significantly shifted choice preference toward risk aversion even when the outcome distribution of a gamble was well above the MR. Single reference point based models such as prospect theory cannot consistently account for these findings. The TRP theory provides simple guidelines for evaluating risky choices for individuals and organizational management. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
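
    A toy value function that rises most steeply near each of the three reference points and gives MR the largest weight can illustrate the idea; this is an assumption-laden sketch for visualization only, not the authors' double-S parameterization:

    ```python
    import numpy as np

    def toy_trp_value(x, mr=0.0, sq=50.0, g=100.0, k=0.15):
        """Sum of sigmoid steps at MR, SQ and G; the step at MR is weighted most heavily
        (failure aversion), followed by G, then SQ (assumed weights, ordered MR > G > SQ)."""
        w_mr, w_sq, w_g = 3.0, 1.0, 2.0
        sig = lambda z: 1.0 / (1.0 + np.exp(-k * z))
        return w_mr * sig(x - mr) + w_sq * sig(x - sq) + w_g * sig(x - g)

    xs = np.linspace(-50, 150, 5)
    print(np.round(toy_trp_value(xs), 3))
    ```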

  13. [Consent and confidentiality in occupational health practice: balance between legal requirements and ethical values].

    Science.gov (United States)

    Mora, Erika; Franco, G

    2010-01-01

    The recently introduced Italian law on the protection of workers' health states that the occupational health physician (competent physician) is required to act according to the Code of Ethics of the International Commission on Occupational Health (ICOH). This paper aims at examining the articles of legislative decree 81/2008 dealing with informed consent and confidentiality compared with the corresponding points of the ICOH Ethics Code. Analysis of the relationship between articles 25 and 39 (informed consent) and 18, 20 and 39 (confidentiality) of the decree shows that there are some points of disagreement between the legal requirements and the Code of Ethics, in particular concerning prescribed health surveillance, consent based on appropriate information (points 8, 10 and 12 of the Code) and some aspects of confidentiality (points 10, 20, 21, 22 and 23 of the Code). Although the competent physician is required to act according to the law, the decisional process could lead to a violation of workers' autonomy.

  14. 16 CFR 1500.48 - Technical requirements for determining a sharp point in toys and other articles intended for use...

    Science.gov (United States)

    2010-01-01

    ... Section 1500.48 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FEDERAL HAZARDOUS SUBSTANCES ACT...-quarter times the minor dimension of the probe, recess, or opening, measured from any point in the plane...

  15. Is a wind turbine a point source? (L).

    Science.gov (United States)

    Makarewicz, Rufin

    2011-02-01

Measurements show that practically all wind turbine noise is produced by the turbine blades, sometimes a few tens of meters long, even though a point source located at the hub height is the commonly used model. The plane of the rotating blades is the critical receiver location because there the distances to the blades are shortest. It is shown that such a location requires a certain condition to be met. The model is valid far away from the wind turbine as well.

  16. Overview of Maximum Power Point Tracking Techniques for Photovoltaic Energy Production Systems

    DEFF Research Database (Denmark)

    Koutroulis, Eftichios; Blaabjerg, Frede

    2015-01-01

    A substantial growth of the installed photovoltaic systems capacity has occurred around the world during the last decade, thus enhancing the availability of electric energy in an environmentally friendly way. The maximum power point tracking technique enables maximization of the energy production...... of photovoltaic sources during stochastically varying solar irradiation and ambient temperature conditions. Thus, the overall efficiency of the photovoltaic energy production system is increased. Numerous techniques have been presented during the last decade for implementing the maximum power point tracking...... process in a photovoltaic system. This article provides an overview of the operating principles of these techniques, which are suited for either uniform or non-uniform solar irradiation conditions. The operational characteristics and implementation requirements of these maximum power point tracking...

  17. Automated and continuously operating acid dew point measuring instrument for flue gases

    Energy Technology Data Exchange (ETDEWEB)

    Reckmann, D.; Naundorf, G.

    1986-06-01

Design and operation are explained for a sulfuric acid dew point indicator for continuous flue gas temperature control. The indicator operated successfully in trial tests over several years with brown coal, gas and oil combustion, in a measurement range of 60 to 180 C. The design is regarded as uncomplicated and easy to manufacture. Its operating principle is based on electric conductivity measurement on a surface on which sulfuric acid vapor has condensed. A ring electrode and a PtRh/Pt thermal element as central electrode are employed. A scheme of the equipment design is provided. The accuracy of the indicator was compared to manual dew point sondes manufactured by Degussa and showed a maximum deviation of 5 C. Manual cleaning after a number of weeks of operation is required. Fly ash with a high lime content increases dust buildup and requires more frequent cleaning cycles.

  18. The registration of non-cooperative moving targets laser point cloud in different view point

    Science.gov (United States)

    Wang, Shuai; Sun, Huayan; Guo, Huichao

    2018-01-01

Multi-view point cloud registration of non-cooperative moving targets is a key technology for 3D reconstruction in laser three-dimensional imaging. The main problem is that the point density changes greatly and noise is present under different acquisition conditions. In this paper, a feature descriptor is first used to find the most similar point cloud; then, in a registration algorithm based on region segmentation, the local geometric structure is extracted from the geometric similarity between points. The point cloud is divided into regions by spectral clustering, a feature descriptor is created for each region, the most similar regions are searched for in the most similar neighbouring-view point cloud, and the pair of point clouds is then aligned by aligning their minimum bounding boxes. These steps are repeated until all point clouds are registered. Experiments show that the method is insensitive to point cloud density and performs well in the presence of the noise typical of laser three-dimensional imaging.

  19. Generalized Mann Iterations for Approximating Fixed Points of a Family of Hemicontractions

    Directory of Open Access Journals (Sweden)

    Jin Liang

    2008-06-01

    Full Text Available This paper concerns common fixed points for a finite family of hemicontractions or a finite family of strict pseudocontractions on uniformly convex Banach spaces. By introducing a new iteration process with error term, we obtain sufficient and necessary conditions, as well as sufficient conditions, for the existence of a fixed point. As one will see, we derive these strong convergence theorems in uniformly convex Banach spaces and without any requirement of the compactness on the domain of the mapping. The results given in this paper extend some previous theorems.
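
    For background, the classical Mann iteration that such fixed-point schemes generalize can be written as follows (textbook forms only; the paper's specific iteration with error term for a finite family of mappings differs in detail):

    ```latex
    % Classical Mann iteration:
    x_{n+1} = (1 - \alpha_n)\, x_n + \alpha_n T x_n, \qquad \alpha_n \in [0, 1].
    % A common "with errors" variant, with a bounded error sequence u_n:
    x_{n+1} = \alpha_n x_n + \beta_n T x_n + \gamma_n u_n, \qquad \alpha_n + \beta_n + \gamma_n = 1.
    ```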

  20. A formalism for scattering of complex composite structures. II. Distributed reference points

    DEFF Research Database (Denmark)

    Svaneborg, Carsten; Pedersen, Jan Skov

    2012-01-01

Recently we developed a formalism for the scattering from linear and acyclic branched structures built of mutually non-interacting sub-units. [C. Svaneborg and J. S. Pedersen, J. Chem. Phys. 136, 104105 (2012)] We assumed each sub-unit has reference points associated with it. These are well defined...... positions where sub-units can be linked together. In the present paper, we generalize the formalism to the case where each reference point can represent a distribution of potential link positions. We also present a generalized diagrammatic representation of the formalism. Scattering expressions required...

  1. Three-dimensional point-cloud room model in room acoustics simulations

    DEFF Research Database (Denmark)

    Markovic, Milos; Olesen, Søren Krarup; Hammershøi, Dorte

    2013-01-01

    acquisition and its representation with a 3D point-cloud model, as well as utilization of such a model for the room acoustics simulations. A room is scanned with a commercially available input device (Kinect for Xbox360) in two different ways; the first one involves the device placed in the middle of the room...... and rotated around the vertical axis while for the second one the device is moved within the room. Benefits of both approaches were analyzed. The device's depth sensor provides a set of points in a three-dimensional coordinate system which represents scanned surfaces of the room interior. These data are used...... to build a 3D point-cloud model of the room. Several models are created to meet requirements of different room acoustics simulation algorithms: plane fitting and uniform voxel grid for geometric methods and triangulation mesh for the numerical methods. Advantages of the proposed method over the traditional...

  2. Three-dimensional point-cloud room model for room acoustics simulations

    DEFF Research Database (Denmark)

    Markovic, Milos; Olesen, Søren Krarup; Hammershøi, Dorte

    2013-01-01

    acquisition and its representation with a 3D point-cloud model, as well as utilization of such a model for the room acoustics simulations. A room is scanned with a commercially available input device (Kinect for Xbox360) in two different ways; the first one involves the device placed in the middle of the room...... and rotated around the vertical axis while for the second one the device is moved within the room. Benefits of both approaches were analyzed. The device's depth sensor provides a set of points in a three-dimensional coordinate system which represents scanned surfaces of the room interior. These data are used...... to build a 3D point-cloud model of the room. Several models are created to meet requirements of different room acoustics simulation algorithms: plane fitting and uniform voxel grid for geometric methods and triangulation mesh for the numerical methods. Advantages of the proposed method over the traditional...

  3. 47 CFR 15.319 - General technical requirements.

    Science.gov (United States)

    2010-10-01

    ...-equivalent voltage. The measurement results shall be properly adjusted for any instrument limitations, such... bandwidth, sensitivity, etc., so as to obtain a true peak measurement for the emission in question over the... point. (i) Unlicensed PCS devices are subject to the radiofrequency radiation exposure requirements...

  4. Recent Developments in Maximum Power Point Tracking Technologies for Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Nevzat Onat

    2010-01-01

Full Text Available In photovoltaic (PV) system applications, it is very important to design the system so that the solar cells (SCs) operate under the best conditions and at the highest efficiency. The maximum power point (MPP) varies depending on the angle of sunlight on the panel surface and on the cell temperature. Hence, the operating point of the load is not always the MPP of the PV system. Therefore, in order to supply reliable energy to the load, PV systems are designed to include more than the required number of modules. The solution to this problem is to use switching power converters, called maximum power point trackers (MPPT). In this study, the various aspects of these algorithms are analyzed in detail. Classifications, definitions, and basic equations of the most widely used MPPT technologies are given, and a comparison is made in the conclusion.
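
    As an illustration of one of the most widely used MPPT techniques, here is a minimal perturb-and-observe sketch (the PV curve and step size are invented for illustration; a real tracker acts on a converter duty cycle rather than directly on voltage):

    ```python
    def pv_power(v, v_oc=40.0, i_sc=8.0):
        """Toy PV curve: current falls off near the open-circuit voltage (illustrative only)."""
        if v <= 0 or v >= v_oc:
            return 0.0
        i = i_sc * (1.0 - (v / v_oc) ** 8)
        return v * i

    def perturb_and_observe(v=20.0, step=0.5, iterations=200):
        """Classic P&O: keep stepping in the direction that increased the measured power."""
        p_prev = pv_power(v)
        direction = +1.0
        for _ in range(iterations):
            v += direction * step
            p = pv_power(v)
            if p < p_prev:          # power dropped, so reverse the perturbation direction
                direction = -direction
            p_prev = p
        return v, p_prev

    v_mpp, p_mpp = perturb_and_observe()
    print(f"estimated MPP: V = {v_mpp:.1f} V, P = {p_mpp:.1f} W")
    ```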

  5. [Dancing with Pointe Shoes: Characteristics and Assessment Criteria for Pointe Readiness].

    Science.gov (United States)

    Wanke, Eileen M; Exner-Grave, Elisabeth

    2017-12-01

Training with pointe shoes is an integral part of professional dance education and of ambitious hobby dancing. Pointe shoes - developed more than a hundred years ago and almost unaltered since then - are highly specific and strike a balance between aesthetics, function, protection, and health care. Therefore, pointe readiness should be tested prior to any such dance training or career training. Medical specialists are often confronted with this issue. Pointe readiness tests must verify specific anatomical, dance-technique-oriented, general conditioning and coordination preconditions as well as dance-technical prerequisites in order to keep traumatic injuries and long-term damage to a minimum. In addition to a (training) history, medical counselling sessions have come to include various tests that enable a reliable decision for or against pointe work. This article suggests adequate testing procedures (STT TEST), taking account of professional dancing as well as hobby dancing. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Developing a Business Intelligence Process for a Training Module in SharePoint 2010

    Science.gov (United States)

    Schmidtchen, Bryce; Solano, Wanda M.; Albasini, Colby

    2015-01-01

Prior to this project, training information for the employees of the National Center for Critical Information Processing and Storage (NCCIPS) was stored in an array of unrelated spreadsheets and SharePoint lists that had to be updated manually. By developing a content management system on the SharePoint web application platform, the training system is now highly automated and provides a much less labor-intensive method of storing training data and scheduling training courses. The system was developed by using SharePoint Designer and laying out the data structure for the interaction between different lists of data about the employees. The automation of data population inside the lists was accomplished by implementing SharePoint workflows, which essentially lay out the logic for how data is connected and calculated between certain lists. The resulting training system is constructed from a combination of five lists of data, with a single list acting as the user-friendly interface. This interface is populated with the courses required for each employee and includes past and future information about course requirements. The employees of NCCIPS now have the ability to view, log, and schedule their training information and courses with much more ease. This system will relieve a significant amount of manual input and serve as a powerful informational resource for the employees of NCCIPS in the future.

  7. The OPALS Plan for Operations: Use of ISS Trajectory and Attitude Models in the OPALS Pointing Strategy

    Science.gov (United States)

    Abrahamson, Matthew J.; Oaida, Bogdan; Erkmen, Baris

    2013-01-01

    This paper will discuss the OPALS pointing strategy, focusing on incorporation of ISS trajectory and attitude models to build pointing predictions. Methods to extrapolate an ISS prediction based on past data will be discussed and will be compared to periodically published ISS predictions and Two-Line Element (TLE) predictions. The prediction performance will also be measured against GPS states available in telemetry. The performance of the pointing products will be compared to the allocated values in the OPALS pointing budget to assess compliance with requirements.
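
    The idea of extrapolating a trajectory prediction from past data can be sketched very simply (the position samples below are hypothetical stand-ins for GPS states in telemetry, and the low-order polynomial fit is only a crude illustration, not the OPALS prediction method):

    ```python
    import numpy as np

    # Hypothetical past ISS position samples (km, inertial frame) at known times (s).
    t_past = np.array([0.0, 60.0, 120.0, 180.0])
    r_past = np.array([[6778.0,    0.0,   0.0],
                       [6760.5,  450.9,  92.1],
                       [6708.3,  899.0, 183.6],
                       [6621.5, 1341.5, 274.0]])

    def extrapolate_position(t_future, t_past, r_past, order=2):
        """Fit a low-order polynomial to each coordinate of the past positions
        and evaluate it at a future time (a crude stand-in for a trajectory model)."""
        return np.array([np.polyval(np.polyfit(t_past, r_past[:, k], order), t_future)
                         for k in range(3)])

    r_pred = extrapolate_position(300.0, t_past, r_past)
    print(np.round(r_pred, 1))
    ```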

  8. Prediction of the Flash Point of Binary and Ternary Straight-Chain Alkane Mixtures

    Directory of Open Access Journals (Sweden)

    X. Li

    2014-01-01

Full Text Available The flash point is an important physical property used to estimate the fire hazard of a flammable liquid. To avoid the occurrence of fire or explosion, many models are used to predict the flash point; however, these models are complex, and the calculation process is cumbersome. For pure flammable substances, research on predicting the flash point is systematic and comprehensive. For multicomponent mixtures, especially hydrocarbon mixtures, current research on flash point prediction remains insufficient. In this study, a model was developed to predict the flash point of straight-chain alkane mixtures using a simple calculation process. The pressure, activity coefficients, and other associated physicochemical parameters are not required for the calculation in the proposed model. A series of flash points of binary and ternary mixtures of straight-chain alkanes were determined. The model's predictions are consistent with the experimental results, with an average absolute deviation of 0.7% or lower for the binary mixtures and 1.03% or lower for the ternary mixtures.

  9. The Human Rights Context for Ethical Requirements for Involving People with Intellectual Disability in Medical Research

    Science.gov (United States)

    Iacono, T.; Carling-Jenkins, R.

    2012-01-01

    Background: The history of ethical guidelines addresses protection of human rights in the face of violations. Examples of such violations in research involving people with intellectual disabilities (ID) abound. We explore this history in an effort to understand the apparently stringent criteria for the inclusion of people with ID in research, and…

  10. Comparison of Dose When Prescribed to Point A and Point H for Brachytherapy in Cervical Cancer

    Energy Technology Data Exchange (ETDEWEB)

Gang, Ji Hyeong; Gim, Il Hwan; Hwang, Seon Boong; Kim, Woong; Im, Hyeong Seo; Gang, Jin Mook; Gim, Gi Hwan; Lee, Ah Ram [Dept. of Radiation Oncology, Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2012-09-15

The purpose of this study is to compare plans prescribed to point A with those prescribed to point H, recommended by the ABS (American Brachytherapy Society), in high dose rate intracavitary brachytherapy for cervical carcinoma. This study selected 103 patients who received HDR (High Dose Rate) brachytherapy using tandem and ovoids from March 2010 to January 2012. Point A, the bladder point, and the rectal point conform to the Manchester System; point H conforms to the ABS recommendation. Sigmoid colon and vagina points were also established arbitrarily. We examined the distance between point A and point H. The percent dose at point A was calculated when a 100% dose was prescribed to point H. Additionally, the percent doses at each reference point when the dose was prescribed to point H and to point A were calculated. The relative dose at point A was lower when point H was located inferior to point A. The relative doses at the bladder, rectal, sigmoid colon, and vagina points were higher when point H was located superior to point A, and lower when point H was located inferior to point A. This study found that the farther point H was located superior to point A, the higher the absorbed dose to the surrounding normal organs, and the farther point H was located inferior to point A, the lower the absorbed dose. These differences do not seem to affect the treatment. However, we suggest that this new point is worth considering for HDR treatment if the dose distribution and the absorbed dose to normal organs differ substantially between prescriptions to point A and point H.

  11. Estimating Function Approaches for Spatial Point Processes

    Science.gov (United States)

    Deng, Chong

second-order intensity function of spatial point processes. However, the original second-order quasi-likelihood is barely feasible due to the intense computation and high memory requirement needed to solve a large linear system. Motivated by the existence of geometrically regular patterns in stationary point processes, we find a lower-dimensional representation of the optimal weight function and propose a reduced second-order quasi-likelihood approach. Through a simulation study, we show that the proposed method not only demonstrates superior performance in fitting the clustering parameter but also benefits from a relaxed constraint on the tuning parameter, H. Third, we studied the quasi-likelihood-type estimating function that is optimal within a certain class of first-order estimating functions for estimating the regression parameter in spatial point process models. Then, by using a novel spectral representation, we construct an implementation that is computationally much more efficient and can be applied to a more general setup than the original quasi-likelihood method.

  12. So much to do, so little time. To accomplish the mandatory initiatives of ARRA, healthcare organizations will require significant and thoughtful planning, prioritization and execution.

    Science.gov (United States)

    Klein, Kimberly

    2010-01-01

The American Recovery and Reinvestment Act of 2009 (ARRA) has set forth legislation for the healthcare community to achieve adoption of electronic health records (EHR), as well as to form data standards and health information exchanges (HIE) and to comply with more stringent security and privacy controls under the HITECH Act. While the Office of the National Coordinator for Health Information Technology (ONCHIT) works on the definition of both "meaningful use" and "certification" of information technology systems, providers in particular must move forward with their IT initiatives to achieve the basic requirements for Medicare and Medicaid incentives starting in 2011, and avoid penalties that will reduce reimbursement beginning in 2015. In addition, providers, payors, and government and non-government stakeholders will all have to balance the implementation of EHRs and work with HIEs at the same time that they upgrade their systems to comply with the ICD-10 and HIPAA 5010 code sets. Compliance deadlines for EHRs and HIEs begin in 2011, while compliance with ICD-10 diagnosis and procedure code sets is required by October 2013 and HIPAA 5010 transaction sets, with one exception, are required by January 1, 2012. In order to accomplish these strategic and mandatory initiatives successfully and simultaneously, healthcare organizations will require significant and thoughtful planning, prioritization and execution.

  13. PowerPoint 2010 Bible

    CERN Document Server

    Wempen, Faithe

    2010-01-01

Master PowerPoint and improve your presentation skills - with one book! It's no longer enough to have slide after slide of text, bullets, and charts. It's not even enough to have good speaking skills if your PowerPoint slides bore your audience. Get the very most out of all that PowerPoint 2010 has to offer while also learning priceless tips and techniques for making good presentations in this new PowerPoint 2010 Bible. Well-known PowerPoint expert and author Faithe Wempen provides formatting tips; shows you how to work with drawings, tables, and SmartArt; introduces new collaboration tools; wa

  14. Expected Number of Fixed Points in Boolean Networks with Arbitrary Topology.

    Science.gov (United States)

    Mori, Fumito; Mochizuki, Atsushi

    2017-07-14

    Boolean network models describe genetic, neural, and social dynamics in complex networks, where the dynamics depend generally on network topology. Fixed points in a genetic regulatory network are typically considered to correspond to cell types in an organism. We prove that the expected number of fixed points in a Boolean network, with Boolean functions drawn from probability distributions that are not required to be uniform or identical, is one, and is independent of network topology if only a feedback arc set satisfies a stochastic neutrality condition. We also demonstrate that the expected number is increased by the predominance of positive feedback in a cycle.
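
    The expectation-one statement can be checked numerically on small networks. The sketch below draws random Boolean networks (random topology, random and deliberately non-identical truth-table biases) and counts fixed points by exhaustive enumeration; the average comes out close to one. This is an illustration under simple independence assumptions, not the paper's proof:

    ```python
    import itertools
    import random

    def random_boolean_network(n, k=2):
        """Each node gets k random inputs and a random truth table with its own random bias."""
        inputs = [random.sample(range(n), k) for _ in range(n)]
        tables = []
        for _ in range(n):
            p = random.random()                     # per-node bias: not uniform, not identical
            tables.append([1 if random.random() < p else 0 for _ in range(2 ** k)])
        return inputs, tables

    def count_fixed_points(n, inputs, tables):
        """Exhaustively check every state of the n-node network for x = F(x)."""
        count = 0
        for state in itertools.product([0, 1], repeat=n):
            nxt = tuple(tables[i][int("".join(str(state[j]) for j in inputs[i]), 2)]
                        for i in range(n))
            if nxt == state:
                count += 1
        return count

    random.seed(1)
    n = 8
    samples = [count_fixed_points(n, *random_boolean_network(n)) for _ in range(200)]
    print(sum(samples) / len(samples))   # empirically close to one on average
    ```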

  15. Energy policies avoiding a tipping point in the climate system

    International Nuclear Information System (INIS)

    Bahn, Olivier; Edwards, Neil R.; Knutti, Reto; Stocker, Thomas F.

    2011-01-01

    Paleoclimate evidence and climate models indicate that certain elements of the climate system may exhibit thresholds, with small changes in greenhouse gas emissions resulting in non-linear and potentially irreversible regime shifts with serious consequences for socio-economic systems. Such thresholds or tipping points in the climate system are likely to depend on both the magnitude and rate of change of surface warming. The collapse of the Atlantic thermohaline circulation (THC) is one example of such a threshold. To evaluate mitigation policies that curb greenhouse gas emissions to levels that prevent such a climate threshold being reached, we use the MERGE model of Manne, Mendelsohn and Richels. Depending on assumptions on climate sensitivity and technological progress, our analysis shows that preserving the THC may require a fast and strong greenhouse gas emission reduction from today's level, with transition to nuclear and/or renewable energy, possibly combined with the use of carbon capture and sequestration systems. - Research Highlights: → Preserving the THC may require a fast and strong greenhouse gas emission reduction. → This could be achieved through strong changes in the energy mix. → Similar results would apply to any climate system tipping points.

  16. Critical points for finite Fibonacci chains of point delta-interactions and orthogonal polynomials

    International Nuclear Information System (INIS)

    De Prunele, E

    2011-01-01

    For a one-dimensional Schroedinger operator with a finite number n of point delta-interactions with a common intensity, the parameters are the intensity, the n - 1 intercenter distances and the mass. Critical points are points in the parameters space of the Hamiltonian where one bound state appears or disappears. The study of critical points for Hamiltonians with point delta-interactions arranged along a Fibonacci chain is shown to be closely related to the study of the so-called Fibonacci operator, a discrete one-dimensional Schroedinger-type operator, which occurs in the context of tight binding Hamiltonians. These critical points are the zeros of orthogonal polynomials previously studied in the context of special diatomic linear chains with elastic nearest-neighbor interaction. Properties of the zeros (location, asymptotic behavior, gaps, ...) are investigated. The perturbation series from the solvable periodic case is determined. The measure which yields orthogonality is investigated numerically from the zeros. It is shown that the transmission coefficient at zero energy can be expressed in terms of the orthogonal polynomials and their associated polynomials. In particular, it is shown that when the number of point delta-interactions is equal to a Fibonacci number minus 1, i.e. when the intervals between point delta-interactions form a palindrome, all the Fibonacci chains at critical points are completely transparent at zero energy. (paper)

  17. Poisson branching point processes

    International Nuclear Information System (INIS)

    Matsuo, K.; Teich, M.C.; Saleh, B.E.A.

    1984-01-01

    We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule--Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers
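
    A small simulation sketch of a multi-stage Poisson cascade in the spirit of the construction above (the rates, the exponential delay kernel and the number of stages are arbitrary choices; this is a loose illustration, not an exact implementation of the authors' model):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def simulate_cascade(rate0=5.0, t_max=10.0, n_stages=3, mean_children=0.8, delay_scale=0.5):
        """Stage 0 is a homogeneous Poisson process; each event of a stage independently
        spawns a Poisson number of delayed events that form the next stage's initiators."""
        events = list(rng.uniform(0.0, t_max, rng.poisson(rate0 * t_max)))  # initial HPP
        all_events = list(events)
        for _ in range(n_stages):
            children = []
            for t in events:
                n_child = rng.poisson(mean_children)
                # exponential delays stand in for the nonstationary rate function
                children.extend(t + rng.exponential(delay_scale, n_child))
            children = [t for t in children if t <= t_max]
            all_events.extend(children)
            events = children          # carry this stage's new events forward
        return np.sort(all_events)

    print(len(simulate_cascade()))
    ```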

  18. Estimating Aircraft Heading Based on Laserscanner Derived Point Clouds

    Science.gov (United States)

    Koppanyi, Z.; Toth, C., K.

    2015-03-01

Using LiDAR sensors for tracking and monitoring an operating aircraft is a new application. In this paper, we present data processing methods to estimate the heading of a taxiing aircraft using laser point clouds. During the data acquisition, a Velodyne HDL-32E laser scanner tracked a moving Cessna 172 airplane. The point clouds captured at different times were used for heading estimation. After addressing the problem and specifying the equation of motion to reconstruct the aircraft point cloud from the consecutive scans, three methods are investigated here. The first requires a reference model to estimate the relative angle from the captured data by fitting different cross-sections (horizontal profiles). In the second approach, the iterative closest point (ICP) method is used between the consecutive point clouds to determine the horizontal translation of the captured aircraft body. Regarding the ICP, three different versions were compared, namely, the ordinary 3D, 3-DoF 3D and 2-DoF 3D ICP. It was found that the 2-DoF 3D ICP provides the best performance. Finally, the last algorithm searches for the unknown heading and velocity parameters by minimizing the volume of the reconstructed plane. The three methods were compared using three test data types which are distinguished by object-sensor distance, heading and velocity. We found that the ICP algorithm fails at long distances and when the aircraft motion direction is perpendicular to the scan plane, but the first and the third methods give robust and accurate results at a 40 m object distance and at ~12 knots for a small Cessna airplane.
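
    A minimal translation-only ICP sketch in the spirit of the constrained variants compared above (synthetic data; a real implementation would add outlier rejection, convergence checks and the rotational degrees of freedom where needed):

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def icp_translation(source, target, iterations=30):
        """Estimate the horizontal translation aligning source to target by iterating
        nearest-neighbour matching and mean-residual updates (no rotation estimated)."""
        t = np.zeros(source.shape[1])
        tree = cKDTree(target)
        for _ in range(iterations):
            _, idx = tree.query(source + t)
            delta = (target[idx] - (source + t)).mean(axis=0)
            t[:2] += delta[:2]        # 2-DoF style: update only the horizontal components
        return t

    rng = np.random.default_rng(0)
    target = rng.uniform(0, 10, size=(1000, 3))          # synthetic scan of the object
    true_shift = np.array([0.3, -0.2, 0.0])              # small planar motion between scans
    source = target - true_shift + 0.01 * rng.standard_normal((1000, 3))
    print(np.round(icp_translation(source, target), 2))  # should be close to true_shift
    ```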

  19. Laser Dew-Point Hygrometer

    Science.gov (United States)

    Matsumoto, Shigeaki; Toyooka, Satoru

    1995-01-01

A rough-surface-type automatic dew-point hygrometer was developed using a laser diode and an optical fiber cable. A gold plate with 0.8 µm average surface roughness was used as the surface for dew deposition, to facilitate dew deposition and prevent supersaturation of water vapor at the dew point. It was shown experimentally that the quantity of dew deposited can be controlled to be constant at any predetermined level, independent of the dew point to be measured. The dew points were measured in the range from -15 °C to 54 °C, while the temperature ranged from 0 °C to 60 °C. The measurement error of the dew point was ±0.5 °C, which corresponds to less than ±2% relative humidity over the above dew-point range.

  20. The Motion of Point Particles in Curved Spacetime

    Directory of Open Access Journals (Sweden)

    Eric Poisson

    2011-09-01

Full Text Available This review is concerned with the motion of a point scalar charge, a point electric charge, and a point mass in a specified background spacetime. In each of the three cases the particle produces a field that behaves as outgoing radiation in the wave zone, and therefore removes energy from the particle. In the near zone the field acts on the particle and gives rise to a self-force that prevents the particle from moving on a geodesic of the background spacetime. The self-force contains both conservative and dissipative terms, and the latter are responsible for the radiation reaction. The work done by the self-force matches the energy radiated away by the particle. The field's action on the particle is difficult to calculate because of its singular nature: the field diverges at the position of the particle. But it is possible to isolate the field's singular part and show that it exerts no force on the particle -- its only effect is to contribute to the particle's inertia. What remains after subtraction is a regular field that is fully responsible for the self-force. Because this field satisfies a homogeneous wave equation, it can be thought of as a free field that interacts with the particle; it is this interaction that gives rise to the self-force. The mathematical tools required to derive the equations of motion of a point scalar charge, a point electric charge, and a point mass in a specified background spacetime are developed here from scratch. The review begins with a discussion of the basic theory of bitensors (Part I). It then applies the theory to the construction of convenient coordinate systems to chart a neighbourhood of the particle's world line (Part II). It continues with a thorough discussion of Green's functions in curved spacetime (Part III). The review presents a detailed derivation of each of the three equations of motion (Part IV). Because the notion of a point mass is problematic in general relativity, the review concludes (Part V

  1. Design of a six-component side-wall balance using optical fibre sensors

    CSIR Research Space (South Africa)

    Pieterse, FF

    2017-01-01

Full Text Available The requirements that a wind tunnel balance needs to satisfy have become increasingly stringent. These requirements, as set out by the wind tunnel testing community, include: improved static force accuracy and resolution, increased stiffness...

  2. Environmental Life Cycle Assessment and Cost Analysis of Bath, NY Wastewater Treatment Plant: Potential Upgrade Implications

    Science.gov (United States)

    Many communities across the U.S. are required to upgrade wastewater treatment plants (WWTP) to meet increasingly stringent nutrient effluent standards. However, increased capital, energy and chemical requirements of upgrades create potential trade-offs between eutrophication pot...

  3. Surrogate end points in clinical research: hazardous to your health.

    Science.gov (United States)

    Grimes, David A; Schulz, Kenneth F

    2005-05-01

    Surrogate end points in clinical research pose real danger. A surrogate end point is an outcome measure, commonly a laboratory test, that substitutes for a clinical event of true importance. Resistance to activated protein C, for example, has been used as a surrogate for venous thrombosis in women using oral contraceptives. Other examples of inappropriate surrogate end points in contraception include the postcoital test instead of pregnancy to evaluate new spermicides, breakage and slippage instead of pregnancy to evaluate condoms, and bone mineral density instead of fracture to assess the safety of depo-medroxyprogesterone acetate. None of these markers captures the effect of the treatment on the true outcome. A valid surrogate end point must both correlate with and accurately predict the outcome of interest. Although many surrogate markers correlate with an outcome, few have been shown to capture the effect of a treatment (for example, oral contraceptives) on the outcome (venous thrombosis). As a result, thousands of useless and misleading reports on surrogate end points litter the medical literature. New drugs have been shown to benefit a surrogate marker, but, paradoxically, triple the risk of death. Thousands of patients have died needlessly because of reliance on invalid surrogate markers. Researchers should avoid surrogate end points unless they have been validated; that requires at least one well done trial using both the surrogate and true outcome. The clinical maxim that "a difference to be a difference must make a difference" applies to research as well. Clinical research should focus on outcomes that matter.

  4. CD36 is required for myoblast fusion during myogenic differentiation

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seung-Yoon [Department of Biochemistry, College of Medicine, Dongguk University and Medical Institute of Dongguk University, Gyeongju 780-714 (Korea, Republic of); Yun, Youngeun [Department of Biochemistry and Cell Biology, Cell and Matrix Research Institute, School of Medicine, Kyungpook National University, Daegu 700-422 (Korea, Republic of); Kim, In-San, E-mail: iskim@knu.ac.kr [Department of Biochemistry and Cell Biology, Cell and Matrix Research Institute, School of Medicine, Kyungpook National University, Daegu 700-422 (Korea, Republic of); Biomedical Research Institute, Korea Institute Science and Technology, Seoul (Korea, Republic of)

    2012-11-02

Highlights: ► CD36 expression was induced during myogenic differentiation. ► CD36 expression was localized in multinucleated myotubes. ► The expression of myogenic markers is attenuated in CD36 knockdown C2C12 cells. ► Knockdown of CD36 significantly inhibited myotube formation during differentiation. -- Abstract: Recently, CD36 has been found to be involved in the cytokine-induced fusion of macrophages. Myoblast fusion to form multinucleated myotubes is required for myogenesis and muscle regeneration. Because a search of gene expression databases revealed the attenuation of CD36 expression in the muscles of muscular dystrophy patients, the possibility that CD36 could be required for myoblast fusion was investigated. CD36 expression was markedly up-regulated during myoblast differentiation and localized in multinucleated myotubes. Knockdown of endogenous CD36 significantly decreased the expression of myogenic markers as well as myotube formation. These results support the notion that CD36 plays an important role in cell fusion during myogenic differentiation. Our finding will aid the elucidation of the common mechanism governing cell-to-cell fusion in various fusion models.

  5. A Discussion on the Concept of Water Resources from the Perspective of the Most Stringent Water Management System%最严格水资源管理制度视野下水资源概念探讨

    Institute of Scientific and Technical Information of China (English)

    胡德胜

    2015-01-01

In order to solve the increasingly serious water problems that constitute a major bottleneck in China's sustainable economic and social development, China has decided to establish and implement the most stringent water management system. In order to refine and implement this system, it is necessary to clarify the basic term "water resources", with its definition and extension, according to legal theory. By comparing and analyzing different understandings and/or legal definitions of "water resources" at home and abroad, from different disciplines, and from different legal systems in different countries, and taking the perspective of the most stringent water management system, the scope of "water resources" should be delimited by comprehensively considering the substance of water itself as well as its relevant roles and functions. Therefore: (i) in terms of quantity, "water resources" should in principle be limited to freshwater, only surface water and groundwater should be counted, and atmospheric water, soil water and green water should be excluded; (ii) the management of water quality and self-purification capacity should be emphasized; (iii) the objects of the most stringent water management system should include hydraulic power resources, water transport resources and the carriers of water resources, such as rivers/waterways, lakes, wetlands, reservoirs/dams, ponds, springs, wells, glaciers, etc.

  6. Pediatric Lung Abscess: Immediate Diagnosis by Point-of-Care Ultrasound.

    Science.gov (United States)

    Kraft, Clara; Lasure, Benjamin; Sharon, Melinda; Patel, Paulina; Minardi, Joseph

    2018-06-01

    The diagnosis of lung abscess can be difficult to make and often requires imaging beyond plain chest x-ray. The decision to further image with computed tomography should be weighed against the risks of radiation exposure, especially in pediatric patients. In addition, the cost and potential impact on length of stay from obtaining computed tomography scans should be considered. In this report, we describe a case of lung abscess made immediately using point-of-care ultrasound in the emergency department. To our knowledge, there are no previous cases describing lung abscess diagnosed by point-of-care ultrasound. This case report aims to describe a case of pediatric lung abscess, review the ultrasound findings, and discuss relevant literature on the topic.

  7. Learning power point 2000 easily

    Energy Technology Data Exchange (ETDEWEB)

    Mon, In Su; Je, Jung Suk

    2000-05-15

This book introduces PowerPoint 2000. It describes what PowerPoint is, what can be done with PowerPoint 2000, and whether PowerPoint 2000 can be installed on the reader's computer. It then covers running PowerPoint and PowerPoint basics such as creating a new presentation, writing text, using text boxes, changing font size, color and shape, becoming a power user, inserting WordArt, and creating a new file. It also deals with figures, charts, graphs, making multimedia files, giving presentations, and PowerPoint know-how for teachers and company workers.

  8. CONTOURS BASED APPROACH FOR THERMAL IMAGE AND TERRESTRIAL POINT CLOUD REGISTRATION

    Directory of Open Access Journals (Sweden)

    A. Bennis

    2013-07-01

Full Text Available Building energy performance strongly depends on thermal insulation. However, the performance of insulation materials tends to decrease over time, which necessitates continuous monitoring of the building in order to detect and repair anomalous zones. In this paper, it is proposed to couple 2D infrared images representing the surface temperature of the building with 3D point clouds acquired with a Terrestrial Laser Scanner (TLS), resulting in a semi-automatic approach for texturing TLS data with infrared images of buildings. A contour-based algorithm is proposed whose main features are: 1) the extraction of high-level primitives is not required; 2) the use of a projective transform allows perspective effects to be handled; 3) a point matching refinement procedure copes with approximate control point selection. The procedure is applied to test modules aiming at investigating the thermal properties of materials.
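
    The projective transform mentioned in point 2) can be estimated from a handful of 2D control point pairs with a standard direct linear transform, sketched below (the correspondences are made up; a production pipeline would normalize coordinates and add RANSAC-style outlier handling):

    ```python
    import numpy as np

    def estimate_homography(src, dst):
        """Direct linear transform: solve for the 3x3 homography H with dst ~ H @ src."""
        rows = []
        for (x, y), (u, v) in zip(src, dst):
            rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        _, _, vt = np.linalg.svd(np.array(rows))
        h = vt[-1]
        return (h / h[-1]).reshape(3, 3)

    def apply_homography(h, pts):
        pts_h = np.hstack([pts, np.ones((len(pts), 1))]) @ h.T
        return pts_h[:, :2] / pts_h[:, 2:3]

    # Made-up control points: image corners mapped to a skewed quadrilateral on the facade plane.
    src = np.array([[0, 0], [640, 0], [640, 480], [0, 480]], float)
    dst = np.array([[12, 8], [610, 35], [590, 470], [25, 455]], float)
    H = estimate_homography(src, dst)
    print(np.round(apply_homography(H, src) - dst, 3))   # residuals should be ~0
    ```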

  9. Quantifying regional cerebral blood flow with N-isopropyl-p-[123I]iodoamphetamine and SPECT by one-point sampling method

    International Nuclear Information System (INIS)

    Odano, Ikuo; Takahashi, Naoya; Noguchi, Eikichi; Ohtaki, Hiro; Hatano, Masayoshi; Yamazaki, Yoshihiro; Higuchi, Takeshi; Ohkubo, Masaki.

    1994-01-01

We developed a new non-invasive technique, the one-point sampling method, for quantitative measurement of regional cerebral blood flow (rCBF) with N-isopropyl-p-[123I]iodoamphetamine and SPECT. Although continuous withdrawal of arterial blood and octanol treatment of the blood are required in the conventional microsphere method, the new technique does not require these two procedures. The total activity of 123I-IMP obtained by continuous withdrawal of arterial blood is inferred from the activity of 123I-IMP in a single arterial sample using a regression line. To determine the optimum one-point sampling time for inferring the integral input function of the continuous withdrawal, and whether octanol treatment of the sampled blood was required, we examined the correlation between the total activity of arterial blood withdrawn from 0 to 5 min after the injection and the activity of a one-point sample obtained at time t, and calculated a regression line. As a result, the minimum % error of the inference using the regression line was obtained at 6 min after the 123I-IMP injection; moreover, octanol treatment was not required. Examining the effect on rCBF values when the sampling time deviated from 6 min, we could correct the values to within approximately 3% error when the sample was obtained at 6±1 min after the injection. The one-point sampling method provides accurate and relatively non-invasive measurement of rCBF without octanol extraction of arterial blood. (author)

  10. Potential impact of environmental requirements on petroleum products derived from synthetic crude

    International Nuclear Information System (INIS)

    1997-01-01

    Fuel quality proposals regarding gasoline and diesel fuels were discussed. Strict regulations on air emissions will mean changes in transportation fuel specifications which will ultimately impact on the refining industry. As fuel quality requirements become more stringent, refiners will need to look more closely at increasing the use of Canadian synthetic crude as a refinery feed. The fuel quality specifications with the potentially highest impact for the continued use of synthetic crude are those relating to sulphur, aromatics (including benzene), and olefins in gasoline and sulphur, aromatics and cetane in diesel fuel. Synthetic crude has an advantage in terms of gasoline sulphur content. The FCC feed is at a low enough sulphur level to result in gasoline components that would allow refiners to meet final gasoline sulphur levels of less than 100 ppm. In either case, synthetic middle distillate must be upgraded. Options that face the synthetic crude and refining industries are: (1) synthetic crude producers may install the process equipment needed to upgrade the middle distillate portion of their synthetic crude stream, (2) refiners may install equipment to upgrade just the diesel fuel portion of the middle distillate pool and jet fuel, and (3) a joint effort may be made by the two industries. The National Centre for Upgrading Technology (NCUT) and the Western Research Centre of Natural Resources Canada will continue to assist with research into improved catalysts for hydrotreating of middle distillates, and new lower cost processes for upgrading middle distillates from synthetic and conventional crude oils to meet future product requirements. 5 refs., 1 tab

  11. From Contrapuntal Music to Polyphonic Novel: Aldous Huxley’s Point Counter Point

    Directory of Open Access Journals (Sweden)

    Mevlüde ZENGİN

    2015-06-01

Full Text Available Taken at face value, Point Counter Point (1928), written by Aldous Huxley, seems to be a novel including many stories of various and sundry people and reflecting their points of view about the world in which they live and about the life they have been leading. However, it is this very quality of the novel that provides grounds for the study of the novel as a polyphonic one. The novel presents to its reader an aggregate of strikingly different characters and thus a broad spectrum of contemporary society. The characters in the novel are all characterized by and individualized with easily recognizable physical, intellectual, emotional, psychological and moral qualities. Each of them is well-contrived through their differences in social status, political views, wealth, etc. Thus, many different viewpoints, conflicting voices, contrasting insights and ideas are heard and seen synchronically in Point Counter Point, which makes it polyphonic. Polyphony is a musical motif referring to different notes and chords played at the same time to create a rhythm. It was first adopted by M. M. Bakhtin to analyze F. M. Dostoyevsky's fiction. The aim of this study is firstly to elucidate, in Bakhtinian thought, polyphony and then dialogism and heteroglossia closely related to his concept of polyphony; and then to put the polyphonic qualities in Point Counter Point forth, studying the novel's dialogism and heteroglot qualities

  12. Design requirements for the new reactor

    International Nuclear Information System (INIS)

    Koski, S.

    2005-01-01

This presentation deals with the safety related design requirements specified for the new nuclear power plant to be built in Finland (FINS). The legislation, codes and standards on which the design requirements are based can be arranged into a hierarchical pyramid, and the safety related design criteria are based on its three uppermost levels: (1) Finnish legislation (e.g. decisions of the State Council); (2) basic regulations (75-INSAG-3, USNRC General Design Criteria); (3) process oriented nuclear documents (YVL guides or corresponding US/German rules). The European Utility Requirements (EUR) document was used as the starting point for the writing of the design requirements document. The structure and headlines of EUR could be kept, but in many cases the contents had to be deleted and rewritten to correspond to the requirement level of the above codes and standards. This was the case, for example, with the requirements concerning safety classification or application of failure criteria. In the presentation, the most important safety related design criteria are reviewed, with an emphasis on those requirements which exceed the requirement level applied to the existing plant units. Some hints are also given on the main differences between Finnish and international safety requirements. (orig.)

  13. One-dimensional gravity in infinite point distributions

    Science.gov (United States)

    Gabrielli, A.; Joyce, M.; Sicard, F.

    2009-10-01

    The dynamics of infinite asymptotically uniform distributions of purely self-gravitating particles in one spatial dimension provides a simple and interesting toy model for the analogous three dimensional problem treated in cosmology. In this paper we focus on a limitation of such models as they have been treated so far in the literature: the force, as it has been specified, is well defined in infinite point distributions only if there is a centre of symmetry (i.e., the definition requires explicitly the breaking of statistical translational invariance). The problem arises because naive background subtraction (due to expansion, or by “Jeans swindle” for the static case), applied as in three dimensions, leaves an unregulated contribution to the force due to surface mass fluctuations. Following a discussion by Kiessling of the Jeans swindle in three dimensions, we show that the problem may be resolved by defining the force in infinite point distributions as the limit of an exponentially screened pair interaction. We show explicitly that this prescription gives a well defined (finite) force acting on particles in a class of perturbed infinite lattices, which are the point processes relevant to cosmological N -body simulations. For identical particles the dynamics of the simplest toy model (without expansion) is equivalent to that of an infinite set of points with inverted harmonic oscillator potentials which bounce elastically when they collide. We discuss and compare with previous results in the literature and present new results for the specific case of this simplest (static) model starting from “shuffled lattice” initial conditions. These show qualitative properties of the evolution (notably its “self-similarity”) like those in the analogous simulations in three dimensions, which in turn resemble those in the expanding universe.
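
    The screened-force prescription can be illustrated directly: in one dimension the pair force has constant magnitude, so the regularized force on a particle from a finite sample is a sum of signed exponentials. The sketch below uses an arbitrary coupling and screening length and a small shuffled lattice; it is an illustration of the prescription, not a reproduction of the paper's simulations:

    ```python
    import numpy as np

    def screened_force(x, g=1.0, mu=1e-3):
        """Force on each particle from all others, screened by exp(-mu*|dx|).
        In 1D the unscreened pair force has constant magnitude g, directed toward the other particle."""
        dx = x[None, :] - x[:, None]                  # dx[i, j] = x_j - x_i
        f = g * np.sign(dx) * np.exp(-mu * np.abs(dx))
        np.fill_diagonal(f, 0.0)
        return f.sum(axis=1)

    # A "shuffled lattice": unit-spaced points with small random displacements.
    rng = np.random.default_rng(3)
    x = np.arange(-500, 501) + 0.1 * rng.standard_normal(1001)
    forces = screened_force(x)
    print(np.round(forces[498:503], 3))               # forces on particles near the centre
    ```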

  14. Perceptions of point-of-care infectious disease testing among European medical personnel, point-of-care test kit manufacturers, and the general public

    Directory of Open Access Journals (Sweden)

    Kaman WE

    2013-06-01

    Full Text Available Wendy E Kaman,1 Eleni-Rosalina Andrinopoulou,2 John P Hays1 1Department of Medical Microbiology and Infectious Diseases, Erasmus Medical Center, Rotterdam, The Netherlands; 2Department of Biostatistics, Erasmus Medical Center, Rotterdam, The Netherlands. Background: The proper development and implementation of point-of-care (POC) diagnostics requires knowledge of the perceived requirements and barriers to their implementation. To determine the current requirements and perceived barriers to the introduction of POC diagnostics in the field of medical microbiology (MM-POC), a prospective online survey (TEMPOtest-QC) was established. Methods and results: The TEMPOtest-QC survey was online between February 2011 and July 2012 and targeted the medical community, POC test diagnostic manufacturers, general practitioners, and the general public. In total, 293 individuals responded to the survey, including 91 (31%) medical microbiologists, 39 (13%) nonmedical microbiologists, 25 (9%) employees of POC test manufacturers, and 138 (47%) members of the general public. Responses were received from 18 different European countries, with the largest percentage of these living in The Netherlands (52%). The majority (>50%) of medical specialists regarded the development of MM-POC for blood culture and hospital acquired infections as “absolutely necessary”, but were much less favorable towards their use in the home environment. Significant differences in perceptions between medical specialists and the general public included the: (1) Effect on quality of patient care; (2) Ability to better monitor patients; (3) Home testing and the doctor-patient relationship; and (4) MM-POC interpretation. Only 34.7% of the general public is willing to pay more than €10 ($13) for a single MM-POC test, with 85.5% preferring to purchase their MM-POC test from a pharmacy. Conclusion: The requirements for the proper implementation of MM-POC were found to be generally similar between medical

  15. Flow area optimization in point to area or area to point flows

    International Nuclear Information System (INIS)

    Ghodoossi, Lotfollah; Egrican, Niluefer

    2003-01-01

    This paper deals with the constructal theory of generation of shape and structure in flow systems connecting one point to a finite size area. The flow direction may be either from the point to the area or the area to the point. The formulation of the problem remains the same if the flow direction is reversed. Two models are used in optimization of the point to area or area to point flow problem: cost minimization and revenue maximization. The cost minimization model enables one to predict the shape of the optimized flow areas, but the geometric sizes of the flow areas are not predictable. That is, as an example, if the area of flow is a rectangle with a fixed area size, optimization of the point to area or area to point flow problem by using the cost minimization model will only predict the height/length ratio of the rectangle not the height and length itself. By using the revenue maximization model in optimization of the flow problems, all optimized geometric aspects of the interested flow areas will be derived as well. The aim of this paper is to optimize the point to area or area to point flow problems in various elemental flow area shapes and various structures of the flow system (various combinations of elemental flow areas) by using the revenue maximization model. The elemental flow area shapes used in this paper are either rectangular or triangular. The forms of the flow area structure, made up of an assembly of optimized elemental flow areas to obtain bigger flow areas, are rectangle-in-rectangle, rectangle-in-triangle, triangle-in-triangle and triangle-in-rectangle. The global maximum revenue, revenue collected per unit flow area and the shape and sizes of each flow area structure have been derived in optimized conditions. The results for each flow area structure have been compared with the results of the other structures to determine the structure that provides better performance. The conclusion is that the rectangle-in-triangle flow area structure

  16. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and
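
    The first step described above (identifying coplanar surfaces via K-Nearest Neighbor and Principal Component Analysis) can be sketched in a few lines. The Python fragment below is only a schematic reading of that step, not the authors' Matlab tool: it estimates a facet normal per point by PCA on its k nearest neighbours. The neighbourhood size k and the synthetic two-plane cloud are assumptions chosen for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_normals(points, k=20):
    """Estimate a unit normal per point by PCA over its k nearest
    neighbours: the singular vector with the smallest singular value of
    the centred neighbourhood approximates the local facet normal."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        q = points[nbrs] - points[nbrs].mean(axis=0)
        _, _, vt = np.linalg.svd(q, full_matrices=False)
        normals[i] = vt[-1]                  # smallest-variance direction
    return normals

# Toy cloud: two noisy planar patches standing in for discontinuity faces.
rng = np.random.default_rng(1)
a = np.c_[rng.uniform(0, 5, 500), rng.uniform(0, 5, 500), 0.02 * rng.standard_normal(500)]
b = np.c_[rng.uniform(0, 5, 500), 0.02 * rng.standard_normal(500), rng.uniform(0, 5, 500)]
n = local_normals(np.vstack([a, b]))
print(np.round(np.abs(n[0]), 2), np.round(np.abs(n[-1]), 2))   # ~[0 0 1] and ~[0 1 0]
```

    In the actual tool the resulting normals are further clustered (e.g. with kernel density estimation) to obtain discontinuity sets; that stage is not reproduced here.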

  17. Man as a protective barrier in nuclear power technology: the requirements, viewed by the Federal Minister for Home Affairs

    International Nuclear Information System (INIS)

    Fechner, J.B.

    1981-06-01

    Evaluation of nuclear power plant incidents frequently reveals man as a major element of risk. Yet, in a nuclear power plant man has the function of an important protective barrier, either by maintaining the plant, by detecting and limiting faults or incidents, or by taking proper measures in accidents. This is true despite, or perhaps because of, the high degree of plant automation. For this reason, it is indispensable that a high level of engineered plant safeguards be accompanied by a minimum of faults contributed by human action. This implies that the staff and their working conditions must meet the same stringent safety requirements as the nuclear power plant proper. Reactor manufacturers, nuclear power plant operators and the responsible authorities try to optimize this human contribution. The Federal Ministry of the Interior, through its Special Technical Guidelines and its continuation training measures, occupies an important position in this respect. Further measures and ordinances are being prepared by that Ministry

  18. Fermat Point for a Triangle in Three Dimensions Using the Taxicab Metric

    Science.gov (United States)

    Hanson, J. R.

    2017-01-01

    This article explores the process of finding the Fermat point for a triangle ABC in three dimensions. Three examples are presented in detail using geometrical methods. A delightfully simple general method is then presented that requires only the comparison of coordinates of the vertices A, B and C.
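
    A plausible reading of the "comparison of coordinates" method is that, because the taxicab (L1) distance sum separates coordinate by coordinate, the point minimising the total distance to the three vertices is simply their coordinate-wise median. The sketch below illustrates this property; it is a hedged interpretation rather than a transcription of the article's procedure.

```python
import numpy as np

def taxicab_fermat_point(A, B, C):
    """Point minimising the sum of taxicab (L1) distances to A, B and C.
    The L1 sum separates coordinate by coordinate, and the scalar median
    minimises each term, so the coordinate-wise median is a minimiser."""
    return np.median(np.array([A, B, C], dtype=float), axis=0)

A, B, C = (0, 0, 0), (4, 1, 7), (2, 6, 3)
P = taxicab_fermat_point(A, B, C)
total = sum(np.abs(P - np.array(v)).sum() for v in (A, B, C))
print(P, total)   # [2. 1. 3.] with a total taxicab distance of 17.0
```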

  19. Development of sensor augmented robotic weld systems for aerospace propulsion system fabrication

    Science.gov (United States)

    Jones, C. S.; Gangl, K. J.

    1986-01-01

    In order to meet stringent performance goals for power and reusability, the Space Shuttle Main Engine was designed with many complex, difficult welded joints that provide maximum strength and minimum weight. To this end, the SSME requires 370 meters of welded joints. Automation of some welds has improved welding productivity significantly over manual welding. Application has previously been limited by accessibility constraints, requirements for complex process control, low production volumes, high part variability, and stringent quality requirements. Development of robots for welding in this application requires that a unique set of constraints be addressed. This paper shows how robotic welding can enhance production of aerospace components by addressing their specific requirements. A development program at the Marshall Space Flight Center combining industrial robots with state-of-the-art sensor systems and computer simulation is providing technology for the automation of welds in Space Shuttle Main Engine production.

  20. Point Lepreau refurbishment: plant condition assessment

    International Nuclear Information System (INIS)

    Allen, P.J.; Soulard, M.R.; David, F.; Clefton, G.; Weeks, R.

    2001-01-01

    New Brunswick Power (NB Power) has initiated a study into the refurbishment of the Point Lepreau Generating Station, with the objective to extend plant operation another 25 to 30 years. The end product of this study will be a business case that compares the costs of refurbishing Point Lepreau with costs of alternate means of generation. The Project Execution Plan and business case are being developed by an integrated team of AECL, NB Power and subcontractor staff under the project management of AECL. The refurbishment scope will include replacement of the pressure tubes, calandria tubes and part of the feeder piping. Planning of these replacements is part of the refurbishment study work. Planning is also underway for the environmental, safety and licensing issues that would need to be addressed to ensure future operation of the unit. In addition to these studies, a systematic review of the plant has been carried out to determine what other equipment refurbishment or replacement will be required due to ageing or obsolescence of plant equipment. This Plant Condition Assessment (PCA) follows a highly structured approach to ensure consistency. This paper presents an overview of the engineering process and the main findings from the work. (author)

  1. Professional SharePoint 2010 Development

    CERN Document Server

    Rizzo, Tom; Fried, Jeff

    2010-01-01

    Learn to leverage the features of the newest version of SharePoint, in this update to the bestseller. More than simply a portal, SharePoint is Microsoft's popular content management solution for building intranets and Web sites or hosting wikis and blogs. Offering broad coverage on all aspects of development for the SharePoint platform, this comprehensive book shows you exactly what SharePoint does, how to build solutions, and what features are accessible within SharePoint. Written by one of the most recognized names in SharePoint development, Professional SharePoint 2010 Development offers an

  2. PointCom: semi-autonomous UGV control with intuitive interface

    Science.gov (United States)

    Rohde, Mitchell M.; Perlin, Victor E.; Iagnemma, Karl D.; Lupa, Robert M.; Rohde, Steven M.; Overholt, James; Fiorani, Graham

    2008-04-01

    Unmanned ground vehicles (UGVs) will play an important role in the nation's next-generation ground force. Advances in sensing, control, and computing have enabled a new generation of technologies that bridge the gap between manual UGV teleoperation and full autonomy. In this paper, we present current research on a unique command and control system for UGVs named PointCom (Point-and-Go Command). PointCom is a semi-autonomous command system for one or multiple UGVs. The system, when complete, will be easy to operate and will enable significant reduction in operator workload by utilizing an intuitive image-based control framework for UGV navigation and allowing a single operator to command multiple UGVs. The project leverages new image processing algorithms for monocular visual servoing and odometry to yield a unique, high-performance fused navigation system. Human Computer Interface (HCI) techniques from the entertainment software industry are being used to develop video-game style interfaces that require little training and build upon the navigation capabilities. By combining an advanced navigation system with an intuitive interface, a semi-autonomous control and navigation system is being created that is robust, user friendly, and less burdensome than many current generation systems.

  3. The Lagrangian Points

    Science.gov (United States)

    Linton, J. Oliver

    2017-01-01

    There are five unique points in a star/planet system where a satellite can be placed whose orbital period is equal to that of the planet. Simple methods for calculating the positions of these points, or at least justifying their existence, are developed.
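
    As one simple justification of the kind the article alludes to, the collinear points L1 and L2 of a star/planet system with m much smaller than M lie approximately one Hill radius from the planet, r ~ R (m/3M)^(1/3). The snippet below evaluates this standard approximation for the Sun-Earth system; it is an outside illustration, not necessarily the method developed in the article.

```python
# Hill-radius approximation for the collinear points L1/L2 of a star/planet
# system with m << M: r ~ R * (m / (3 M))**(1/3).  This is a standard
# textbook estimate, used here only as an illustration.
M_sun   = 1.989e30      # kg
M_earth = 5.972e24      # kg
R       = 1.496e11      # mean Sun-Earth distance, m

r = R * (M_earth / (3.0 * M_sun)) ** (1.0 / 3.0)
print(f"L1/L2 distance from Earth ~ {r / 1e9:.2f} million km")   # ~1.5 million km
```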

  4. Cycles within micronets and at the gel point

    DEFF Research Database (Denmark)

    Armitage, David H.; Cameron, Colin; Fawcett, Allan H.

    2000-01-01

    three-dimensional space. Not only may cycles form in competition with branching growth, but if the statistics require it, segments of one cycle may be shared with those of any number of others, as Houwink indicated in 1935. After a small number of simple cycle-containing micronet molecules...... zero, when m≅24. The periodic boundaries of the model were not large enough to provide the exact behavior of the cycle number as m becomes larger, but an explosion is certainly indicated by k becoming negative. A resilient product is predicted at the gel point....

  5. 40 CFR 158.2030 - Biochemical pesticides product chemistry data requirements table.

    Science.gov (United States)

    2010-07-01

    ....7100 Viscosity CR MP EP 12 830.7200 Melting point/melting range CR TGAI TGAI 8, 13 830.7220 Boiling point/boiling range CR TGAI TGAI 8, 14 830.7300 Density/relative density/bulk density R TGAI and MP TGAI.... (2) Definitions in § 158.300 apply to data requirements in this section. (b) Use patterns. Product...

  6. Requirements for radiation oncology physics in Australia and New Zealand

    International Nuclear Information System (INIS)

    Oliver, L.; Fitchew, R.; Drew, J.

    2001-01-01

    technologies for radiation oncology also require a stringent approach to maintaining a satisfactory standard of practice in radiation oncology physics. Appropriate on-going education of radiation oncology physicists, as well as the education of registrar physicists, is essential. Institutional management and the ACPSEM must both play a key role in providing a means for satisfactory staff tuition on the safe and expert use of existing and new radiotherapy equipment. Copyright (2001) Australasian College of Physical Scientists and Engineers in Medicine

  7. Technical evaluation of RETS-required reports for Turkey Point Units No. 3 and 4

    International Nuclear Information System (INIS)

    Magleby, E.H.; Young, T.E.; Henscheid, J.W.

    1985-01-01

    A review of the reports required by Federal regulations and the plant specific Radiological Effluent Technical Specifications (RETS) for operations conducted during 1983 was performed. The periodic reports reviewed were the Annual Radiological Environmental Operating Report for 1983 and Semiannual Radioactive Effluent Release Reports for 1983. The principal review guidelines were the plant's specific RETS, NUREG-0133, "Preparation of Radiological Effluent Technical Specifications for Nuclear Power Plants", and NRC Guidance on the Review of the Process Control Programs. The Licensee's submitted reports were found to be reasonably complete and consistent with the review guidelines

  8. Assessment of osteoporotic vertebral fractures using specialized workflow software for 6-point morphometry

    International Nuclear Information System (INIS)

    Guglielmi, Giuseppe; Palmieri, Francesco; Placentino, Maria Grazia; D'Errico, Francesco; Stoppino, Luca Pio

    2009-01-01

    Purpose: To evaluate the time required, the accuracy and the precision of a model-based image analysis software tool for the diagnosis of osteoporotic fractures using a 6-point morphometry protocol. Materials and methods: Lateral dorsal and lumbar radiographs were performed on 92 elderly women (mean age 69.2 ± 5.7 years). Institutional review board approval and patient informed consent were obtained for all subjects. The semi-automated and the manual correct annotations of 6-point placement were compared to calculate the time consumed and the accuracy of the software. Twenty test images were randomly selected and the data obtained by multiple perturbed initialisation points on the same image were compared to assess the precision of the system. Results: The time requirement data of the semi-automated system (420 ± 67 s) were statistically different (p < 0.05) from those of manual placement (900 ± 77 s). In the accuracy test, the mean reproducibility error for semi-automatic 6-point placement was 2.50 ± 0.72% [95% CI] for the anterior-posterior reference and 2.16 ± 0.5% [95% CI] for the superior-inferior reference. In the precision test, the mean error averaged over all vertebrae was 2.6 ± 1.3% in terms of vertebral width. Conclusions: The technique is time effective, accurate and precise and can, therefore, be recommended in large epidemiological studies and pharmaceutical trials for reporting of osteoporotic vertebral fractures.

  9. The End of Points

    Science.gov (United States)

    Feldman, Jo

    2018-01-01

    Have teachers become too dependent on points? This article explores educators' dependency on their points systems, and the ways that points can distract teachers from really analyzing students' capabilities and achievements. Feldman argues that using a more subjective grading system can help illuminate crucial information about students and what…

  10. Multispectral Image Feature Points

    Directory of Open Access Journals (Sweden)

    Cristhian Aguilera

    2012-09-01

    Full Text Available This paper presents a novel feature point descriptor for the multispectral image case: Far-Infrared and Visible Spectrum images. It allows matching interest points on images of the same scene but acquired in different spectral bands. Initially, points of interest are detected on both images through a SIFT-like scale space representation. Then, these points are characterized using an Edge Oriented Histogram (EOH) descriptor. Finally, points of interest from multispectral images are matched by finding nearest couples using the information from the descriptor. The provided experimental results and comparisons with similar methods show both the validity of the proposed approach as well as the improvements it offers with respect to the current state-of-the-art.

  11. Dew point measurements of flue gases in steam generators with brown coal combustion

    Energy Technology Data Exchange (ETDEWEB)

    Schinkel, W.

    1980-01-01

    This paper examines empirical data on sulfuric acid condensation and resulting internal corrosion in brown coal fired steam generators. Due to the high sulfur content of brown coal (0.5% to 5.0%) and the relatively short residence time of the gases in the combustion chamber, sulfur trioxide present in the flue gases can condense at the heat exchange surfaces of the steam generators. A number of diagrams show sulfuric acid dew point temperatures depending on brown coal sulfur content, the influence of combustion air supply on the dew point, and condensing speed and the rate of corrosion in relation to different heat exchange surface temperatures. The conclusion is made that a five-fold increase in corrosion can be caused by a 10 K higher flue gas dew point; a 5 K cooling of heating surfaces can also cause heavy corrosion at a certain dew point. Maximum corrosion results at 20 to 50 K differences between flue gas dew point and heat exchange surfaces. Optimum operation of steam generators with minimal internal corrosion requires the consideration of flue gas and heating surface temperatures as well as flue gas sulfuric acid dew points. (10 refs.) (In German)

  12. Inertially Stabilized Platforms for Precision Pointing Applications to Directed-Energy Weapons and Space-Based Lasers (Preprint)

    National Research Council Canada - National Science Library

    Negro, J; Griffin, S

    2006-01-01

    .... This article addresses directed-energy-weapon (DEW) precision pointing requirements and implementation alternatives in the context of strapdown and stable-platform inertial-reference technologies...

  13. Design and evaluation of aircraft heat source systems for use with high-freezing point fuels

    Science.gov (United States)

    Pasion, A. J.

    1979-01-01

    The objectives were the design, performance and economic analyses of practical aircraft fuel heating systems that would permit the use of high freezing-point fuels on long-range aircraft. Two hypothetical hydrocarbon fuels with freezing points of -29 C and -18 C were used to represent the variation from current day jet fuels. A Boeing 747-200 with JT9D-7/7A engines was used as the baseline aircraft. A 9300 km mission was used as the mission length on which the heat requirements to maintain the fuel above its freezing point were based.

  14. Dosimetric adaptive IMRT driven by fiducial points

    International Nuclear Information System (INIS)

    Crijns, Wouter; Van Herck, Hans; Defraene, Gilles; Van den Bergh, Laura; Haustermans, Karin; Slagmolen, Pieter; Maes, Frederik; Van den Heuvel, Frank

    2014-01-01

    (CTV mean dose, conformity index) and clinical (tumor control probability, and normal tissue complication probability) measures. Results: Based on the current experiments, the intended target dose and tumor control probability could be assured by the proposed method (TCP ≥ TCP intended ). Additionally, the conformity index error was more than halved compared to the current clinical practice (ΔCI 95% from 40% to 16%) resulting in improved organ at risk protection. All the individual correction steps had an added value to the full correction. Conclusions: A limited number of fiducial points (no organ contours required) and an in-room (CB)CT are sufficient to perform a full dosimetric correction for IMRT plans. In the presence of interfraction variation, the corrected plans show superior dose distributions compared to our current clinical practice

  15. Dosimetric adaptive IMRT driven by fiducial points

    Energy Technology Data Exchange (ETDEWEB)

    Crijns, Wouter, E-mail: wouter.crijns@uzleuven.be [Department of Oncology, Laboratory of Experimental Radiotherapy, KU Leuven, Herestraat 49, 3000 Leuven, Belgium and Medical Imaging Research Center, KU Leuven, Herestraat 49, 3000 Leuven (Belgium); Van Herck, Hans [Medical Imaging Research Center, KU Leuven, Herestraat 49, 3000 Leuven, Belgium and Department of Electrical Engineering (ESAT) – PSI, Center for the Processing of Speech and Images, KU Leuven, 3000 Leuven (Belgium); Defraene, Gilles; Van den Bergh, Laura; Haustermans, Karin [Department of Oncology, Laboratory of Experimental Radiotherapy, KU Leuven, Herestraat 49, 3000 Leuven (Belgium); Slagmolen, Pieter [Medical Imaging Research Center, KU Leuven, Herestraat 49, 3000 Leuven (Belgium); Department of Electrical Engineering (ESAT) – PSI, Center for the Processing of Speech and Images, KU Leuven, 3000 Leuven (Belgium); iMinds-KU Leuven Medical IT Department, KU Leuven, 3000 Leuven (Belgium); Maes, Frederik [Medical Imaging Research Center, KU Leuven, Herestraat 49, 3000 Leuven (Belgium); Department of Electrical Engineering (ESAT) – PSI, Center for the Processing of Speech and Images, KU Leuven and iMinds, 3000 Leuven (Belgium); Van den Heuvel, Frank [Department of Oncology, Laboratory of Experimental Radiotherapy, KU Leuven, Herestraat 49, 3000 Leuven, Belgium and Department of Oncology, MRC-CR-UK Gray Institute of Radiation Oncology and Biology, University of Oxford, Oxford OX1 2JD (United Kingdom)

    2014-06-15

    (CTV mean dose, conformity index) and clinical (tumor control probability, and normal tissue complication probability) measures. Results: Based on the current experiments, the intended target dose and tumor control probability could be assured by the proposed method (TCP ≥ TCP{sub intended}). Additionally, the conformity index error was more than halved compared to the current clinical practice (ΔCI{sub 95%} from 40% to 16%) resulting in improved organ at risk protection. All the individual correction steps had an added value to the full correction. Conclusions: A limited number of fiducial points (no organ contours required) and an in-room (CB)CT are sufficient to perform a full dosimetric correction for IMRT plans. In the presence of interfraction variation, the corrected plans show superior dose distributions compared to our current clinical practice.

  16. Engineering assessment of inactive uranium mill tailings, Ray Point Site, Ray Point, Texas. Phase II, Title I

    International Nuclear Information System (INIS)

    1977-12-01

    Results are reported from an engineering assessment of the problems resulting from the existence of radioactive uranium mill tailings at Ray Point, Texas. The Phase II--Title I services generally include the preparation of topographic maps, the performance of soil sampling and radiometric measurements sufficient to determine areas and volumes of tailings and other radium-contaminated materials, the evaluation of resulting radiation exposures of individuals and nearby populations, the investigation of site hydrology and meteorology and the evaluation and costing of alternative corrective actions. About 490,000 tons of ore were processed at this mill with all of the uranium sold on the commercial market. None was sold to the AEC; therefore, this report focuses on a physical description of the site and the identification of radiation pathways. No remedial action options were formulated for the site, inasmuch as none of the uranium was sold to the AEC and Exxon Corporation has agreed to perform all actions required by the State of Texas. Radon gas release from the tailings at the Ray Point site constitutes the most significant environmental impact. Windblown tailings, external gamma radiation and localized contamination of surface waters are other environmental effects. Exxon is also studying the feasibility of reprocessing the tailings

  17. The Pointing Self-calibration Algorithm for Aperture Synthesis Radio Telescopes

    Energy Technology Data Exchange (ETDEWEB)

    Bhatnagar, S.; Cornwell, T. J., E-mail: sbhatnag@nrao.edu [National Radio Astronomy Observatory, 1003 Lopezville Road, Socorro, NM 87801 (United States)

    2017-11-01

    This paper is concerned with algorithms for calibration of direction-dependent effects (DDE) in aperture synthesis radio telescopes (ASRT). After correction of direction-independent effects (DIE) using self-calibration, imaging performance can be limited by the imprecise knowledge of the forward gain of the elements in the array. In general, the forward gain pattern is directionally dependent and varies with time due to a number of reasons. Some factors, such as rotation of the primary beam with Parallactic Angle for Azimuth–Elevation mount antennas, are known a priori. Some, such as antenna pointing errors and structural deformation/projection effects for aperture-array elements, cannot be measured a priori. Thus, in addition to algorithms to correct for DD effects known a priori, algorithms to solve for DD gains are required for high dynamic range imaging. Here, we discuss a mathematical framework for antenna-based DDE calibration algorithms and show that this framework leads to computationally efficient optimal algorithms that scale well in a parallel computing environment. As an example of an antenna-based DD calibration algorithm, we demonstrate the Pointing SelfCal (PSC) algorithm to solve for the antenna pointing errors. Our analysis shows that the sensitivity of modern ASRT is sufficient to solve for antenna pointing errors and other DD effects. We also discuss the use of the PSC algorithm in real-time calibration systems and extensions for an antenna Shape SelfCal algorithm for real-time tracking and corrections for pointing offsets and changes in antenna shape.

  18. The Pointing Self-calibration Algorithm for Aperture Synthesis Radio Telescopes

    Science.gov (United States)

    Bhatnagar, S.; Cornwell, T. J.

    2017-11-01

    This paper is concerned with algorithms for calibration of direction-dependent effects (DDE) in aperture synthesis radio telescopes (ASRT). After correction of direction-independent effects (DIE) using self-calibration, imaging performance can be limited by the imprecise knowledge of the forward gain of the elements in the array. In general, the forward gain pattern is directionally dependent and varies with time due to a number of reasons. Some factors, such as rotation of the primary beam with Parallactic Angle for Azimuth-Elevation mount antennas, are known a priori. Some, such as antenna pointing errors and structural deformation/projection effects for aperture-array elements, cannot be measured a priori. Thus, in addition to algorithms to correct for DD effects known a priori, algorithms to solve for DD gains are required for high dynamic range imaging. Here, we discuss a mathematical framework for antenna-based DDE calibration algorithms and show that this framework leads to computationally efficient optimal algorithms that scale well in a parallel computing environment. As an example of an antenna-based DD calibration algorithm, we demonstrate the Pointing SelfCal (PSC) algorithm to solve for the antenna pointing errors. Our analysis shows that the sensitivity of modern ASRT is sufficient to solve for antenna pointing errors and other DD effects. We also discuss the use of the PSC algorithm in real-time calibration systems and extensions for an antenna Shape SelfCal algorithm for real-time tracking and corrections for pointing offsets and changes in antenna shape.

  19. Geomorphic tipping points: convenient metaphor or fundamental landscape property?

    Science.gov (United States)

    Lane, Stuart

    2016-04-01

    will think through what this understanding means for geomorphology in a tipping point world, arguing that if it indeed holds, it presents profound challenges for data collection and modelling that we do not fully appreciate, and will require very different kinds of analyses from those that we are normally accustomed to.

  20. H2FIRST Hydrogen Contaminant Detector Task: Requirements Document and Market Survey

    Energy Technology Data Exchange (ETDEWEB)

    Terlip, Danny [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ainscough, Chris [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Buttner, William [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McWhorter, Scott [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-20

    The rollout of hydrogen fueling stations, and the fuel cell electric vehicles (FCEV) they support, requires the assurance of high quality hydrogen at the dispensing point. Automotive fuel cells are sensitive to a number of chemicals that can be introduced into the dispensed fuel at multiple points. Quality assurance and quality control methods are employed by the industry to ensure product quality, but they are not completely comprehensive and can fail at various points in the hydrogen pathway from production to dispensing. This reality leaves open the possibility of a station unknowingly dispensing harmful contaminants to a FCEV which, depending on the contaminant, may not be discovered until the FCEV is irreparably damaged. This situation is unacceptable. A hydrogen contaminant detector (HCD), defined as a combination of a gas analyzer and the components necessary for fuel stream integration, installed at hydrogen stations is one method for preventing poor quality gas from reaching an FCEV. This document identifies the characteristics required of such a device by industry and compares those requirements with the current state of commercially available gas analysis technology.

  1. Design of Neutral-Point Voltage Controller of a Three-level NPC Inverter with Small DC-Link Capacitors

    DEFF Research Database (Denmark)

    Maheshwari, Ram Krishan; Munk-Nielsen, Stig; Busquets-Monge, S.

    2013-01-01

    A Neutral-Point-Clamped (NPC) three-level inverter with small dc-link capacitors is presented in this paper. The inverter requires zero average neutral-point current for stable neutral-point voltage. The small dc-link capacitors may not maintain capacitor voltage balance, even with zero neutral......-point voltage control on the basis of the continuous model. The design method for optimum performance is discussed. The implementation of the proposed modulation strategy and the controller is very simple. The controller is implemented in a 7.5 kW induction machine based drive with only 14 μF dc-link capacitors...

  2. Quantitative structure-property relationships for prediction of boiling point, vapor pressure, and melting point.

    Science.gov (United States)

    Dearden, John C

    2003-08-01

    Boiling point, vapor pressure, and melting point are important physicochemical properties in the modeling of the distribution and fate of chemicals in the environment. However, such data often are not available, and therefore must be estimated. Over the years, many attempts have been made to calculate boiling points, vapor pressures, and melting points by using quantitative structure-property relationships, and this review examines and discusses the work published in this area, and concentrates particularly on recent studies. A number of software programs are commercially available for the calculation of boiling point, vapor pressure, and melting point, and these have been tested for their predictive ability with a test set of 100 organic chemicals.

  3. Translating silicon nanowire BioFET sensor-technology to embedded point-of-care medical diagnostics

    DEFF Research Database (Denmark)

    Pfreundt, Andrea; Zulfiqar, Azeem; Patou, François

    2013-01-01

    Silicon nanowire and nanoribbon biosensors have shown great promise in the detection of biomarkers at very low concentrations. Their high sensitivity makes them ideal candidates for use in early-stage medical diagnostics and further disease monitoring, where low amounts of biomarkers need to be detected. However, in order to translate this technology from the bench to the bedside, a number of key issues need to be taken into consideration: integrating nanobiosensor-based technology requires overcoming the difficult tradeoff between imperatives for high device reproducibility and associated rising fabrication costs. Also, the translation of nano-scale sensor technology into daily-use point-of-care devices requires acknowledgement of the end-user requirements, making device portability and human-interfacing a focus point in device development. Sample handling or purification, for instance...

  4. Key requirements for future control room functionality

    DEFF Research Database (Denmark)

    Tornelli, Carlo; Zuelli, Roberto; Marinelli, Mattia

    2016-01-01

    ) and the observability needs highlighted within WP5 led to the definition of the requirements with a Web of Cell (WoC) point of view. The main European Distribution System Operators (DSOs) provided a valuable contribution in the definition of the evolvDSO Use Cases. Their analysis led to the definition of further...

  5. 47 CFR 74.536 - Directional antenna required.

    Science.gov (United States)

    2010-10-01

    ..., which is specified in the table in paragraph (c) of this section, upon a showing that said antenna... interference. (c) Licensees shall comply with the antenna standards table shown in this paragraph in the following manner: (1) With either the maximum beamwidth to 3 dB points requirement or with the minimum...

  6. Professional SharePoint 2010 Development

    CERN Document Server

    Rizzo, Tom; Fried, Jeff; Swider, Paul J; Hillier, Scot; Schaefer, Kenneth

    2012-01-01

    Updated guidance on how to take advantage of the newest features of SharePoint programmability. More than simply a portal, SharePoint is Microsoft's popular content management solution for building intranets and websites or hosting wikis and blogs. Offering broad coverage on all aspects of development for the SharePoint platform, this comprehensive book shows you exactly what SharePoint does, how to build solutions, and what features are accessible within SharePoint. Written by a team of SharePoint experts, this new edition offers an extensive selection of field-tested best practices that shows

  7. SharePoint 2010 For Dummies

    CERN Document Server

    Williams, Vanessa L

    2012-01-01

    Here's the bestselling guide on SharePoint 2010, updated to cover Office 365. SharePoint Portal Server is an essential part of the enterprise infrastructure for many businesses. The Office 365 version includes significantly enhanced cloud capabilities. This second edition of the bestselling guide to SharePoint covers getting a SharePoint site up and running, branded, populated with content, and more. It explains ongoing site management and offers plenty of advice for administrators who want to leverage SharePoint and Office 365 in various ways. Many businesses today rely on SharePoint Portal Ser

  8. ESTIMATING AIRCRAFT HEADING BASED ON LASERSCANNER DERIVED POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    Z. Koppanyi

    2015-03-01

    Full Text Available Using LiDAR sensors for tracking and monitoring an operating aircraft is a new application. In this paper, we present data processing methods to estimate the heading of a taxiing aircraft using laser point clouds. During the data acquisition, a Velodyne HDL-32E laser scanner tracked a moving Cessna 172 airplane. The point clouds captured at different times were used for heading estimation. After addressing the problem and specifying the equation of motion to reconstruct the aircraft point cloud from the consecutive scans, three methods are investigated here. The first requires a reference model to estimate the relative angle from the captured data by fitting different cross-sections (horizontal profiles). In the second approach, the iterative closest point (ICP) method is used between the consecutive point clouds to determine the horizontal translation of the captured aircraft body. Regarding the ICP, three different versions were compared, namely, the ordinary 3D, 3-DoF 3D and 2-DoF 3D ICP. It was found that 2-DoF 3D ICP provides the best performance. Finally, the last algorithm searches for the unknown heading and velocity parameters by minimizing the volume of the reconstructed plane. The three methods were compared using three test data types, which are distinguished by object-sensor distance, heading and velocity. We found that the ICP algorithm fails at long distances and when the aircraft motion direction is perpendicular to the scan plane, but the first and the third methods give robust and accurate results at 40 m object distance and at ~12 knots for a small Cessna airplane.
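
    The "2-DoF 3D ICP" variant mentioned above restricts the estimated motion to a horizontal translation. The sketch below is a minimal translation-only ICP written under simplifying assumptions (the displacement between consecutive scans is small relative to the point spacing, and no aircraft-body reconstruction from the equation of motion is attempted); the synthetic cloud and the offset are illustrative, not the paper's data.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_translation_xy(src, dst, iters=30, tol=1e-6):
    """Translation-only ICP in the horizontal plane: estimate the (x, y)
    offset aligning src to dst.  z is used for matching but never adjusted,
    mimicking a 2-DoF horizontal-translation estimate."""
    t = np.zeros(2)
    tree = cKDTree(dst)
    for _ in range(iters):
        moved = src.copy()
        moved[:, :2] += t
        _, idx = tree.query(moved)                       # nearest-neighbour pairs
        delta = (dst[idx, :2] - moved[:, :2]).mean(axis=0)
        t += delta
        if np.linalg.norm(delta) < tol:
            break
    return t

# Toy test: the same cloud shifted by a small, known horizontal offset.
rng = np.random.default_rng(2)
scan_a = rng.uniform(0, 10, size=(500, 3))
scan_b = scan_a.copy()
scan_b[:, :2] += [0.2, -0.1]
print(icp_translation_xy(scan_a, scan_b))                # ~ [ 0.2 -0.1]
```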

  9. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer; Gebali, Fayez; Yang, Hong-Chuan; Alouini, Mohamed-Slim

    2017-01-01

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used

  10. Simulation and Emulation of MIMO Wireless Baseband Transceivers

    Directory of Open Access Journals (Sweden)

    Andreas Burg

    2010-01-01

    Full Text Available The development of state-of-the-art wireless communication transceivers in semiconductor technology is a challenging process due to complexity and stringent requirements of modern communication standards such as IEEE 802.11n. This tutorial paper describes a complete design, verification, and performance characterization methodology that is tailored to the needs of the development of state-of-the-art wireless baseband transceivers for both research and industrial products. Compared to the methods widely used for the development of communication research testbeds, the described design flow focuses on the evolution of a given system specification to a final ASIC implementation through multiple design representations. The corresponding verification and characterization environment supports rapid floating-point and fixed-point performance characterization and ensures consistency across the entire design process and across all design representations. This framework has been successfully employed for the development and verification of an industrial-grade, fully standard compliant, 4-stream IEEE 802.11n MIMO-OFDM baseband transceiver.
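
    One ingredient of the characterization environment mentioned above is comparing a floating-point "golden" model against a bit-true fixed-point model of the same block. The sketch below shows the idea for a single Q1.15 gain stage; the Q1.15 format, the gain value and the SNR metric are assumptions made for illustration, not details of the IEEE 802.11n design flow described in the paper.

```python
import numpy as np

Q = 15  # Q1.15: 16-bit signed, 15 fractional bits

def to_fixed(x):
    """Quantize floats in [-1, 1) to Q1.15 integers with saturation."""
    return np.clip(np.round(np.asarray(x) * 2**Q), -2**Q, 2**Q - 1).astype(np.int64)

def fixed_mul(a, b):
    """Bit-true Q1.15 multiply: the Q2.30 product is shifted back to Q1.15."""
    return (a * b) >> Q

def to_float(q):
    return np.asarray(q, dtype=np.float64) / 2**Q

# Floating-point "golden" model vs. bit-true fixed-point model of a gain stage.
rng = np.random.default_rng(0)
x = rng.uniform(-0.9, 0.9, 10000)
gain = 0.7071
ref = gain * x                                            # floating point
fix = to_float(fixed_mul(to_fixed(x), to_fixed(gain)))    # fixed point
snr_db = 10 * np.log10(np.mean(ref**2) / np.mean((ref - fix)**2))
print(f"quantization SNR ~ {snr_db:.1f} dB")              # roughly 80 dB for Q1.15
```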

  11. Point-of-sale alcohol promotions in the Perth and Sydney metropolitan areas.

    Science.gov (United States)

    Jones, Sandra C; Barrie, Lance; Robinson, Laura; Allsop, Steve; Chikritzhs, Tanya

    2012-09-01

    Point-of-sale (POS) is increasingly being used as a marketing tool for alcohol products, and there is a growing body of evidence suggesting that these materials are positively associated with drinking and contribute to creating a pro-alcohol environment. The purpose of the present study was to document the nature and extent of POS alcohol promotions in bottle shops in two Australian capital cities. A purposive sample of 24 hotel bottle shops and liquor stores was selected across Sydney (New South Wales) and Perth (Western Australia) and audited for the presence and nature of POS marketing. Point-of-sale promotions were found to be ubiquitous, with an average of 33 promotions per outlet. Just over half were classified as 'non-price' promotions (e.g. giveaways and competitions). Spirits were the most commonly promoted type of alcohol. The average number of standard drinks required to participate in the promotions ranged from 12 for ready to drinks to 22 for beer. Alcohol outlets that were part of supermarket chains had a higher number of promotions, more price-based promotions, and required a greater quantity of alcohol to be purchased to participate in the promotion. The data collected in this study provides a starting point for our understanding of POS promotions in Australia, and poses important questions for future research in this area. © 2012 Australasian Professional Society on Alcohol and other Drugs.

  12. Cost-optimal levels of minimum energy performance requirements in the Danish Building Regulations

    Energy Technology Data Exchange (ETDEWEB)

    Aggerholm, S.

    2013-09-15

    The purpose of the report is to analyse the cost optimality of the energy requirements in the Danish Building Regulations 2010, BR10, applied to new buildings and to existing buildings undergoing major renovation. The energy requirements in the Danish Building Regulations have by tradition always been based on the costs and benefits related to the private economic or financial perspective. Macro-economic calculations have in the past only been made in addition. Due to the high energy taxes in Denmark there is a significant difference between the consumer price and the macro-economic price of energy. Energy taxes are also paid by commercial consumers when the energy is used for building operation, e.g. heating, lighting, ventilation etc. In relation to the new housing examples, the present minimum energy requirements in BR10 all show gaps that are negative, with a deviation of up to 16 % from the point of cost optimality. With the planned tightening of the requirements to new houses in 2015 and in 2020, the energy requirements can be expected to be tighter than the cost optimal point, if the costs for the needed improvements don't decrease correspondingly. In relation to the new office building there is a gap of 31 % to the point of cost optimality in relation to the 2010 requirement. In relation to the 2015 and 2020 requirements there are negative gaps to the point of cost optimality based on today's prices. If the gaps for all the new buildings are weighted to an average based on the mix of building types and heat supply for new buildings in Denmark, there is a gap of 3 % on average for the new buildings. The excessive tightness with today's prices is 34 % in relation to the 2015 requirement and 49 % in relation to the 2020 requirement. The component requirements for elements of the building envelope and for installations in existing buildings add up to significant energy efficiency

  13. Adhesion and friction in single asperity contact

    NARCIS (Netherlands)

    Yaqoob, Muhammad Adeel

    2012-01-01

    In the modern era, many mechanical systems must meet more stringent requirements in terms of performance and reliability. The applications of these systems can be found in medical instrumentation, electron microscopes, lithography systems, as well as in aviation and space applications. Instruments

  14. ERP in agriculture: Lessons learned from the Dutch horticulture

    NARCIS (Netherlands)

    Verdouw, C.N.; Robbemond, R.M.; Wolfert, J.

    2015-01-01

    Farming nowadays is a complex managerial task that imposes stringent requirements on farm management information systems. In other sectors, Enterprise Resource Planning (ERP) systems are widely implemented to meet such requirements. This paper assesses the applicability of ERP systems in the

  15. Kicker Magnet and Pulser

    Energy Technology Data Exchange (ETDEWEB)

    Bulos, Fatin [SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2014-03-04

    The SLC Project utilizes several fast kicker magnets. Their requirements vary somewhat; however, the cooling ring kickers have the most stringent requirements. In this note we describe the design of the positron ring kickers, and the reasons that led to it.

  16. Pilot points method for conditioning multiple-point statistical facies simulation on flow data

    Science.gov (United States)

    Ma, Wei; Jafarpour, Behnam

    2018-05-01

    We propose a new pilot points method for conditioning discrete multiple-point statistical (MPS) facies simulation on dynamic flow data. While conditioning MPS simulation on static hard data is straightforward, their calibration against nonlinear flow data is nontrivial. The proposed method generates conditional models from a conceptual model of geologic connectivity, known as a training image (TI), by strategically placing and estimating pilot points. To place pilot points, a score map is generated based on three sources of information: (i) the uncertainty in facies distribution, (ii) the model response sensitivity information, and (iii) the observed flow data. Once the pilot points are placed, the facies values at these points are inferred from production data and then are used, along with available hard data at well locations, to simulate a new set of conditional facies realizations. While facies estimation at the pilot points can be performed using different inversion algorithms, in this study the ensemble smoother (ES) is adopted to update permeability maps from production data, which are then used to statistically infer facies types at the pilot point locations. The developed method combines the information in the flow data and the TI by using the former to infer facies values at selected locations away from the wells and the latter to ensure consistent facies structure and connectivity where away from measurement locations. Several numerical experiments are used to evaluate the performance of the developed method and to discuss its important properties.
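
    The placement step described above (a score map built from facies uncertainty, response sensitivity and the observed data) can be sketched as follows. The weights, the greedy minimum-spacing selection and the synthetic maps are illustrative assumptions only; the actual method couples this placement with TI-based MPS simulation and an ensemble smoother update, which are not reproduced here.

```python
import numpy as np

def place_pilot_points(facies_ensemble, sensitivity, data_mismatch,
                       n_points=10, min_dist=5.0, weights=(1.0, 1.0, 1.0)):
    """Greedy pilot-point placement on a 2D grid.  The score map combines
    (i) ensemble facies uncertainty, (ii) flow-response sensitivity and
    (iii) a data-mismatch map; high-score cells are picked first, keeping a
    minimum spacing between the selected points."""
    p = facies_ensemble.mean(axis=0)                # probability of facies 1
    uncertainty = p * (1.0 - p)                     # largest where realizations disagree
    score = np.zeros_like(uncertainty)
    for w, m in zip(weights, (uncertainty, sensitivity, data_mismatch)):
        score += w * (m - m.min()) / (np.ptp(m) + 1e-12)   # normalize each map
    order = np.argsort(score, axis=None)[::-1]
    picked = []
    for flat in order:
        ij = np.array(np.unravel_index(flat, score.shape))
        if all(np.hypot(*(ij - q)) >= min_dist for q in picked):
            picked.append(ij)
        if len(picked) == n_points:
            break
    return np.array(picked), score

# Synthetic 50x50 example: 40 facies realizations, a smooth sensitivity map
# peaked near an assumed well, and a random data-mismatch map.
rng = np.random.default_rng(3)
ens = rng.integers(0, 2, size=(40, 50, 50)).astype(float)
yy, xx = np.mgrid[0:50, 0:50]
sens = np.exp(-((xx - 25.0)**2 + (yy - 25.0)**2) / 200.0)
pts, _ = place_pilot_points(ens, sens, rng.random((50, 50)))
print(pts)
```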

  17. Digital microwave communication engineering point-to-point microwave systems

    CERN Document Server

    Kizer, George

    2013-01-01

    The first book to cover all engineering aspects of microwave communication path design for the digital age Fixed point-to-point microwave systems provide moderate-capacity digital transmission between well-defined locations. Most popular in situations where fiber optics or satellite communication is impractical, it is commonly used for cellular or PCS site interconnectivity where digital connectivity is needed but not economically available from other sources, and in private networks where reliability is most important. Until now, no book has adequately treated all en

  18. Pump testing in the nuclear industry: The comprehensive test and other considerations

    International Nuclear Information System (INIS)

    Hoyle, T.F.

    1992-01-01

    The American Society of Mechanical Engineers Operations and Maintenance Working Group on Pumps and Valves is working on a revision to their pump testing Code, ISTB-1990. This revision will change the basic philosophy of pump testing in the nuclear industry. Currently, all pumps are required to be tested quarterly, except those installed in dry sumps. In the future, standby pumps will receive only a start test quarterly to ensure the pump comes up to speed and pressure or flow. Then, on a biennial basis all pumps would receive a more extensive test. This comprehensive test would require high accuracy test gauges to be used, and the pumps would be required to be tested near pump design flow. Testing on minimum flow loops would not be permitted except in rare cases. Additionally, during the comprehensive test, measurements of vibration, flow, and pressure would all be taken. The OM-6 standard (ISTB Code) will also require that reference values of flow rate and differential pressure be taken at several points instead of just one point, which is current practice. The comprehensive test is just one step in ensuring the adequacy of pump testing in the nuclear industry. This paper also addresses other concerns and makes recommendations for increased quality of testing of certain critical pumps and recommendations for less stringent or no tests on less critical pumps

  19. High-performance floating-point image computing workstation for medical applications

    Science.gov (United States)

    Mills, Karl S.; Wong, Gilman K.; Kim, Yongmin

    1990-07-01

    The medical imaging field relies increasingly on imaging and graphics techniques in diverse applications with needs similar to (or more stringent than) those of the military, industrial and scientific communities. However, most image processing and graphics systems available for use in medical imaging today are either expensive, specialized, or in most cases both. High performance imaging and graphics workstations which can provide real-time results for a number of applications, while maintaining affordability and flexibility, can facilitate the application of digital image computing techniques in many different areas. This paper describes the hardware and software architecture of a medium-cost floating-point image processing and display subsystem for the NeXT computer, and its applications as a medical imaging workstation. Medical imaging applications of the workstation include use in a Picture Archiving and Communications System (PACS), in multimodal image processing and 3-D graphics workstation for a broad range of imaging modalities, and as an electronic alternator utilizing its multiple monitor display capability and large and fast frame buffer. The subsystem provides a 2048 x 2048 x 32-bit frame buffer (16 Mbytes of image storage) and supports both 8-bit gray scale and 32-bit true color images. When used to display 8-bit gray scale images, up to four different 256-color palettes may be used for each of four 2K x 2K x 8-bit image frames. Three of these image frames can be used simultaneously to provide pixel selectable region of interest display. A 1280 x 1024 pixel screen with 1: 1 aspect ratio can be windowed into the frame buffer for display of any portion of the processed image or images. In addition, the system provides hardware support for integer zoom and an 82-color cursor. This subsystem is implemented on an add-in board occupying a single slot in the NeXT computer. Up to three boards may be added to the NeXT for multiple display capability (e

  20. A superlinear interior points algorithm for engineering design optimization

    Science.gov (United States)

    Herskovits, J.; Asquier, J.

    1990-01-01

    We present a quasi-Newton interior points algorithm for nonlinear constrained optimization. It is based on a general approach consisting of the iterative solution, in the primal and dual spaces, of the equalities in the Karush-Kuhn-Tucker optimality conditions. This is done in such a way as to have primal and dual feasibility at each iteration, which ensures satisfaction of those optimality conditions at the limit points. This approach is very strong and efficient, since at each iteration it only requires the solution of two linear systems with the same matrix, instead of quadratic programming subproblems. It is also particularly appropriate for engineering design optimization, inasmuch as at each iteration a feasible design is obtained. The present algorithm uses a quasi-Newton approximation of the second derivative of the Lagrangian function in order to have superlinear asymptotic convergence. We discuss theoretical aspects of the algorithm and its computer implementation.

  1. Maximum Power Point Tracking Based on Sliding Mode Control

    Directory of Open Access Journals (Sweden)

    Nimrod Vázquez

    2015-01-01

    Full Text Available Solar panels, which have become a popular choice, are used to generate and supply electricity in commercial and residential applications. The generated power originates in the solar cells, which exhibit a complex relationship between solar irradiation, temperature, and output power. For this reason, tracking of the maximum power point is required. Traditionally, this has been done by considering only the current and voltage conditions at the photovoltaic panel; however, temperature also influences the process. In this paper the voltage, current, and temperature of the PV system are considered to be part of a sliding surface for the proposed maximum power point tracking; this means a sliding mode controller is applied. The obtained results show a good dynamic response, in contrast to traditional schemes, which are based only on computational algorithms. A traditional MPPT algorithm was added in order to assure a low steady-state error.
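
    The controller drives the operating point so that a sliding surface built from panel voltage, current and temperature is brought to zero. The sketch below illustrates only the switching idea, using the simpler surface s = dP/dV (temperature omitted) and a toy single-diode panel model; all parameter values are assumptions for illustration, not the paper's design.

```python
import numpy as np

def pv_current(v, isc=8.0, i0=1e-9, vt=1.2):
    """Very simple single-diode PV model; all parameters are illustrative."""
    return np.maximum(isc - i0 * (np.exp(v / vt) - 1.0), 0.0)

def mppt_switching(v0=10.0, step=0.2, iters=200):
    """Drive the operating voltage with the sign of s = dP/dV, so that the
    system 'slides' towards s = 0, i.e. the maximum power point.  A real
    sliding-mode MPPT acts on the converter duty cycle; here the voltage is
    moved directly to keep the sketch short."""
    v_prev, p_prev = v0, v0 * pv_current(v0)
    v = v0 + step                                   # initial perturbation
    for _ in range(iters):
        p = v * pv_current(v)
        s = (p - p_prev) / (v - v_prev)             # finite-difference dP/dV
        v_prev, p_prev = v, p
        v += step if s >= 0 else -step              # switching law: sign(s)
    return v, v * pv_current(v)

v_mpp, p_mpp = mppt_switching()
print(f"V_mpp ~ {v_mpp:.1f} V, P_mpp ~ {p_mpp:.0f} W")
```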

  2. Poly-urea spray elastomer for waste containment applications

    International Nuclear Information System (INIS)

    Miller, C.J.; Cheng, S.C.J.; Tanis, R.

    1997-01-01

    Geomembrane usage in environmental applications has increased dramatically following the promulgation of federal regulations resulting from the Resource Conservation and Recovery Act of 1976 (RCRA). Subtitle D rules, formulated under the authority of RCRA, call for minimum performance standards to limit adverse effects of a solid waste disposal facility on human health or the environment (40 CFR 257,258, August 30, 1988). These rules set minimum standards requiring new landfill designs to include liner systems and final cover systems. Each state has the responsibility to develop rules that are at least as stringent as the Subtitle D rules. There are several types of geomembranes currently available for landfill applications, each offering particular advantages and disadvantages. For example, PVC does not show the yield point (point of instability) that HDPE shows, HDPE has a higher puncture resistance than PVC, and PVC will deform much more than HDPE before barrier properties of the geomembrane are lost. Because each geomembrane material exhibits its own particular characteristics the material selected should be chosen based on the individual project requirements. It is preferable to select a design that uses the least expensive material and meets the performance specifications of the project

  3. The goal of ape pointing.

    Science.gov (United States)

    Halina, Marta; Liebal, Katja; Tomasello, Michael

    2018-01-01

    Captive great apes regularly use pointing gestures in their interactions with humans. However, the precise function of this gesture is unknown. One possibility is that apes use pointing primarily to direct attention (as in "please look at that"); another is that they point mainly as an action request (such as "can you give that to me?"). We investigated these two possibilities here by examining how the looking behavior of recipients affects pointing in chimpanzees (Pan troglodytes) and bonobos (Pan paniscus). Upon pointing to food, subjects were faced with a recipient who either looked at the indicated object (successful-look) or failed to look at the indicated object (failed-look). We predicted that, if apes point primarily to direct attention, subjects would spend more time pointing in the failed-look condition because the goal of their gesture had not been met. Alternatively, we expected that, if apes point primarily to request an object, subjects would not differ in their pointing behavior between the successful-look and failed-look conditions because these conditions differed only in the looking behavior of the recipient. We found that subjects did differ in their pointing behavior across the successful-look and failed-look conditions, but contrary to our prediction subjects spent more time pointing in the successful-look condition. These results suggest that apes are sensitive to the attentional states of gestural recipients, but their adjustments are aimed at multiple goals. We also found a greater number of individuals with a strong right-hand than left-hand preference for pointing.

  4. Automatic markerless registration of point clouds with semantic-keypoint-based 4-points congruent sets

    Science.gov (United States)

    Ge, Xuming

    2017-08-01

    The coarse registration of point clouds from urban building scenes has become a key topic in applications of terrestrial laser scanning technology. Sampling-based algorithms in the random sample consensus (RANSAC) model have emerged as mainstream solutions to address coarse registration problems. In this paper, we propose a novel combined solution to automatically align two markerless point clouds from building scenes. Firstly, the method segments non-ground points from ground points. Secondly, the proposed method detects feature points from each cross section and then obtains semantic keypoints by connecting feature points with specific rules. Finally, the detected semantic keypoints from two point clouds act as inputs to a modified 4PCS algorithm. Examples are presented and the results compared with those of K-4PCS to demonstrate the main contributions of the proposed method, which are the extension of the original 4PCS to handle heavy datasets and the use of semantic keypoints to improve K-4PCS in relation to registration accuracy and computational efficiency.

  5. Point Cluster Analysis Using a 3D Voronoi Diagram with Applications in Point Cloud Segmentation

    Directory of Open Access Journals (Sweden)

    Shen Ying

    2015-08-01

    Full Text Available Three-dimensional (3D) point analysis and visualization is one of the most effective methods of point cluster detection and segmentation in geospatial datasets. However, serious scattering and clotting characteristics interfere with the visual detection of 3D point clusters. To overcome this problem, this study proposes the use of 3D Voronoi diagrams to analyze and visualize 3D points instead of the original data items. The proposed algorithm computes the cluster of 3D points by applying a set of 3D Voronoi cells to describe and quantify 3D points. The decomposition of the point cloud of 3D models is guided by the 3D Voronoi cell parameters. The parameter values are mapped from the Voronoi cells to 3D points to show the spatial pattern and relationships; thus, a 3D point cluster pattern can be highlighted and easily recognized. To capture different cluster patterns, continuous progressive clusters and segmentations are tested. The 3D spatial relationship is shown to facilitate cluster detection. Furthermore, the generated segmentations of real 3D data cases are exploited to demonstrate the feasibility of our approach in detecting different spatial clusters for continuous point cloud segmentation.
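
    A minimal sketch of the cell-based descriptor idea, assuming SciPy is available: each point is characterized by the volume of its 3D Voronoi cell, and small cells flag densely clustered points. Unbounded boundary cells are simply skipped here, and the progressive clustering and segmentation stages of the paper are not reproduced.

      import numpy as np
      from scipy.spatial import Voronoi, ConvexHull

      def voronoi_cell_volumes(points):
          """Volume of each point's 3D Voronoi cell (NaN for unbounded boundary cells)."""
          vor = Voronoi(points)
          volumes = np.full(len(points), np.nan)
          for i, region_index in enumerate(vor.point_region):
              region = vor.regions[region_index]
              if len(region) == 0 or -1 in region:    # -1 marks a vertex at infinity: unbounded cell
                  continue
              volumes[i] = ConvexHull(vor.vertices[region]).volume
          return volumes

      # Example: a dense cluster embedded among sparse background points.
      rng = np.random.default_rng(0)
      cluster = rng.normal(loc=0.0, scale=0.05, size=(200, 3))
      background = rng.uniform(low=-1.0, high=1.0, size=(200, 3))
      pts = np.vstack([cluster, background])

      vols = voronoi_cell_volumes(pts)
      # Small cell volumes indicate densely clustered points; thresholding them gives a crude segmentation.
      print("median cell volume, cluster vs background:",
            np.nanmedian(vols[:200]), np.nanmedian(vols[200:]))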

  6. SharePoint 2010 Field Guide

    CERN Document Server

    Mann, Steven; Gazmuri, Pablo; Caravajal, Steve; Wheeler, Christina

    2012-01-01

    Hands-on solutions for common SharePoint 2010 challenges. Aimed at the more than 100 million licensed SharePoint 2010 users, this indispensable field guide addresses an abundance of common SharePoint 2010 problems and offers proven solutions. A team of authors encourages you to customize SharePoint beyond the out-of-the-box functionality so that you can build more complex solutions to these challenges. You'll discover intricate details and specific full-scale solutions that you can then implement in your own SharePoint 2010 solutions. Tackles a variety of SharePoint 2010 problems ranging from si

  7. Professional SharePoint 2010 Administration

    CERN Document Server

    Klindt, Todd; Caravajal, Steve

    2010-01-01

    Thorough coverage of the improvements and changes to SharePoint 2010. SharePoint 2010 boasts a variety of incredible new features that will challenge even the most experienced administrator who is upgrading from SharePoint 2007. Written by a team of SharePoint experts, this book takes aim at showing you how to make these new features work right for you. Offering an in-depth look at SharePoint 2010, the authors focus on how SharePoint functionality has changed from its earliest version to its newest, and they provide you with detailed coverage of all the new features and capabilities.

  8. Professional SharePoint 2013 administration

    CERN Document Server

    Young, Shane; Klindt, Todd

    2013-01-01

    SharePoint admin author gurus return to prepare you for working with the new features of SharePoint 2013! The new iteration of SharePoint boasts exciting new features. However, any new version also comes with its fair share of challenges, and that's where this book comes in. The team of SharePoint admin gurus returns to present a fully updated resource that prepares you for making all the new SharePoint 2013 features work right. They cover all of the administration components of SharePoint 2013 in detail, and present a clear understanding of how they affect the role of the adminis

  9. Bacteriophages in food fermentations: new frontiers in a continuous arms race.

    Science.gov (United States)

    Samson, Julie E; Moineau, Sylvain

    2013-01-01

    Phage contamination represents an important risk to any process requiring bacterial growth, particularly in the biotechnology and food industries. The presence of unwanted phages may lead to manufacturing delays, lower quality product, or, in the worst cases, total production loss. Thus, constant phage monitoring and stringent application of the appropriate control measures are indispensable. In fact, a systematic preventive approach to phage contamination [phage analysis and critical control points (PACCP)] should be put in place. In this review, sources of phage contamination and novel phage detection methods are described, with an emphasis on bacterial viruses that infect lactic acid bacteria used in food fermentations. Recent discoveries related to antiphage systems that are changing our views on phage-host interactions are highlighted. Finally, future directions are also discussed.

  10. Interpreting OPERA results on superluminal neutrino

    CERN Document Server

    Giudice, Gian F; Strumia, Alessandro

    2012-01-01

    OPERA has claimed the discovery of superluminal propagation of neutrinos. We analyze the consistency of this claim with previous tests of special relativity. We find that reconciling the OPERA measurement with information from SN1987a and from neutrino oscillations requires stringent conditions. The superluminal limit velocity of neutrinos must be nearly flavor independent, must decrease steeply in the low-energy domain, and its energy dependence must depart from a simple power law. We construct illustrative models that satisfy these conditions, by introducing Lorentz violation in a sector with light sterile neutrinos. We point out that, quite generically, electroweak quantum corrections transfer the information of superluminal neutrino properties into Lorentz violations in the electron and muon sector, in apparent conflict with experimental data.

  11. Inspection activities of other strategic points (OSPs) at Rokkasho Reprocessing Plant

    International Nuclear Information System (INIS)

    Kaifuki, Yukinobu; Ebata, Takashi; Nakano, Sadayuki; Fujimaki, Kazunori

    2008-01-01

    At the Rokkasho Reprocessing Plant (RRP), an Active Test (AT) using actual spent fuels for the final confirmation of the equipment and the system has been performed since March 31, 2006, in preparation for commercial operation. Safeguards inspection during the AT is required in the same manner as under commercial operating conditions because plutonium is handled. In RRP, automated verification is established by using unattended verification systems, including a number of process monitoring systems, along the main plutonium handling process from the spent fuel storage to the MOX product storage. Even under these modernized safeguards, inspection activities at Other Strategic Points (OSPs) are required to confirm plant status in accordance with the requirements of the IAEA safeguards criteria. This paper presents the procedures and inspection activities at OSPs which have been implemented in RRP since the start of the AT. (author)

  12. 17 CFR 4.24 - General disclosures required.

    Science.gov (United States)

    2010-04-01

    ... tabular format, an analysis setting forth how the break-even point for the pool was calculated. The... feature becomes operative; and (3) Disclose, in the break-even analysis required by § 4.24(i)(6), the... page number) AND A STATEMENT OF THE PERCENTAGE RETURN NECESSARY TO BREAK EVEN, THAT IS, TO RECOVER THE...

  13. Effect of grain size on the melting point of confined thin aluminum films

    Energy Technology Data Exchange (ETDEWEB)

    Wejrzanowski, Tomasz; Lewandowska, Malgorzata; Sikorski, Krzysztof; Kurzydlowski, Krzysztof J. [Materials Design Division, Faculty of Materials Science and Engineering, Warsaw University of Technology, Woloska 141, 02-507 Warsaw (Poland)

    2014-10-28

    The melting of an aluminum thin film was studied by a molecular dynamics (MD) simulation technique. The effect of the grain size and type of confinement was investigated for an aluminum film with a constant thickness of 4 nm. The results show that coherent intercrystalline interfaces suppress the transition of solid aluminum into liquid, while a free surface gives melting-point depression. The mechanism of melting of a polycrystalline aluminum thin film was investigated. It was found that melting starts at grain boundaries and propagates to the grain interiors. The melting point was calculated from the Lindemann index criterion, taking into account only atoms near grain boundaries. This made it possible to extend melting-point calculations to bigger grains, which require a long time (on the MD scale) to become fully molten. The results show that a 4 nm thick film of aluminum melts at a temperature lower than the melting point of bulk aluminum (933 K) only when the grain size is reduced to 6 nm.
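
    The melting criterion mentioned is the Lindemann index. A minimal NumPy sketch of computing it from an MD trajectory is given below; the restriction to atoms near grain boundaries described in the record is not reproduced, and the synthetic trajectory and the delta ≈ 0.1 rule of thumb are illustrative assumptions.

      import numpy as np

      def lindemann_index(trajectory):
          """Lindemann index for a trajectory of shape (frames, atoms, 3).

          delta = 2 / (N (N - 1)) * sum_{i<j} sqrt(<r_ij^2>_t - <r_ij>_t^2) / <r_ij>_t
          """
          frames, n_atoms, _ = trajectory.shape
          diff = trajectory[:, :, None, :] - trajectory[:, None, :, :]
          dist = np.linalg.norm(diff, axis=-1)          # pairwise distances, per frame
          mean_d = dist.mean(axis=0)
          mean_d2 = (dist ** 2).mean(axis=0)
          iu = np.triu_indices(n_atoms, k=1)
          fluct = np.sqrt(np.maximum(mean_d2[iu] - mean_d[iu] ** 2, 0.0)) / mean_d[iu]
          return 2.0 * fluct.sum() / (n_atoms * (n_atoms - 1))

      # Illustrative use: atoms vibrating tightly about fixed sites give a small index;
      # a common rule of thumb places the solid-liquid transition near delta ~ 0.1.
      rng = np.random.default_rng(1)
      sites = rng.uniform(0.0, 10.0, size=(50, 3))
      solid_like = sites + rng.normal(scale=0.05, size=(200, 50, 3))
      print("Lindemann index (solid-like):", lindemann_index(solid_like))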

  14. I See Your Point: Infants under 12 Months Understand that Pointing Is Communicative

    Science.gov (United States)

    Krehm, Madelaine; Onishi, Kristine H.; Vouloumanos, Athena

    2014-01-01

    Do young infants understand that pointing gestures allow the pointer to change the information state of a recipient? We used a third-party experimental scenario to examine whether 9- and 11-month-olds understand that a pointer's pointing gesture can inform a recipient about a target object. When the pointer pointed to a target, infants…

  15. Test and evaluation of the Fort St. Vrain dew point moisture monitor system

    International Nuclear Information System (INIS)

    Block, G.A.; Del Bene, J.V. Jr.; Gitterman, M.; Hastings, G.A.; Hawkins, W.M.; Hinz, R.F.; McCue, D.E.; Swanson, L.L.; Vavrina, J.; Zwetzig, G.B.

    1975-01-01

    Descriptions are given of the Fort St. Vrain Dew Point Moisture Monitor (DPMM) System; the bases for the DPMM system response time requirements for safety related functions at the required reactor operating conditions; the results and evaluation of recent testing which measured the performance of the current system at simulated operating conditions; predicted response times for reactor power operation from 0 to 100 percent and a modification to provide improved response times for low-load and plant start-up conditions

  16. Transient Point Infiltration In The Unsaturated Zone

    Science.gov (United States)

    Buecker-Gittel, M.; Mohrlok, U.

    The risk assessment of leaking sewer pipes is becoming more and more important for urban groundwater management and for environmental as well as health safety. It requires the quantification and balancing of transport and transformation processes based on the water flow in the unsaturated zone. The water flow from a single sewer leakage can be described as a point infiltration with time-varying hydraulic conditions, both external and internal. External variations are caused by the discharge in the sewer pipe as well as by the state of the leakage itself; internal variations result from microbiological clogging effects associated with the transformation processes. Technical-scale as well as small-scale laboratory experiments were conducted in order to investigate the water transport from a transient point infiltration. The technical-scale experiment gave evidence that the water flow takes place under transient conditions when sewage infiltrates into an unsaturated soil, whereas the small-scale experiments investigated in detail the hydraulics of the water transport and the associated solute and particle transport in unsaturated soils. The small-scale experiment was a two-dimensional representation of such a point infiltration source in which the distributed water transport could be measured by several tensiometers in the soil as well as by a selective measurement of the discharge at the bottom of the experimental setup. Several series of experiments were conducted, varying the boundary and initial conditions, in order to derive the important parameters controlling the infiltration of pure water from the point source. The results showed a significant difference between the infiltration rate at the point source and the discharge rate at the bottom, which could be explained by storage processes due to an outflow resistance at the bottom. This effect is overlain by a water content that decreases over time, correlated with a decreasing infiltration

  17. Proposing a Holistic Model for Formulating the Security Requirements of e-learning based on Stakeholders’ Point of View

    Directory of Open Access Journals (Sweden)

    Abouzar Arabsorkhi Mishabi

    2016-03-01

    Full Text Available The development of e-learning applications and services in the context of information and communication networks (besides qualitative and quantitative improvements in the scope and range of the services they provide) has increased the variety of threats emerging from these networks and the telecommunications infrastructure. These issues have made effective and accurate analysis of security concerns necessary for managers and decision makers. Accordingly, this study uses the findings of other studies in the field of e-learning security and a meta-synthesis approach to define a holistic model for the classification and organization of security requirements: a structure that defines the origin of the security requirements of e-learning and serves as a reference for formulating security requirements in this area.

  18. Bubble-point and dew-point equation for binary refrigerant mixture R22-R142b

    Energy Technology Data Exchange (ETDEWEB)

    Liancheng Tan; Zhongyou Zhao; Yonghong Duan (Xi' an Jiaotong Univ., Xi' an (China). Dept. of Power Machinery Engineering)

    1992-01-01

    A bubble-point and dew-point equation (in terms either of temperature or of pressure) is suggested for the refrigerant mixture R22-R142b, which is regarded as one of the alternatives to R12. This equation has been examined against experimental data. A modified Rackett equation for the calculation of the bubble-point volume is also proposed. Compared with the experimental data, the rms errors in the calculated values of the bubble-point temperature, the dew-point temperature, and the bubble-point volume are 1.093%, 0.947%, and 1.120%, respectively. The calculation covers a wide range of temperatures and pressures, even near the critical point. It is shown how the equations can be extrapolated to calculate other binary refrigerant mixtures. (author)

  19. Effect of target color and scanning geometry on terrestrial LiDAR point-cloud noise and plane fitting

    Science.gov (United States)

    Bolkas, Dimitrios; Martinez, Aaron

    2018-01-01

    Point-cloud coordinate information derived from terrestrial Light Detection And Ranging (LiDAR) is important for several applications in surveying and civil engineering. Plane fitting and segmentation of target-surfaces is an important step in several applications such as in the monitoring of structures. Reliable parametric modeling and segmentation relies on the underlying quality of the point-cloud. Therefore, understanding how point-cloud errors affect fitting of planes and segmentation is important. Point-cloud intensity, which accompanies the point-cloud data, often goes hand-in-hand with point-cloud noise. This study uses industrial particle boards painted with eight different colors (black, white, grey, red, green, blue, brown, and yellow) and two different sheens (flat and semi-gloss) to explore how noise and plane residuals vary with scanning geometry (i.e., distance and incidence angle) and target-color. Results show that darker colors, such as black and brown, can produce point clouds that are several times noisier than bright targets, such as white. In addition, semi-gloss targets manage to reduce noise in dark targets by about 2-3 times. The study of plane residuals with scanning geometry reveals that, in many of the cases tested, residuals decrease with increasing incidence angles, which can assist in understanding the distribution of plane residuals in a dataset. Finally, a scheme is developed to derive survey guidelines based on the data collected in this experiment. Three examples demonstrate that users should consider instrument specification, required precision of plane residuals, required point-spacing, target-color, and target-sheen, when selecting scanning locations. Outcomes of this study can aid users to select appropriate instrumentation and improve planning of terrestrial LiDAR data-acquisition.
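
    The plane fitting and residual analysis described above can be illustrated with a standard least-squares (SVD) plane fit; the sketch below uses a synthetic planar target and is not the authors' processing chain.

      import numpy as np

      def fit_plane(points):
          """Least-squares plane through a point cloud; returns (centroid, unit normal, residuals)."""
          centroid = points.mean(axis=0)
          centered = points - centroid
          # The right singular vector with the smallest singular value is the plane normal.
          _, _, vt = np.linalg.svd(centered, full_matrices=False)
          normal = vt[-1]
          residuals = centered @ normal       # signed point-to-plane distances
          return centroid, normal, residuals

      # Synthetic "scan" of a planar target with range noise; a darker or more steeply scanned
      # target would correspond to a larger noise_sigma in this toy setup.
      rng = np.random.default_rng(2)
      xy = rng.uniform(-0.5, 0.5, size=(5000, 2))
      noise_sigma = 0.002
      z = rng.normal(scale=noise_sigma, size=5000)
      cloud = np.column_stack([xy, z])

      _, n, r = fit_plane(cloud)
      print("normal:", np.round(n, 3), " RMS plane residual [m]:", r.std())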

  20. A Review of Point-Wise Motion Tracking Algorithms for Fetal Magnetic Resonance Imaging.

    Science.gov (United States)

    Chikop, Shivaprasad; Koulagi, Girish; Kumbara, Ankita; Geethanath, Sairam

    2016-01-01

    We review recent feature-based tracking algorithms as applied to fetal magnetic resonance imaging (MRI). Motion in fetal MRI is an active and challenging area of research, but the challenge can be mitigated by strategies related to patient setup, acquisition, reconstruction, and image processing. We focus on fetal motion correction through methods based on tracking algorithms for registration of slices with similar anatomy in multiple volumes. We describe five motion detection algorithms based on corner detection and region-based methods through pseudocodes, illustrating the results of their application to fetal MRI. We compare the performance of these methods on the basis of error in registration and minimum number of feature points required for registration. Harris, a corner detection method, provides similar error when compared to the other methods and has the lowest number of feature points required at that error level. We do not discuss group-wise methods here. Finally, we attempt to communicate the application of available feature extraction methods to fetal MRI.
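
    Of the detectors compared in the review, Harris is the simplest to sketch. The response below is the standard R = det(M) - k * trace(M)^2 formulation computed with SciPy; the smoothing scale and k are illustrative assumptions, and the slice-registration step of the reviewed pipelines is not shown.

      import numpy as np
      from scipy import ndimage

      def harris_response(image, sigma=1.5, k=0.05):
          """Harris corner response R = det(M) - k * trace(M)^2 for a 2D grayscale image."""
          ix = ndimage.sobel(image, axis=1, mode="reflect")
          iy = ndimage.sobel(image, axis=0, mode="reflect")
          # Structure tensor components, smoothed over a Gaussian window.
          ixx = ndimage.gaussian_filter(ix * ix, sigma)
          iyy = ndimage.gaussian_filter(iy * iy, sigma)
          ixy = ndimage.gaussian_filter(ix * iy, sigma)
          det = ixx * iyy - ixy ** 2
          trace = ixx + iyy
          return det - k * trace ** 2

      # The corners of a bright square in a synthetic image give the strongest responses.
      img = np.zeros((64, 64))
      img[20:44, 20:44] = 1.0
      r = harris_response(img)
      corner_candidates = np.argwhere(r > 0.3 * r.max())
      print("number of candidate corner pixels:", len(corner_candidates))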

  1. Space telescope phase B definition study. Volume 2A: Science instruments, high speed point/area photometer

    Science.gov (United States)

    1976-01-01

    The analysis and preliminary design of a high speed point/area photometer for the space telescope are summarized. The scientific objectives, photometer requirements, and design concepts are presented.

  2. Leveraging finances for public health system improvement: results from the Turning Point initiative.

    Science.gov (United States)

    Bekemeier, Betty; Riley, Catharine M; Berkowitz, Bobbie

    2007-01-01

    Reforming the public health infrastructure requires substantial system changes at the state level; state health agencies, however, often lack the resources and support for strategic planning and systemwide improvement. The Turning Point Initiative provided support for states to focus on large-scale system changes that resulted in increased funding for public health capacity and infrastructure development. Turning Point provides a test case for obtaining financial and institutional resources focused on systems change and infrastructure development, areas for which it has historically been difficult to obtain long-term support. The purpose of this exploratory, descriptive survey research was to enumerate the actual resources leveraged toward public health system improvement through the partnerships, planning, and implementation activities funded by the Robert Wood Johnson Foundation as a part of the Turning Point Initiative.

  3. Maximum Power Point Tracking Control of Photovoltaic Systems: A Polynomial Fuzzy Model-Based Approach

    DEFF Research Database (Denmark)

    Rakhshan, Mohsen; Vafamand, Navid; Khooban, Mohammad Hassan

    2018-01-01

    This paper introduces a polynomial fuzzy model (PFM)-based maximum power point tracking (MPPT) control approach to increase the performance and efficiency of the solar photovoltaic (PV) electricity generation. The proposed method relies on a polynomial fuzzy modeling, a polynomial parallel......, a direct maximum power (DMP)-based control structure is considered for MPPT. Using the PFM representation, the DMP-based control structure is formulated in terms of SOS conditions. Unlike the conventional approaches, the proposed approach does not require exploring the maximum power operational point...

  4. Imaging study on acupuncture points

    Science.gov (United States)

    Yan, X. H.; Zhang, X. Y.; Liu, C. L.; Dang, R. S.; Ando, M.; Sugiyama, H.; Chen, H. S.; Ding, G. H.

    2009-09-01

    The topographic structures of acupuncture points were investigated by using the synchrotron radiation based Dark Field Image (DFI) method. The following four acupuncture points were studied: Sanyinjiao, Neiguan, Zusanli and Tianshu. We found that an accumulation of micro-vessels exists at the acupuncture point regions. Images taken in the surrounding tissue outside the acupuncture points do not show this kind of structure. This is the first time that the specific structure of acupuncture points has been revealed directly by X-ray imaging.

  5. Imaging study on acupuncture points

    International Nuclear Information System (INIS)

    Yan, X H; Zhang, X Y; Liu, C L; Dang, R S; Ando, M; Sugiyama, H; Chen, H S; Ding, G H

    2009-01-01

    The topographic structures of acupuncture points were investigated by using the synchrotron radiation based Dark Field Image (DFI) method. The following four acupuncture points were studied: Sanyinjiao, Neiguan, Zusanli and Tianshu. We found that an accumulation of micro-vessels exists at the acupuncture point regions. Images taken in the surrounding tissue outside the acupuncture points do not show this kind of structure. This is the first time that the specific structure of acupuncture points has been revealed directly by X-ray imaging.

  6. Magic Pointing for Eyewear Computers

    DEFF Research Database (Denmark)

    Jalaliniya, Shahram; Mardanbegi, Diako; Pederson, Thomas

    2015-01-01

    In this paper, we propose a combination of head and eye movements for touchlessly controlling the "mouse pointer" on eyewear devices, exploiting the speed of eye pointing and accuracy of head pointing. The method is a wearable computer-targeted variation of the original MAGIC pointing approach, which combined gaze tracking with a classical mouse device. The result of our experiment shows that the combination of eye and head movements is faster than head pointing for far targets and more accurate than eye pointing.

  7. Pointing movements both impair and improve visuospatial working memory depending on serial position.

    Science.gov (United States)

    Rossi-Arnaud, Clelia; Longobardi, Emiddia; Spataro, Pietro

    2017-08-01

    Two experiments investigated the effects of pointing movements on the item and order recall of random, horizontal, and vertical arrays consisting of 6 and 7 squares (Experiment 1) or 8 and 9 squares (Experiment 2). In the encoding phase, participants either viewed the items passively (passive-view condition) or pointed towards them (pointing condition). Then, after a brief interval, they were requested to recall the locations of the studied squares in the correct order of presentation. The critical result was that, for all types of arrays, the effects of the encoding condition varied as a function of serial position: for the initial and central positions accuracy was higher in the passive-view than in the pointing condition (confirming the standard inhibitory effect of pointing movements on visuospatial working memory), whereas the reverse pattern occurred in the final positions-showing a significant advantage of the pointing condition over the passive-view condition. Findings are interpreted as showing that pointing can have two simultaneous effects on the recall of spatial locations, a positive one due to the addition of a motor code and a negative one due to the attentional requirements of hand movements, with the net impact on serial recall depending on the amount of attention resources needed for the encoding of each position. Implications for the item-order hypothesis and the perceptual-gestural account of working memory are also discussed.

  8. Three Boundary Conditions for Computing the Fixed-Point Property in Binary Mixture Data.

    Directory of Open Access Journals (Sweden)

    Leendert van Maanen

    Full Text Available The notion of "mixtures" has become pervasive in behavioral and cognitive sciences, due to the success of dual-process theories of cognition. However, providing support for such dual-process theories is not trivial, as it crucially requires properties in the data that are specific to a mixture of cognitive processes. In theory, one such property could be the fixed-point property of binary mixture data, applied, for instance, to response times. In that case, the fixed-point property entails that response time distributions obtained in an experiment in which the mixture proportion is manipulated would have a common density point. In the current article, we discuss the application of the fixed-point property and identify three boundary conditions under which the fixed-point property will not be interpretable. In Boundary condition 1, a finding in support of the fixed point will be moot because of a lack of difference between conditions. Boundary condition 2 refers to the case in which the extreme conditions are so different that a mixture may display bimodality. In this case, a mixture hypothesis is clearly supported, yet the fixed point may not be found. In Boundary condition 3 the fixed point may also not be present, yet a mixture might still exist but is occluded due to additional changes in behavior. Finding the fixed-point property provides strong support for a dual-process account, yet the boundary conditions that we identify should be considered before making inferences about underlying psychological processes.
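
    For reference, the fixed-point property follows directly from the mixture form of the densities: if the two component densities agree at some time t*, then every mixture of them takes the same value there, regardless of the mixture proportion p,

      f_p(t) = p\,f_A(t) + (1-p)\,f_B(t) \quad\Longrightarrow\quad f_p(t^{*}) = f_A(t^{*}) \ \text{for all } p \ \text{whenever } f_A(t^{*}) = f_B(t^{*}),

    so response time densities obtained under different mixture proportions all cross at the common point t*.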

  9. Implementing EW Receivers Based on Large Point Reconfigured FFT on FPGA Platforms

    Directory of Open Access Journals (Sweden)

    He Chen

    2011-12-01

    Full Text Available This paper presents the design and implementation of a digital receiver based on a large-point fast Fourier transform (FFT) suitable for electronic warfare (EW) applications. When implementing the FFT algorithm on field-programmable gate array (FPGA) platforms, the primary goal is to maximize throughput and minimize area. The algorithm adopts a two-dimensional, parallel and pipelined streaming mode and implements reconfiguration of the FFT point size. Moreover, a double-sequence-separation FFT algorithm has been implemented in order to achieve faster real-time processing in broadband digital receivers. The performance of the hardware implementation of broadband digital receivers on FPGA platforms has been analyzed in depth. It meets the requirements of high-speed digital signal processing and illustrates the design of this kind of digital signal processing system on FPGA platforms. Keywords: digital receivers, field-programmable gate array (FPGA), fast Fourier transform (FFT), large point reconfigured, signal processing system.
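
    The "double-sequence-separation" FFT is read here as the classic trick of packing two real sequences into one complex FFT and separating their spectra by conjugate symmetry; whether this matches the authors' hardware design is an assumption. A NumPy sketch:

      import numpy as np

      def fft_two_real_sequences(x, y):
          """Compute the DFTs of two real sequences with a single complex FFT."""
          x = np.asarray(x, dtype=float)
          y = np.asarray(y, dtype=float)
          n = len(x)
          z = np.fft.fft(x + 1j * y)
          z_rev = np.conj(z[(-np.arange(n)) % n])   # conj(Z[N-k]), with the k = 0 bin kept in place
          x_fft = 0.5 * (z + z_rev)
          y_fft = -0.5j * (z - z_rev)
          return x_fft, y_fft

      rng = np.random.default_rng(3)
      a, b = rng.normal(size=1024), rng.normal(size=1024)
      A, B = fft_two_real_sequences(a, b)
      print(np.allclose(A, np.fft.fft(a)), np.allclose(B, np.fft.fft(b)))   # True True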

  10. Heaving buoys, point absorbers and arrays.

    Science.gov (United States)

    Falnes, Johannes; Hals, Jørgen

    2012-01-28

    Absorption of wave energy may be considered as a phenomenon of interference between incident and radiated waves generated by an oscillating object; a wave-energy converter (WEC) that displaces water. If a WEC is very small in comparison with one wavelength, it is classified as a point absorber (PA); otherwise, as a 'quasi-point absorber'. The latter may be a dipole-mode radiator, for instance an immersed body oscillating in the surge mode or pitch mode, while a PA is so small that it should preferably be a source-mode radiator, for instance a heaving semi-submerged buoy. The power take-off capacity, the WEC's maximum swept volume and preferably also its full physical volume should be reasonably matched to the wave climate. To discuss this matter, two different upper bounds for absorbed power are applied in a 'Budal diagram'. It appears that, for a single WEC unit, a power capacity of only about 0.3 MW matches well to a typical offshore wave climate, and the full physical volume has, unfortunately, to be significantly larger than the swept volume, unless phase control is used. An example of a phase-controlled PA is presented. For a sizeable wave-power plant, an array consisting of hundreds, or even thousands, of mass-produced WEC units is required.

  11. Remediation of a former tank farm : Saviktok Point, NWT

    Energy Technology Data Exchange (ETDEWEB)

    Gingras, P. [Biogenie Inc., Quebec City, PQ (Canada)

    2008-07-01

    A former tank farm was remediated at Saviktok Point, Northwest Territories (NWT). This presentation discussed the site characteristics and presented several photographs of the tank farm location. The remote location did not have any source of electrical power and was accessible only by sea. It had limited availability of equipment, materials and manpower. The preferred solution for the hydrocarbon contamination was biological treatment, which requires oxygen gas to maximize the degradation of contaminants. Other key aspects of biological treatment include the need for heat to sustain microbial activity; the use of nitrogen and phosphorus; neutral pH; and loose structure and moisture content. Several photographs were provided to illustrate treatment technologies; bench scale trials; and the use of wind turbines for soil aeration. A chart that demonstrated bioremediation efficiency at Saviktok Point was presented. The presentation revealed that over a 3-season period, 17,000 cubic metres were treated to NWT industrial standards. The average temperature during treatment was 30 degrees Celsius and soils were recycled as landfill cover material. The presentation concluded with a discussion of the benefits achieved at Saviktok Point, such as the minimization of soil handling; utilization of a wind-powered aeration system; adaptation of the biological treatment design to site-specific conditions; and maximum use of local resources. figs.

  12. Point-Structured Human Body Modeling Based on 3D Scan Data

    Directory of Open Access Journals (Sweden)

    Ming-June Tsai

    2018-01-01

    Full Text Available A novel point-structured geometrical model for a realistic human body is introduced in this paper. The technique is based on feature extraction from 3D body scan data. Anatomic features such as the neck, the armpits, the crotch points, and other major feature points are recognized. The body data is then segmented into 6 major parts. A body model is then constructed by re-sampling the scanned data to create a point-structured mesh. The body model contains body geodetic landmarks in latitudinal and longitudinal curves passing through those feature points. The body model preserves the exact body shape and all the body dimensions but requires little storage space. Therefore, the body model can be used as a mannequin in the garment industry, or as a manikin in various human factors designs, but the most important application is to use it as a virtual character to animate body motion in mocap (motion capture) systems. By adding suitable joint freedoms between the segmented body links, kinematic and dynamic properties from motion theory can be applied to the body model. As a result, a 3D virtual character that fully resembles the originally scanned individual vividly animates the body motions. The gaps between the body segments that arise during motion can be filled by a skin-blending technique using the characteristics of the point-structured model. The model has the potential to serve as a standardized datatype to archive body information for custom-made products.

  13. Ultra-Low Noise Quad Photoreceiver for Space Based Laser Interferometric Gravity Wave Detection, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Gravity wave detection using space-based long-baseline laser interferometric sensors imposes stringent noise requirements on the system components, including the...

  14. Effect of saddle-point anisotropy on point-defect drift-diffusion into straight dislocations

    International Nuclear Information System (INIS)

    Skinner, B.C.; Woo, C.H.

    1983-02-01

    Effects on point-defect drift-diffusion in the strain fields of edge or screw dislocations, due to the anisotropy of the point defect in its saddle-point configuration, are investigated. Expressions for sink strength and bias that include the saddle-point shape effect are derived, both in the absence and presence of an externally applied stress. These are found to depend on intrinsic parameters such as the relaxation volume and the saddle-point shape of the point defects, and extrinsic parameters such as temperature and the magnitude and direction of the externally applied stress with respect to the line direction and Burgers vector direction of the dislocation. The theory is applied to fcc copper and bcc iron. It is found that screw dislocations are biased sinks and that the stress-induced bias differential for the edge dislocations depends much more on the line direction than the Burgers vector direction. Comparison with the stress-induced bias differential due to the usual SIPA effect is made. It is found that the present effect causes a bias differential that is more than an order of magnitude larger

  15. Handbook of floating-point arithmetic

    CERN Document Server

    Muller, Jean-Michel; de Dinechin, Florent; Jeannerod, Claude-Pierre; Joldes, Mioara; Lefèvre, Vincent; Melquiond, Guillaume; Revol, Nathalie; Torres, Serge

    2018-01-01

    This handbook is a definitive guide to the effective use of modern floating-point arithmetic, which has considerably evolved, from the frequently inconsistent floating-point number systems of early computing to the recent IEEE 754-2008 standard. Most of computational mathematics depends on floating-point numbers, and understanding their various implementations will allow readers to develop programs specifically tailored for the standard’s technical features. Algorithms for floating-point arithmetic are presented throughout the book and illustrated where possible by example programs which show how these techniques appear in actual coding and design. The volume itself breaks its core topic into four parts: the basic concepts and history of floating-point arithmetic; methods of analyzing floating-point algorithms and optimizing them; implementations of IEEE 754-2008 in hardware and software; and useful extensions to the standard floating-point system, such as interval arithmetic, double- and triple-word arithm...

  16. Solving Singular Two-Point Boundary Value Problems Using Continuous Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Omar Abu Arqub

    2012-01-01

    Full Text Available In this paper, the continuous genetic algorithm is applied to the solution of singular two-point boundary value problems, where smooth solution curves are used throughout the evolution of the algorithm to obtain the required nodal values. The proposed technique might be considered as a variation of the finite difference method in the sense that each of the derivatives is replaced by an appropriate difference quotient approximation. This novel approach possesses several advantages: it can be applied without any limitation on the nature of the problem, the type of singularity, or the number of mesh points. Numerical examples are included to demonstrate the accuracy, applicability, and generality of the presented technique. The results reveal that the algorithm is very effective, straightforward, and simple.
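
    For comparison, the standard finite-difference treatment the abstract alludes to (each derivative replaced by a difference quotient) reduces a two-point boundary value problem to a tridiagonal linear system. The sketch below solves a simple non-singular test equation and is not the genetic algorithm itself.

      import numpy as np

      def solve_bvp_fd(f, a, b, ya, yb, n=100):
          """Solve y'' = f(x) on [a, b] with y(a) = ya, y(b) = yb by central finite differences."""
          x = np.linspace(a, b, n + 1)
          h = (b - a) / n
          # Tridiagonal system for the interior nodes.
          main = -2.0 * np.ones(n - 1)
          off = np.ones(n - 2)
          A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
          rhs = h ** 2 * f(x[1:-1])
          rhs[0] -= ya
          rhs[-1] -= yb
          y = np.empty(n + 1)
          y[0], y[-1] = ya, yb
          y[1:-1] = np.linalg.solve(A, rhs)
          return x, y

      # Test problem with known solution y = sin(pi x).
      x, y = solve_bvp_fd(lambda t: -np.pi ** 2 * np.sin(np.pi * t), 0.0, 1.0, 0.0, 0.0)
      print("max error:", np.max(np.abs(y - np.sin(np.pi * x))))   # second order, roughly 1e-4 here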

  17. Solving differential equations for Feynman integrals by expansions near singular points

    Science.gov (United States)

    Lee, Roman N.; Smirnov, Alexander V.; Smirnov, Vladimir A.

    2018-03-01

    We describe a strategy to solve differential equations for Feynman integrals by powers series expansions near singular points and to obtain high precision results for the corresponding master integrals. We consider Feynman integrals with two scales, i.e. non-trivially depending on one variable. The corresponding algorithm is oriented at situations where canonical form of the differential equations is impossible. We provide a computer code constructed with the help of our algorithm for a simple example of four-loop generalized sunset integrals with three equal non-zero masses and two zero masses. Our code gives values of the master integrals at any given point on the real axis with a required accuracy and a given order of expansion in the regularization parameter ɛ.

  18. Energy-dependent expansion of .177 caliber hollow-point air gun projectiles.

    Science.gov (United States)

    Werner, Ronald; Schultz, Benno; Bockholdt, Britta; Ekkernkamp, Axel; Frank, Matthias

    2017-05-01

    Amongst hundreds of different projectiles for air guns available on the market, hollow-point air gun pellets are of special interest. These pellets are characterized by a tip or a hollowed-out shape in their tip which, when fired, makes the projectiles expand to an increased diameter upon entering the target medium. This results in an increase in release of energy which, in turn, has the potential to cause more serious injuries than non-hollow-point projectiles. To the best of the authors' knowledge, reliable data on the terminal ballistic features of hollow-point air gun projectiles compared to standard diabolo pellets have not yet been published in the forensic literature. The terminal ballistic performance (energy-dependent expansion and penetration) of four different types of .177 caliber hollow-point pellets discharged at kinetic energy levels from approximately 3 J up to 30 J into water, ordnance gelatin, and ordnance gelatin covered with natural chamois as a skin simulant was the subject of this investigation. Energy-dependent expansion of the tested hollow-point pellets was observed after being shot into all investigated target media. While some hollow-point pellets require a minimum kinetic energy of approximately 10 J for sufficient expansion, there are also hollow-point pellets which expand at kinetic energy levels of less than 5 J. The ratio of expansion (RE, calculated by the cross-sectional area (A) after impact divided by the cross-sectional area (A 0 ) of the undeformed pellet) of hollow-point air gun pellets reached values up of to 2.2. The extent of expansion relates to the kinetic energy of the projectile with a peak for pellet expansion at the 15 to 20 J range. To conclude, this work demonstrates that the hollow-point principle, i.e., the design-related enlargement of the projectiles' frontal area upon impact into a medium, does work in air guns as claimed by the manufacturers.
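
    A worked example of the expansion ratio defined above; .177 caliber corresponds to a nominal bore of 4.5 mm, and the expanded diameter used here is illustrative, back-calculated from the reported peak ratio:

      \mathrm{RE} = \frac{A}{A_0} = \left(\frac{d}{d_0}\right)^{2}, \qquad d_0 = 4.5\ \mathrm{mm},\ d \approx 6.7\ \mathrm{mm} \;\Longrightarrow\; \mathrm{RE} \approx \left(\tfrac{6.7}{4.5}\right)^{2} \approx 2.2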

  19. Requirements for electricity producing gas-cooled reactors in the Federal Republic of Germany

    International Nuclear Information System (INIS)

    Schwarz, D.K.J.

    1989-01-01

    The paper describes requirements for a high-temperature gas-cooled reactor from the viewpoint of a utility in the Federal Republic of Germany. The requirements presented in the paper address different areas, including plant size, availability, safety and economics. (author)

  20. Modeling hard clinical end-point data in economic analyses.

    Science.gov (United States)

    Kansal, Anuraag R; Zheng, Ying; Palencia, Roberto; Ruffolo, Antonio; Hass, Bastian; Sorensen, Sonja V

    2013-11-01

    The availability of hard clinical end-point data, such as that on cardiovascular (CV) events among patients with type 2 diabetes mellitus, is increasing, and as a result there is growing interest in using hard end-point data of this type in economic analyses. This study investigated published approaches for modeling hard end-points from clinical trials and evaluated their applicability in health economic models with different disease features. A review of cost-effectiveness models of interventions in clinically significant therapeutic areas (CV diseases, cancer, and chronic lower respiratory diseases) was conducted in PubMed and Embase using a defined search strategy. Only studies integrating hard end-point data from randomized clinical trials were considered. For each study included, clinical input characteristics and modeling approach were summarized and evaluated. A total of 33 articles (23 CV, eight cancer, two respiratory) were accepted for detailed analysis. Decision trees, Markov models, discrete event simulations, and hybrids were used. Event rates were incorporated either as constant rates, time-dependent risks, or risk equations based on patient characteristics. Risks dependent on time and/or patient characteristics were used where major event rates were >1%/year in models with fewer health states. Models of infrequent events or with numerous health states generally preferred constant event rates. The detailed modeling information and terminology varied, sometimes requiring interpretation. Key considerations for cost-effectiveness models incorporating hard end-point data include the frequency and characteristics of the relevant clinical events and how the trial data is reported. When event risk is low, simplification of both the model structure and event rate modeling is recommended. When event risk is common, such as in high-risk populations, more detailed modeling approaches, including individual simulations or explicitly time-dependent event rates, are recommended.
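
    One conversion that routinely appears in such models (a general modeling convention rather than something specific to this review) turns a constant event rate r into a per-cycle transition probability:

      p_{\mathrm{cycle}} = 1 - e^{-r\,\Delta t}, \qquad \text{e.g. } r = 0.02\ \mathrm{yr}^{-1},\ \Delta t = 1\ \mathrm{yr} \;\Longrightarrow\; p_{\mathrm{cycle}} \approx 0.0198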

  1. Quantifying Biomass from Point Clouds by Connecting Representations of Ecosystem Structure

    Science.gov (United States)

    Hendryx, S. M.; Barron-Gafford, G.

    2017-12-01

    Quantifying terrestrial ecosystem biomass is an essential part of monitoring carbon stocks and fluxes within the global carbon cycle and optimizing natural resource management. Point cloud data such as from lidar and structure from motion can be effective for quantifying biomass over large areas, but significant challenges remain in developing effective models that allow for such predictions. Inference models that estimate biomass from point clouds are established in many environments, yet, are often scale-dependent, needing to be fitted and applied at the same spatial scale and grid size at which they were developed. Furthermore, training such models typically requires large in situ datasets that are often prohibitively costly or time-consuming to obtain. We present here a scale- and sensor-invariant framework for efficiently estimating biomass from point clouds. Central to this framework, we present a new algorithm, assignPointsToExistingClusters, that has been developed for finding matches between in situ data and clusters in remotely-sensed point clouds. The algorithm can be used for assessing canopy segmentation accuracy and for training and validating machine learning models for predicting biophysical variables. We demonstrate the algorithm's efficacy by using it to train a random forest model of above ground biomass in a shrubland environment in Southern Arizona. We show that by learning a nonlinear function to estimate biomass from segmented canopy features we can reduce error, especially in the presence of inaccurate clusterings, when compared to a traditional, deterministic technique to estimate biomass from remotely measured canopies. Our random forest on cluster features model extends established methods of training random forest regressions to predict biomass of subplots but requires significantly less training data and is scale invariant. The random forest on cluster features model reduced mean absolute error, when evaluated on all test data in leave
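
    The matching step named assignPointsToExistingClusters is not specified in the abstract; one plausible minimal implementation, assuming SciPy and a nearest-centroid rule with a distance cutoff (both assumptions), is sketched below.

      import numpy as np
      from scipy.spatial import cKDTree

      def assign_points_to_existing_clusters(in_situ_points, cluster_centroids, max_distance=1.0):
          """Match field-measured plant locations to segmented canopy clusters.

          Returns an array of cluster indices, with -1 where no centroid lies within max_distance.
          """
          tree = cKDTree(cluster_centroids)
          dist, idx = tree.query(in_situ_points, distance_upper_bound=max_distance)
          return np.where(np.isinf(dist), -1, idx)    # unmatched queries come back with dist = inf

      # Illustrative use: three segmented canopies, four field-surveyed shrubs (last one unmatched).
      centroids = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])
      shrubs = np.array([[0.2, -0.1], [5.3, 4.8], [9.7, 0.4], [20.0, 20.0]])
      print(assign_points_to_existing_clusters(shrubs, centroids))   # [ 0  1  2 -1]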

  2. Indexing Moving Points

    DEFF Research Database (Denmark)

    Agarwal, Pankaj K.; Arge, Lars Allan; Erickson, Jeff

    2003-01-01

    We propose three indexing schemes for storing a set S of N points in the plane, each moving along a linear trajectory, so that any query of the following form can be answered quickly: Given a rectangle R and a real value t, report all K points of S that lie inside R at time t. We first present an...

  3. 30 CFR 75.507-1 - Electric equipment other than power-connection points; outby the last open crosscut; return air...

    Science.gov (United States)

    2010-07-01

    ... points; outby the last open crosscut; return air; permissibility requirements. 75.507-1 Section 75.507-1... other than power-connection points; outby the last open crosscut; return air; permissibility... permit for noncompliance may be used in return air outby the last open crosscut for the duration of such...

  4. Inhibition effect of calcium hydroxide point and chlorhexidine point on root canal bacteria of necrosis teeth

    Directory of Open Access Journals (Sweden)

    Andry Leonard Je

    2006-03-01

    Full Text Available Calcium hydroxide points and chlorhexidine points are new drugs for eliminating bacteria in the root canal. The points release calcium hydroxide and chlorhexidine slowly and in a controlled manner into the root canal. The purpose of the study was to determine the effectiveness of calcium hydroxide points (Calcium hydroxide plus points) and chlorhexidine points in eliminating the root canal bacteria of necrotic teeth. In this study 14 subjects were divided into 2 groups. The first group was treated with calcium hydroxide points and the second was treated with chlorhexidine points. The bacteriological samples were measured with spectrophotometry. The paired t-test analysis (before and after) showed a significant difference in both the first and the second group. The independent t-test, which analysed the effectiveness of the two groups against each other, did not show a significant difference. Although there was no significant difference in the statistical test, the second group eliminated more bacteria than the first group. The present findings indicate that the use of chlorhexidine points was better than calcium hydroxide points over a seven-day period. The conclusion is that chlorhexidine points and calcium hydroxide points as root canal medicaments effectively eliminate the root canal bacteria of necrotic teeth.

  5. Pro SharePoint 2013 administration

    CERN Document Server

    Garrett, Robert

    2013-01-01

    Pro SharePoint 2013 Administration is a practical guide to SharePoint 2013 for intermediate to advanced SharePoint administrators and power users, covering the out-of-the-box feature set and capabilities of Microsoft's collaboration and business productivity platform. SharePoint 2013 is an incredibly complex product, with many moving parts, new features, best practices, and 'gotchas.' Author Rob Garrett distills SharePoint's portfolio of features, capabilities, and utilities into an in-depth professional guide-with no fluff and copious advice-that is designed from scratch to be the manual Micr

  6. Shifting between Third and First Person Points of View in EFL Narratives

    Science.gov (United States)

    Shokouhi, Hossein; Daram, Mahmood; Sabah, Somayeh

    2011-01-01

    This article reports on the difference between points of view in narrating a short story. The EFL learners taking part in the control group were required to recount the events from the third person perspective and the subjects in the experimental group from the first person perspective. The methodological frame of the study was based on Koven's…

  7. Homotopy analysis solutions of point kinetics equations with one delayed precursor group

    International Nuclear Information System (INIS)

    Zhu Qian; Luo Lei; Chen Zhiyun; Li Haofeng

    2010-01-01

    The homotopy analysis method is proposed to obtain series solutions of nonlinear differential equations. It was applied to the point kinetics equations with one delayed precursor group. Analytic solutions were obtained using the homotopy analysis method, and the algorithm was analysed. The results show that the computation time and precision of the algorithm meet engineering requirements. (authors)
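
    For reference, the point kinetics equations with one delayed precursor group in their standard form (n: neutron density, C: precursor concentration, rho: reactivity, beta: delayed neutron fraction, Lambda: neutron generation time, lambda: precursor decay constant) read

      \frac{dn}{dt} = \frac{\rho(t) - \beta}{\Lambda}\, n(t) + \lambda\, C(t),
      \qquad
      \frac{dC}{dt} = \frac{\beta}{\Lambda}\, n(t) - \lambda\, C(t).

    With a prescribed rho(t) the system is linear; it becomes nonlinear when reactivity feedback couples rho to the neutron density.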

  8. Geodetic Control Points - Multi-State Control Point Database

    Data.gov (United States)

    NSGIC State | GIS Inventory — The Multi-State Control Point Database (MCPD) is a database of geodetic and mapping control covering Idaho and Montana. The control were submitted by registered land...

  9. Molecular symmetry: Why permutation-inversion (PI) groups don't render the point groups obsolete

    Science.gov (United States)

    Groner, Peter

    2018-01-01

    The analysis of spectra of molecules with internal large-amplitude motions (LAMs) requires molecular symmetry (MS) groups that are larger than and significantly different from the more familiar point groups. MS groups are described often by the permutation-inversion (PI) group method. It is shown that point groups still can and should play a significant role together with the PI groups for a class of molecules with internal rotors. In molecules of this class, several simple internal rotors are attached to a rigid molecular frame. The PI groups for this class are semidirect products like H ^ F, where the invariant subgroup H is a direct product of cyclic groups and F is a point group. This result is used to derive meaningful labels for MS groups, and to derive correlation tables between MS groups and point groups. MS groups of this class have many parallels to space groups of crystalline solids.

  10. Point-of-Care Test Equipment for Flexible Laboratory Automation.

    Science.gov (United States)

    You, Won Suk; Park, Jae Jun; Jin, Sung Moon; Ryew, Sung Moo; Choi, Hyouk Ryeol

    2014-08-01

    Blood tests are some of the core clinical laboratory tests for diagnosing patients. In hospitals, an automated process called total laboratory automation, which relies on a set of sophisticated equipment, is normally adopted for blood tests. Noting that the total laboratory automation system typically requires a large footprint and significant amount of power, slim and easy-to-move blood test equipment is necessary for specific demands such as emergency departments or small-size local clinics. In this article, we present a point-of-care test system that can provide flexibility and portability with low cost. First, the system components, including a reagent tray, dispensing module, microfluidic disk rotor, and photometry scanner, and their functions are explained. Then, a scheduler algorithm to provide a point-of-care test platform with an efficient test schedule to reduce test time is introduced. Finally, the results of diagnostic tests are presented to evaluate the system. © 2014 Society for Laboratory Automation and Screening.

  11. Line of Sight Stabilization of James Webb Space Telescope

    Science.gov (United States)

    Meza, Luis; Tung, Frank; Anandakrishnan, Satya; Spector, Victor; Hyde, Tupper

    2005-01-01

    The James Webb Space Telescope (JWST) builds upon the successful flight experience of the Chandra X-ray Telescope by incorporating an additional LOS pointing servo to meet the more stringent pointing requirements. The LOS pointing servo, referred to in JWST as the Fine Guidance Control System (FGCS), will utilize a Fine Guidance Sensor (FGS) as the sensor, and a Fine Steering Mirror (FSM) as the actuator. The FSM is a part of the Optical Telescope Element (OTE) and is in the optical path between the tertiary mirror and the instrument focal plane, while the FGS is part of the Integrated Science Instrument Module (ISIM). The basic Chandra spacecraft bus attitude control and determination architecture, utilizing gyros, star trackers/aspect camera, and reaction wheels, is retained for JWST. This system has achieved pointing stability of better than 0.5 arcseconds. To reach the JWST requirements of milli-arcsecond pointing stability with this ACS hardware, the local FGCS loop is added to the optical path. The FGCS bandwidth is about 2.0 Hz and will therefore attenuate much of the spacecraft ACS induced low frequency jitter. In order to attenuate the higher frequency (greater than 2.0 Hz) disturbances associated with reaction wheel static and dynamic imbalances, as well as bearing run-out, JWST will employ a two-stage passive vibration isolation system consisting of (1) 7.0 Hz reaction wheel isolators between each reaction wheel and the spacecraft bus, and (2) a 1.0 Hz tower isolator between the spacecraft bus and the Optical Telescope Element (OTE). In order to sense and measure the LOS, the FGS behaves much like an autonomous star tracker that has a very small field of view and uses the optics of the telescope. It performs the functions of acquisition, identification and tracking of stars in its 2.5 x 2.5 arcminute field of view (FOV), and provides the centroid and magnitude of the selected star for use in LOS control. However, since only a single star is being tracked

  12. eLISA Telescope In-field Pointing and Scattered Light Study

    Science.gov (United States)

    Livas, J.; Sankar, S.; West, G.; Seals, L.; Howard, J.; Fitzsimons, E.

    2017-05-01

    The orbital motion of the three spacecraft that make up the eLISA Observatory constellation causes long-arm line of sight variations of approximately ± one degree over the course of a year. The baseline solution is to package the telescope, the optical bench, and the gravitational reference sensor (GRS) into an optical assembly at each end of the measurement arm, and then to articulate the assembly. An optical phase reference is exchanged between the moving optical benches with a single mode optical fiber (“backlink” fiber). An alternative solution, referred to as in-field pointing, embeds a steering mirror into the optical design, fixing the optical benches and eliminating the backlink fiber, but requiring the additional complication of a two-stage optical design for the telescope. We examine the impact of an in-field pointing design on the scattered light performance.

  13. Hinkley Point 'C' power station public inquiry: statement of case

    International Nuclear Information System (INIS)

    1988-08-01

    This Statement of Case contains full particulars of the case which the Central Electricity Generating Board (CEGB) proposes to put forward at the Hinkley Point 'C' Inquiry. It relates to the planning application made by the CEGB for the construction of a 1200 MW Pressurized Water Reactor (PWR) power station at Hinkley Point in the United Kingdom, adjacent to an existing nuclear power station. The inquiry will consider economic, safety, environmental and planning matters relevant to the application and the implications for agriculture and local amenities of re-aligning two power transmission lines. The Statement contains submissions on the following matters: Topic 1 The Requirement for the Station; Topic 2 Safety and Design, including Radioactive Discharges; Topic 3 The On-Site Management of Radioactive Waste and Decommissioning of the Station; Topic 4 Emergency Arrangements; Topic 5 Local and Environmental Issues. (author)

  14. Analytical Performance Requirements for Systems for Self-Monitoring of Blood Glucose With Focus on System Accuracy: Relevant Differences Among ISO 15197:2003, ISO 15197:2013, and Current FDA Recommendations.

    Science.gov (United States)

    Freckmann, Guido; Schmid, Christina; Baumstark, Annette; Rutschmann, Malte; Haug, Cornelia; Heinemann, Lutz

    2015-07-01

    In the European Union (EU), the ISO (International Organization for Standardization) 15197 standard is applicable for the evaluation of systems for self-monitoring of blood glucose (SMBG) before market approval. In 2013, a revised version of this standard was published. Relevant revisions in the analytical performance requirements are the inclusion of the evaluation of influence quantities, for example, hematocrit, and some changes in the testing procedures for measurement precision and system accuracy evaluation, for example, the number of test strip lots. Regarding system accuracy evaluation, the most important change is the inclusion of more stringent accuracy criteria. In 2014, the Food and Drug Administration (FDA) in the United States published its own guidance document for the premarket evaluation of SMBG systems with even more stringent system accuracy criteria than stipulated by ISO 15197:2013. The establishment of strict accuracy criteria applicable to the premarket evaluation is a possible approach to further improve the measurement quality of SMBG systems. However, the system accuracy testing procedure is quite complex, and some critical aspects, for example, a systematic measurement difference between the reference measurement procedure and a higher-order procedure, may potentially limit the apparent accuracy of a given system. Therefore, the implementation of a harmonized reference measurement procedure for which traceability to standards of higher order is verified through an unbroken, documented chain of calibrations is desirable. In addition, the establishment of regular and standardized post-marketing evaluations of distributed test strip lots should be considered as an approach toward an improved measurement quality of available SMBG systems. © 2015 Diabetes Technology Society.
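
    As a concrete illustration of what a system-accuracy criterion means in practice, the sketch below checks measurements against the commonly cited ISO 15197:2013 thresholds (at least 95% of results within ±15 mg/dL of the reference below 100 mg/dL, or within ±15% at or above 100 mg/dL); treat the exact thresholds as an assumption of this sketch and consult the standard itself before relying on them.

      # Hedged sketch of an ISO 15197:2013-style accuracy check (thresholds assumed).
      def within_limits(measured_mg_dl, reference_mg_dl):
          if reference_mg_dl < 100.0:
              return abs(measured_mg_dl - reference_mg_dl) <= 15.0
          return abs(measured_mg_dl - reference_mg_dl) <= 0.15 * reference_mg_dl

      def passes_accuracy_criterion(pairs, required_fraction=0.95):
          """pairs: iterable of (measured, reference) glucose values in mg/dL."""
          results = [within_limits(m, r) for m, r in pairs]
          return sum(results) / len(results) >= required_fraction

      if __name__ == "__main__":
          data = [(92, 85), (110, 100), (250, 240), (60, 80), (172, 160)]
          print(passes_accuracy_criterion(data))   # False: (60, 80) is 20 mg/dL off, over the 15 mg/dL limit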

  15. 46 CFR 115.801 - Notice of inspection deficiencies and requirements.

    Science.gov (United States)

    2010-10-01

    ... the requirements of law or the regulations in this subchapter, the marine inspector will point out... presents a serious safety hazard to the vessel or its passengers or crew, and exists through negligence or...

  16. Modelling point patterns with linear structures

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    2009-01-01

    processes whose realizations contain such linear structures. Such a point process is constructed sequentially by placing one point at a time. The points are placed in such a way that new points are often placed close to previously placed points, and the points form roughly line shaped structures. We...... consider simulations of this model and compare with real data....
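
    A toy simulation (not the specific construction studied in the paper) can illustrate the sequential idea: with some probability each new point continues from the most recent point along a slowly varying direction, otherwise it starts a new structure at a uniform location, so the realization contains roughly line-shaped clusters of points.

      # Toy sequential point process with line-shaped structures; all parameters assumed.
      import math, random

      def simulate(n_points, p_continue=0.85, step=0.02, turn_sd=0.15, seed=1):
          random.seed(seed)
          points, direction = [], random.uniform(0.0, 2.0 * math.pi)
          for _ in range(n_points):
              if points and random.random() < p_continue:
                  x, y = points[-1]                        # extend the current structure
                  direction += random.gauss(0.0, turn_sd)  # small change of heading
                  new = (x + step * math.cos(direction), y + step * math.sin(direction))
              else:
                  new = (random.random(), random.random()) # start a new structure
                  direction = random.uniform(0.0, 2.0 * math.pi)
              points.append(new)
          return points

      if __name__ == "__main__":
          print(simulate(500)[:5])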

  18. Extractive Sampling and Optical Remote Sensing of F-100 Aircraft Engine Emissions (PREPRINT)

    National Research Council Canada - National Science Library

    Cowen, Kenneth; Goodwin, Bradley; Satola, Jan; Kagann, Robert; Hashmonay, Ram; Spicer, Chester; Holdren, Michael; Mayfield, Howard T

    2008-01-01

    ... from military aircraft, in order to meet increasingly stringent regulatory requirements. This paper describes the results of a recent field study using extractive and optical remote sensing (ORS...

  19. National waste terminal storage program. Supplementary quality-assurance requirements

    International Nuclear Information System (INIS)

    Garland, D.L.

    1980-01-01

    The basic Quality Assurance Program Requirements standard for the National Waste Terminal Storage Program has been developed primarily for nuclear reactors and other fairly well established nuclear facilities. In the case of waste isolation, however, there are many ongoing investigations for which quality assurance practices and requirements have not been well defined. This paper points out the problems that require supplementary requirements. Briefly, these are: (1) the language barrier, that is, geologists and scientists are not familiar with quality assurance (QA) terminology; (2) the earth sciences deal with materials that cannot be characterized as easily as metals or other materials that are reasonably homogeneous; (3) the development and control of mathematical models and associated computer programs; and (4) research and development

  20. Laboratory testing in management of patients with suspected Ebolavirus disease: infection control and safety.

    Science.gov (United States)

    Gilbert, G L

    2015-08-01

    If routine laboratory safety precautions are followed, the risk of laboratory-acquired infection from handling specimens from patients with Ebolavirus disease (EVD) is very low, especially in the early 'dry' stage of disease. In Australia, border screening to identify travellers returning from EVD-affected west African countries during the 2014-2015 outbreak has made it unlikely that specimens from patients with unrecognised EVD would be sent to a routine diagnostic laboratory. Australian public health and diagnostic laboratories associated with hospitals designated for the care of patients with EVD have developed stringent safety precautions for EVD diagnostic and other tests likely to be required for supportive care of the sickest (and most infectious) patients with EVD, including as wide a range of point-of-care tests as possible. However, it is important that the stringent requirements for packaging, transport and testing of specimens that might contain Ebolavirus (a tier 1 security-sensitive biological agent) do not delay the diagnosis and appropriate management of other potentially serious but treatable infectious diseases, which are far more likely causes of a febrile illness in people returning from west Africa. If necessary, urgent haematology, biochemistry and microbiological tests can be performed safely, whilst awaiting the results of EVD tests, in a PC-2 laboratory with appropriate precautions including: use of recommended personal protective equipment (PPE) for laboratory staff; handling any unsealed specimens in a class I or II biosafety cabinet; using only centrifuges with sealed rotors; and safe disposal or decontamination of all used equipment and laboratory waste.

  1. Three-level grid-connected photovoltaic inverter with maximum power point tracking

    International Nuclear Information System (INIS)

    Tsang, K.M.; Chan, W.L.

    2013-01-01

    Highlight: ► This paper reports a novel 3-level grid connected photovoltaic inverter. ► The inverter features maximum power point tracking and grid current shaping. ► The inverter can act as an active filter and a renewable power source. - Abstract: This paper presents a systematic way of designing control scheme for a grid-connected photovoltaic (PV) inverter featuring maximum power point tracking (MPPT) and grid current shaping. Unlike conventional design, only four power switches are required to achieve three output levels and it is not necessary to use any phase-locked-loop circuitry. For the proposed scheme, a simple integral controller has been designed for the tracking of the maximum power point of a PV array based on an improved extremum seeking control method. For the grid-connected inverter, a current loop controller and a voltage loop controller have been designed. The current loop controller is designed to shape the inverter output current while the voltage loop controller can maintain the capacitor voltage at a certain level and provide a reference inverter output current for the PV inverter without affecting the maximum power point of the PV array. Experimental results are included to demonstrate the effectiveness of the tracking and control scheme.
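
    The record does not give the details of the "improved extremum seeking" method, so the sketch below only shows the standard perturb-and-observe logic on which such MPPT schemes build; measure_pv() and the toy PV curve are hypothetical stand-ins.

      # Hedged MPPT sketch: perturb the voltage reference and keep moving in the
      # direction that increases the measured PV power.
      def mppt_step(state, measure_pv, dv=0.5):
          v, i = measure_pv()
          p = v * i
          if p < state["last_power"]:
              state["direction"] *= -1          # power dropped: reverse the perturbation
          state["v_ref"] += state["direction"] * dv
          state["last_power"] = p
          return state["v_ref"]

      if __name__ == "__main__":
          v_op = 12.0                            # toy operating voltage (volts)
          def measure_pv():                      # toy PV curve whose power peaks near 20 V
              return v_op, max(0.0, 8.0 - 0.2 * v_op)
          state = {"v_ref": v_op, "last_power": 0.0, "direction": +1}
          for _ in range(60):
              v_op = mppt_step(state, measure_pv)
          print(f"operating voltage settles near {v_op:.1f} V")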

  2. Preferred design of transfer points

    Energy Technology Data Exchange (ETDEWEB)

    Firstbrook, J

    1983-06-01

    The paper reports the results of a study carried out at MRDE on the performance of transfer points. These sites still require a considerable deployment of manpower. The investigations were divided into 3 main sections: mineral handling; chute design; belt cleaning and spillage conveying. In order to achieve a sufficient level of reliability of a transfer system, it is considered that the coal size should not be greater than 250 mm. Different designs of gravity transfer chute were studied and 3 basic designs emerged as being effective: dump chute, curved site entry, and asymmetric side entry. Selection is dependent on conveyor sizes, speeds, and mineral throughput. Individual belt cleaners are not sufficient to clean belts carrying typical run-of-mine material: two cleaners followed by a set of squeeze rollers are recommended. Several designs of spillage conveyor were investigated; at the present time the electro/hydraulic type appears to be the most satisfactory.

  3. Tipping Points, Great and Small

    Science.gov (United States)

    Morrison, Foster

    2010-12-01

    The Forum by Jordan et al. [2010] addressed environmental problems of various scales in great detail, but getting the critical message through to the formulators of public policies requires going back to basics, namely, that exponential growth (of a population, an economy, or most anything else) is not sustainable. When have you heard any politician or economist from anywhere across the ideological spectrum say anything other than that more growth is essential? There is no need for computer models to demonstrate “limits to growth,” as was done in the 1960s. Of course, as one seeks more details, the complexity of modeling will rapidly outstrip the capabilities of both observation and computing. This is common with nonlinear systems, even simple ones. Thus, identifying all possible “tipping points,” as suggested by Jordan et al. [2010], and then stopping just short of them, is impractical if not impossible. The main thing needed to avoid environmental disasters is a bit of common sense.

  4. Beginning SharePoint Designer 2010

    CERN Document Server

    Windischman, Woodrow W; Rehmani, Asif

    2010-01-01

    Teaching Web designers, developers, and IT professionals how to use the new version of SharePoint Designer. Covering both the design and business applications of SharePoint Designer, this complete Wrox guide brings readers thoroughly up to speed on how to use SharePoint Designer in an enterprise. You'll learn to create and modify web pages, use CSS editing tools to modify themes, use Data View to create interactivity with SharePoint and other data, and much more. Coverage includes integration points with Visual Studio, Visio, and InfoPath. Shows web designers, developers, and IT professionals

  5. SharePoint 2013 for dummies

    CERN Document Server

    Withee, Ken

    2013-01-01

    The bestselling guide on running SharePoint, now updated to cover all the new features of SharePoint 2013 SharePoint Portal Server is an essential part of the enterprise infrastructure for many businesses. Building on the success of previous versions of SharePoint For Dummies, this new edition covers all the latest features of SharePoint 2013 and provides you with an easy-to-understand resource for making the most of all that this version has to offer. You'll learn how to get a site up and running, branded, and populated with content, workflow, and management. In addition, t

  6. Pro SharePoint 2010 Search

    CERN Document Server

    Noble, J; Bakman-Mikalski, Dan

    2011-01-01

    Pro SharePoint 2010 Search gives you expert advice on planning, deploying and customizing searches in SharePoint 2010. Drawing on the authors' extensive experience of working with real-world SharePoint deployments, this book teaches everything you'll need to know to create well-designed SharePoint solutions that always keep the end-user's experience in mind. Increase your search efficiency with SharePoint 2010's search functionality: extend the search user interface using third-party tools, and utilize analytics to improve relevancy. This practical hands-on book is a must-have resource for any

  7. 47 CFR 27.14 - Construction requirements; Criteria for renewal.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Construction requirements; Criteria for renewal. 27.14 Section 27.14 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER... Centroid Populations within the area, divided by the license area's total population. (2) For point-to...

  8. A mobile robot with parallel kinematics constructed under requirements for assembling and machining of the ITER vacuum vessel

    International Nuclear Information System (INIS)

    Pessi, P.; Huapeng Wu; Handroos, H.; Jones, L.

    2006-01-01

    ITER sectors require more stringent tolerances (± 5 mm) than normally expected for the size of structure involved. The walls of ITER sectors are made of 60 mm thick stainless steel and are joined together by high-efficiency structural and leak-tight welds. In addition to the initial vacuum vessel assembly, sectors may have to be replaced for repair. Since commercially available machines are too heavy for the required machining operations and for lifting a possible e-beam gun column system, and conventional robots lack the stiffness and accuracy needed under such machining conditions, a new flexible, lightweight and mobile robotic machine is being considered. For the assembly of the ITER vacuum vessel sector, precise positioning of welding end-effectors, at some distance in a confined space from the available supports, will be required, which is not possible using conventional machines or robots. This paper presents a special robot, able to carry out welding and machining processes from inside the ITER vacuum vessel, consisting of a ten-degree-of-freedom parallel robot mounted on a carriage driven by an electric motor/gearbox on a track. The robot is based on a Stewart-platform parallel mechanism. Water hydraulic cylinders are used as actuators to provide the six degrees of freedom of the parallel construction. Two linear and two rotational motions are used to enlarge the workspace of the manipulator. The robot carries a welding gun, such as a TIG, hybrid laser or e-beam welding gun, to weld the inner and outer walls of the ITER vacuum vessel sectors, and machining tools to cut and mill the walls with the necessary accuracy; it can also carry other tools and material to a required position inside the vacuum vessel. For assembly, an online six-degree-of-freedom seam-finding algorithm has been developed, which enables the robot to find the welding seam automatically in a very complex environment. In machining, multiple flexible machining processes are carried out automatically by
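
    For a Stewart-platform mechanism such as the one described, the inverse kinematics (the actuator length of each hydraulic cylinder for a commanded platform pose) has a closed form: each length is the distance between a base joint and the corresponding platform joint after the pose transformation. The sketch below uses placeholder joint coordinates, not the ITER robot's actual geometry.

      # Generic Stewart-platform inverse kinematics; joint layouts are placeholders.
      import math

      def rotation_zyx(roll, pitch, yaw):
          cr, sr = math.cos(roll), math.sin(roll)
          cp, sp = math.cos(pitch), math.sin(pitch)
          cy, sy = math.cos(yaw), math.sin(yaw)
          return [[cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
                  [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
                  [-sp,     cp * sr,                cp * cr]]

      def leg_lengths(base_pts, platform_pts, translation, rpy):
          """Return the six actuator lengths for a commanded platform pose."""
          R = rotation_zyx(*rpy)
          lengths = []
          for b, p in zip(base_pts, platform_pts):
              world = [sum(R[i][j] * p[j] for j in range(3)) + translation[i]
                       for i in range(3)]
              lengths.append(math.dist(world, b))
          return lengths

      if __name__ == "__main__":
          base = [(math.cos(a), math.sin(a), 0.0) for a in (0.1, 2.0, 2.2, 4.1, 4.3, 6.2)]
          plat = [(0.5 * math.cos(a), 0.5 * math.sin(a), 0.0)
                  for a in (1.0, 1.2, 3.1, 3.3, 5.2, 5.4)]
          print(leg_lengths(base, plat, translation=(0.0, 0.0, 1.2), rpy=(0.02, 0.0, 0.01)))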

  9. Design of a basic angle monitoring system in Silicon Carbide

    NARCIS (Netherlands)

    Veggel, van A.A.; Rosielle, P.C.J.N.; Nijmeijer, H.; Wielders, A.A.; Vink, H.J.P.

    2005-01-01

    Due to the 10 microarcsecond accuracy, with which GAIA will measure the positions of stars using 2 astrometric telescopes, stability requirements on the payload module are extremely stringent. In order to achieve the required 10 microarcsecond accuracy, a metrology system could be installed on the

  10. The Influence of the Tri-reference Points on Fairness and Satisfaction Perception

    Directory of Open Access Journals (Sweden)

    Lei Zhao

    2018-02-01

    We examined the influence of three reference points (minimum requirements [MR], the status quo [SQ], and goal [G]) proposed by the tri-reference point (TRP) theory on fairness and satisfaction perceptions of pay in three laboratory experiments. To test the effects, we manipulated these three reference points both implicitly (Experiment 1) and explicitly (Experiments 2 and 3). We also provided information about the salary offered to a peer, which was lower than, equal to, or higher than the salary offered to the participant. As hypothesized, the results demonstrated the important role of these reference points in judging the fairness of and satisfaction with pay when they were explicitly set (an interaction between reference points and social comparison in Experiments 2 and 3, but not in Experiment 1). Participants altered their judgments when the salary fell in different regions. When the salary was below MR, participants perceived very low fairness and satisfaction, even when the offer equalled or exceeded others'. When the salary was above G, participants perceived much higher fairness and satisfaction, even with disadvantageous inequality. Participants were more strongly affected when they were explicitly instructed about the reference points (Experiments 2 and 3) than when they were not (Experiment 1). Moreover, MR appeared to be the most important, followed by G. A salary below MR was judged as very unacceptable, with very low fairness and satisfaction ratings.

  11. On Motion Planning for Point-to-Point Maneuvers for a Class of Sailing Vehicles

    DEFF Research Database (Denmark)

    Xiao, Lin; Jouffroy, Jerome

    2011-01-01

    Despite their interesting dynamic and controllability properties, sailing vehicles have not been much studied in the control community. In this paper, we investigate motion planning of such vehicles. Starting from a simple dynamic model of sailing vessels in one dimension, this paper first...... considers their associated controllability issues, with the so-called no-sailing zone as a starting point, and it links them with a motion planning strategy using two-point boundary value problems as the main mathematical tool. This perspective is then expanded to do point-to-point maneuvers of sailing...
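
    The record's elided abstract names two-point boundary value problems as the main mathematical tool; the sketch below shows the flavour of such a problem for an assumed 1-D vessel model x' = v, m v' = u − d v (the paper's sailing dynamics and no-sailing-zone constraints are not reproduced), solving for the constant thrust u that moves the vessel from rest at x = 0 to x = L in time T.

      # Hedged two-point BVP sketch using scipy; model parameters are assumptions.
      import numpy as np
      from scipy.integrate import solve_bvp

      m, d, L, T = 100.0, 20.0, 50.0, 30.0     # assumed mass, drag, distance, duration

      def dynamics(t, y, p):
          u = p[0]
          x, v = y
          return np.vstack((v, (u - d * v) / m))

      def boundary(ya, yb, p):
          # x(0) = 0, v(0) = 0, x(T) = L: three conditions for two states plus one parameter.
          return np.array([ya[0], ya[1], yb[0] - L])

      t = np.linspace(0.0, T, 50)
      y_guess = np.vstack((L * t / T, np.full_like(t, L / T)))
      sol = solve_bvp(dynamics, boundary, t, y_guess, p=np.array([10.0]))
      print(f"required constant thrust ≈ {sol.p[0]:.1f} N, "
            f"final speed ≈ {sol.y[1, -1]:.2f} m/s")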

  12. Hinkley Point 'C' power station public inquiry: proof of evidence on agriculture

    International Nuclear Information System (INIS)

    Worthington, T.R.

    1988-09-01

    A public inquiry has been set up to examine the planning application made by the Central Electricity Generating Board (CEGB) for the construction of a 1200 MW Pressurized Water Reactor power station at Hinkley Point (Hinkley Point 'C') in the United Kingdom. Agricultural land will need to be acquired for the proposed construction both on a temporary and permanent basis. The CEGB evidence to the Inquiry identifies the land which will be permanently lost for agricultural purposes and that which could eventually be returned to agriculture. All the land required is on a single holding but should leave a viable area to be farmed. The farming business would be compensated for loss of profits. (UK)

  13. A Research on Fast Face Feature Points Detection on Smart Mobile Devices

    Directory of Open Access Journals (Sweden)

    Xiaohe Li

    2018-01-01

    We explore how to improve the performance of face feature point detection on mobile terminals from three aspects. First, we optimize the models used in SDM algorithms via PCA and spectral clustering. Second, we propose an evaluation criterion using Linear Discriminant Analysis to choose the best local feature descriptors, which play a critical role in feature point detection. Third, we take advantage of the multicore architecture of the mobile terminal and parallelize the optimized SDM algorithm to improve efficiency further. Experimental observations show that the resulting GPC-SDM (improved Supervised Descent Method using spectral clustering, PCA, and GPU acceleration) reduces memory usage and is efficient enough to meet real-time requirements.
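
    As an illustration of the PCA step (the paper's exact model compression is not detailed in this record), local feature descriptors can be projected onto a small number of principal components so that the SDM regression matrices become smaller and cheaper to apply on a mobile device.

      # Illustrative PCA compression of descriptors via SVD; shapes are assumptions.
      import numpy as np

      def fit_pca(descriptors, n_components):
          """descriptors: (n_samples, n_dims) array. Returns (mean, components)."""
          mean = descriptors.mean(axis=0)
          centered = descriptors - mean
          _, _, vt = np.linalg.svd(centered, full_matrices=False)
          return mean, vt[:n_components]           # rows ordered by explained variance

      def project(descriptors, mean, components):
          return (descriptors - mean) @ components.T

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          descs = rng.normal(size=(2000, 128))      # stand-in for 128-D local descriptors
          mean, comps = fit_pca(descs, n_components=32)
          print(descs.shape, "->", project(descs, mean, comps).shape)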

  14. Making Sense of Boiling Points and Melting Points

    Indian Academy of Sciences (India)

    The boiling and melting points of a pure substance are char- ... bonds, which involves high energy and hence high temperatures. Among the .... with zero intermolecular force at all temperatures and pressures, which ...

  15. Hierarchical model generation for architecture reconstruction using laser-scanned point clouds

    Science.gov (United States)

    Ning, Xiaojuan; Wang, Yinghui; Zhang, Xiaopeng

    2014-06-01

    Architecture reconstruction using a terrestrial laser scanner is a prevalent and challenging research topic. We introduce an automatic, hierarchical architecture generation framework to produce the full geometry of architecture based on a novel combination of facade structure detection, detailed window propagation, and hierarchical model consolidation. Our method highlights the generation of geometric models that automatically fit the design information of the architecture from sparse, incomplete, and noisy point clouds. First, the planar regions detected in the raw point clouds are interpreted as three-dimensional clusters. Then, the boundary of each region, extracted by projecting the points into its corresponding two-dimensional plane, is classified to obtain detailed shape structure elements (e.g., windows and doors). Finally, a polyhedron model is generated by calculating the proposed local structure model, consolidated structure model, and detailed window model. Experiments on modeling scanned real-life buildings demonstrate the advantages of our method, in which the reconstructed models not only correspond accurately to the architectural design information, but also satisfy the requirements for visualization and analysis.
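
    The record does not say how the planar regions are detected; one common choice for this step is a RANSAC plane fit, sketched below: repeatedly fit a plane to three random points and keep the plane supported by the largest number of inliers.

      # RANSAC plane detection sketch for a point cloud; thresholds are assumptions.
      import numpy as np

      def ransac_plane(points, n_iters=500, threshold=0.02, seed=0):
          """points: (N, 3) array. Returns ((unit normal, d) with n·x + d = 0, inlier mask)."""
          rng = np.random.default_rng(seed)
          best_inliers, best_model = None, None
          for _ in range(n_iters):
              p0, p1, p2 = points[rng.choice(len(points), size=3, replace=False)]
              normal = np.cross(p1 - p0, p2 - p0)
              norm = np.linalg.norm(normal)
              if norm < 1e-12:                     # degenerate (collinear) sample
                  continue
              normal /= norm
              d = -normal @ p0
              inliers = np.abs(points @ normal + d) < threshold
              if best_inliers is None or inliers.sum() > best_inliers.sum():
                  best_inliers, best_model = inliers, (normal, d)
          return best_model, best_inliers

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          facade = np.column_stack([rng.uniform(0, 10, 800), rng.uniform(0, 6, 800),
                                    rng.normal(0, 0.005, 800)])   # noisy z ≈ 0 wall
          clutter = rng.uniform(0, 10, size=(200, 3))
          (normal, d), mask = ransac_plane(np.vstack([facade, clutter]))
          print(normal.round(3), round(float(d), 3), int(mask.sum()), "inliers")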

  16. Octopuses use a human-like strategy to control precise point-to-point arm movements.

    Science.gov (United States)

    Sumbre, Germán; Fiorito, Graziano; Flash, Tamar; Hochner, Binyamin

    2006-04-18

    One of the key problems in motor control is mastering or reducing the number of degrees of freedom (DOFs) through coordination. This problem is especially prominent with hyper-redundant limbs such as the extremely flexible arm of the octopus. Several strategies for simplifying these control problems have been suggested for human point-to-point arm movements. Despite the evolutionary gap and morphological differences, humans and octopuses evolved similar strategies when fetching food to the mouth. To achieve this precise point-to-point task, octopus arms generate a quasi-articulated structure based on three dynamic joints. A rotational movement around these joints brings the object to the mouth. Here, we describe a peripheral neural mechanism: two waves of muscle activation propagate toward each other, and their collision point sets the medial-joint location. This is a remarkably simple mechanism for adjusting the length of the segments according to where the object is grasped. Furthermore, similar to certain human arm movements, kinematic invariants were observed at the joint level rather than at the end-effector level, suggesting intrinsic control coordination. The evolutionary convergence to similar geometrical and kinematic features suggests that a kinematically constrained articulated limb controlled at the level of joint space is the optimal solution for precise point-to-point movements.
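
    The peripheral mechanism described above lends itself to a toy calculation: if one activation wave starts at the arm base and another at the grasp point and they travel toward each other, the medial joint forms where they meet, so the segment lengths scale with where the object is grasped. The wave speeds and distances below are assumptions, not values from the paper.

      # Toy collision-point calculation for two counter-propagating activation waves.
      def collision_point(grasp_distance_cm, v_base=1.0, v_tip=1.0):
          """Distance from the base at which the two waves meet."""
          t_meet = grasp_distance_cm / (v_base + v_tip)
          return v_base * t_meet

      if __name__ == "__main__":
          for grasp in (10.0, 20.0, 30.0):
              print(f"object grasped {grasp:.0f} cm out -> "
                    f"medial joint forms at {collision_point(grasp):.0f} cm")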

  17. Experiences in certification of packages for transportation of fresh nuclear fuel in the context of new safety requirements established by IAEA regulations (IAEA-96 regulations, ST-1) for air transportation of nuclear materials (requirements to C-type packages)

    Energy Technology Data Exchange (ETDEWEB)

    Dudai, V.I.; Kovtun, A.D.; Matveev, V.Z.; Morenko, A.I.; Nilulin, V.M.; Shapovalov, V.I.; Yakushev, V.A.; Bobrovsky, V.S.; Rozhkov, V.V.; Agapov, A.M.; Kolesnikov, A.S. [Russian Federal Nuclear Centre - All-Russian Research Inst. of Experimental Physics, Sarov (Russian Federation)]|[JSC 'MSZ', Electrostal (Russian Federation)]|[JSC 'NPCC', Novosibirsk (Russian Federation)]|[Minatom of Russia, Moscow (Russian Federation)]|[Gosatomnadzor of Russia, Moscow (Russian Federation)]

    2004-07-01

    Every year a large amount of domestic and international transportation of fresh nuclear fuel (FNF), used in Russian and foreign power and research reactors and classified as fissile material under the IAEA Regulations, is performed in Russia. Much of this transportation, international shipments in particular, is carried out by air. According to the national 'Main Regulations for Safe Transport and Physical Protection of Nuclear Materials (OPBZ-83)' and the 'Regulations for the Safe Transport of Radioactive Materials' of the International Atomic Energy Agency (IAEA Regulations), nuclear and radiation safety under normal (accident-free) and accident conditions of transport must be fully provided by the package design. In this context, stringent requirements are imposed on fissile-material packages exposed to heat and mechanical loads in transport accidents. Long-standing experience of accident-free transportation of fissile materials (FM) has shown that this approach to providing nuclear and radiation safety pays for itself completely. Nevertheless, roughly every 10 years, with each revision of the 'Regulations for the Safe Transport of Radioactive Materials', the International Atomic Energy Agency places more stringent requirements on FM and its transportation, reflecting the objectively increasing risk associated with the constant rise in the volume and density of transportation and with the strained social and economic situation in a number of regions of the world. In the new edition of the IAEA Regulations (ST-1), published in 1996 and brought into force in 2001 (IAEA-96 Regulations), the requirements for FM packages conveyed by aircraft were changed radically. These requirements are fully reflected in the new Russian 'Regulations for the Safe Transport of Radioactive Materials' (PBTRM-2004), which will be brought into force in due course.

  18. PROVIDING STRINGENT STAR FORMATION RATE LIMITS OF z ∼ 2 QSO HOST GALAXIES AT HIGH ANGULAR RESOLUTION

    Energy Technology Data Exchange (ETDEWEB)

    Vayner, Andrey; Wright, Shelley A. [Department of Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, ON, M5S 3H4 (Canada); Do, Tuan [Dunlap Institute for Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, ON, M5S 3H4 (Canada); Larkin, James E. [Department of Physics and Astronomy, University of California, Los Angeles, CA 90095 (United States); Armus, Lee [Spitzer Science Center, California Institute of Technology, 1200 E. California Boulevard, Pasadena, CA 91125 (United States); Gallagher, S. C. [Department of Physics and Astronomy, The University of Western Ontario, London, ON N6A 3K7 (Canada)

    2016-04-10

    We present integral field spectrograph (IFS) observations with laser guide star adaptive optics (LGS-AO) of z ∼ 2 quasi-stellar objects (QSOs), designed to resolve extended nebular line emission from the host galaxy. Our data were obtained at the W. M. Keck and Gemini North Observatories, using OSIRIS and NIFS coupled with their LGS-AO systems, respectively. We have conducted a pilot survey of five QSOs, three observed with NIFS+AO and two observed with OSIRIS+AO, at an average redshift of z = 2.2. We demonstrate that the combination of AO and IFSs provides the spatial and spectral resolution required to separate QSO emission from its host. We present our technique for generating a point-spread function (PSF) from the broad-line region of the QSO and performing PSF subtraction of the QSO emission to detect the host galaxy emission at a separation of ∼0.″2 (∼1.4 kpc). We detect Hα narrow-line emission for two sources, SDSS J1029+6510 (z_Hα = 2.182) and SDSS J0925+0655 (z_Hα = 2.197), which show evidence for both star formation and extended narrow-line emission. Assuming that the majority of the narrow-line Hα emission is from star formation, we infer a star formation rate (SFR) for SDSS J1029+6510 of 78.4 M_⊙ yr^−1, originating from a compact region that is kinematically offset by 290–350 km s^−1. For SDSS J0925+0655 we infer an SFR of 29 M_⊙ yr^−1 distributed over three clumps that are spatially offset by ∼7 kpc. The null detections for the other three QSOs are used to infer surface brightness limits, and we find that at 1.4 kpc from the QSO the un-reddened star formation limit is ≲0.3 M_⊙ yr^−1 kpc^−2. If we assume typical extinction values for z = 2 type-1 QSOs, the dereddened SFR limit for our null detections would be ≲0.6 M_⊙ yr^−1 kpc^−2. These IFS observations indicate that while the central black hole is accreting mass at 10%–40% of the Eddington rate, if
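
    The excerpt does not state which calibration links the narrow-line Hα luminosity to a star formation rate; as an illustration, the widely used Kennicutt (1998) relation SFR [M_⊙ yr^−1] = 7.9 × 10^−42 L(Hα) [erg s^−1] is sketched below.

      # Hα-to-SFR conversion using the Kennicutt (1998) calibration (illustrative only).
      def sfr_from_halpha(l_halpha_erg_s):
          return 7.9e-42 * l_halpha_erg_s

      if __name__ == "__main__":
          l_example = 78.4 / 7.9e-42     # luminosity implying roughly the 78 M_sun/yr quoted above
          print(f"L(Halpha) ≈ {l_example:.2e} erg/s -> "
                f"SFR ≈ {sfr_from_halpha(l_example):.1f} M_sun/yr")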

  19. Acquisition, tracking, and pointing IV; Proceedings of the Meeting, Orlando, FL, Apr. 19, 20, 1990

    Science.gov (United States)

    Gowrinathan, Sankaran

    1990-09-01

    Various papers on acquisition, tracking, and pointing are presented. Individual topics addressed include: backlash control techniques in geared servo mechanics; optical fiber and photodetector array for robotic seam tracking; star trackers for spacecraft applications; Starfire optical range tracking system for the 1.5 m telescope; real-time video image centroid tracker; optical alignment with a beamwalk system; line-of-sight stabilization requirements for target tracking system; image quality with narrow beam illumination in an active tracking system; IR sensor data fusion for target detection, identification, and tracking; target location and pointing algorithm for a three-axis stabilized line scanner. Also discussed are: adaptive control system techniques applied to inertial stabilization systems; supervisory control of electrooptic tracking and pointing; position loop compensation for flex-pivot-mounted gimbal stabilization systems; advanced testing methods for acquisition, tracking, and pointing; development of kinematics for gimballed mirror systems.

  20. FBX dosimetry for point dose measurements in head and neck cancer patients

    International Nuclear Information System (INIS)

    Balraj, A.; Thakur, P.K.; Bhatnagar, S.; Vidyasagar, P.B.; Nirhali, Amit; Semwal, M.K.

    2007-01-01

    The FBX dosimeter determines the radiation dose from the chemical changes produced in an irradiated medium, which can be measured by spectrophotometry or colorimetry; an FBX solution volume of about 2 ml is required for measuring the optical density (OD). Measuring a point dose with 2 ml of solution may introduce error, since the solution occupies about 2 cc of volume around the point being measured. In head and neck carcinoma patients the treatment area involves curved surfaces, so fixing a 2 ml vial to the body surface is difficult and can give wrong readings. In this study we measured the entrance and exit doses by filling 0.5 ml of solution into a flexible catheter placed at a point on the patient's body surface during radiation treatment. The solution was then diluted with 1.5 ml of distilled water to measure the OD in the colorimeter.