WorldWideScience

Sample records for base extension technique

  1. High-extensible scene graph framework based on component techniques

    Institute of Scientific and Technical Information of China (English)

    LI Qi-cheng; WANG Guo-ping; ZHOU Feng

    2006-01-01

    In this paper, a novel component-based scene graph is proposed, in which all objects in the scene are classified into different entities, and a scene can be represented as a hierarchical graph composed of instances of entities. Each entity contains basic data and the operations on it, which are encapsulated into entity components. An entity possesses behaviours, which are responses to rules and interaction defined by the high-level application; such behaviours can be described by scripts or behaviour models. The component-based scene graph presented here is more abstract and higher-level than traditional scene graphs. The contents of a scene can be extended flexibly by adding new entities and new entity components, and behaviour can be modified by changing the model components or behaviour scripts. Its robustness and efficiency are verified by many examples implemented in the Virtual Scenario developed at Peking University.
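The entity/component structure described in the abstract can be sketched as follows. This is an illustrative toy model only; the class names and the event mechanism are assumptions, not the Virtual Scenario API.

```python
# Minimal sketch of a component-based scene graph: entities bundle
# components (data + operations), form a hierarchy, and carry behaviours
# (scripts) that respond to application-defined events.

class Component:
    """Encapsulates an entity's data together with operations on it."""
    def __init__(self, name, data):
        self.name = name
        self.data = data

class Entity:
    """A scene node: components, child entities, and behaviour scripts."""
    def __init__(self, name):
        self.name = name
        self.components = {}
        self.children = []
        self.behaviours = []   # callables invoked on events

    def add_component(self, component):
        self.components[component.name] = component

    def add_child(self, entity):
        self.children.append(entity)

    def on_event(self, event):
        # Behaviours are responses to rules/interaction from the application;
        # events propagate down the hierarchical graph.
        for behaviour in self.behaviours:
            behaviour(self, event)
        for child in self.children:
            child.on_event(event)

# Extending the scene = adding new entities/components, no core changes.
scene = Entity("scene")
car = Entity("car")
car.add_component(Component("transform", {"x": 0.0, "y": 0.0}))
car.behaviours.append(
    lambda e, ev: e.components["transform"].data.update(x=ev["dx"]))
scene.add_child(car)
scene.on_event({"dx": 5.0})
print(car.components["transform"].data["x"])  # 5.0
```

Behaviour modification then amounts to swapping the script attached to an entity, without touching the graph structure.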

  2. An enhanced single base extension technique for the analysis of complex viral populations.

    Directory of Open Access Journals (Sweden)

    Dale R Webster

    Full Text Available Many techniques for the study of complex populations provide either specific information on a small number of variants or general information on the entire population. Here we describe a powerful new technique for elucidating mutation frequencies at each genomic position in a complex population. This single base extension (SBE based microarray platform was designed and optimized using poliovirus as the target genotype, but can be easily adapted to assay populations derived from any organism. The sensitivity of the method was demonstrated by accurate and consistent readouts from a controlled population of mutant genotypes. We subsequently deployed the technique to investigate the effects of the nucleotide analog ribavirin on a typical poliovirus population through two rounds of passage. Our results show that this economical platform can be used to investigate dynamic changes occurring at frequencies below 1% within a complex nucleic acid population. Given that many key aspects of the study and treatment of disease are intimately linked to population-level genomic diversity, our SBE-based technique provides a scalable and cost-effective complement to both traditional and next generation sequencing methodologies.
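The core readout of an SBE array, estimating the mutant fraction at each genomic position, can be illustrated with a small calculation. The four-channel intensity values below are invented for the example and are not from the paper.

```python
# Illustrative sketch (not the authors' pipeline): estimate the non-reference
# (mutant) fraction at each genomic position from four-channel single base
# extension intensities.

reference = "ACGT"  # reference base at each position (toy example)
# intensities[i][base] = extension signal for that base at position i
intensities = [
    {"A": 980, "C": 5, "G": 10, "T": 5},    # essentially wild type
    {"A": 8, "C": 920, "G": 70, "T": 2},    # ~8% variant signal
    {"A": 3, "C": 4, "G": 990, "T": 3},
    {"A": 6, "C": 4, "G": 5, "T": 985},
]

def mutant_frequencies(reference, intensities):
    freqs = []
    for ref_base, channels in zip(reference, intensities):
        total = sum(channels.values())
        non_ref = total - channels[ref_base]   # signal not matching reference
        freqs.append(non_ref / total)
    return freqs

freqs = mutant_frequencies(reference, intensities)
print([round(f, 3) for f in freqs])  # [0.02, 0.08, 0.01, 0.015]
```

In practice, detecting sub-1% frequencies as reported in the abstract additionally requires careful normalization and background subtraction per channel.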

  3. Method Chunks Selection by Multicriteria Techniques: an Extension of the Assembly-based Approach

    CERN Document Server

    Kornyshova, Elena; Salinesi, Camille

    2009-01-01

    The work presented in this paper relates to the area of situational method engineering (SME). In this domain, approaches are developed according to specific project specifications. We propose to adapt an existing method construction process, namely the assembly-based one. One of the particular features of the assembly-based SME approach is the selection of method chunks. Our proposal is to offer better guidance in the retrieval of chunks by introducing multicriteria techniques. To use them efficiently, we defined a typology of project characteristics in order to identify all their critical aspects, which offers a prioritization to help the method engineer choose between similar chunks.
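A minimal multicriteria prioritization of method chunks could look like the weighted-sum ranking below. The criteria names, weights, and scores are invented for illustration; the paper's own typology of project characteristics is richer than this.

```python
# Hedged sketch: rank candidate method chunks by a weighted sum over
# project-characteristic criteria (one of the simplest multicriteria
# techniques). All names and numbers are assumptions for the example.

criteria_weights = {"cost": 0.3, "fit_to_project": 0.5, "maturity": 0.2}

chunks = {
    "chunk_A": {"cost": 0.6, "fit_to_project": 0.9, "maturity": 0.8},
    "chunk_B": {"cost": 0.9, "fit_to_project": 0.5, "maturity": 0.9},
}

def score(chunk_scores):
    return sum(criteria_weights[c] * v for c, v in chunk_scores.items())

ranking = sorted(chunks, key=lambda name: score(chunks[name]), reverse=True)
print(ranking)  # best-matching chunk first: ['chunk_A', 'chunk_B']
```

More elaborate multicriteria methods (outranking, AHP) differ in how the per-criterion scores are aggregated, but the idea of guiding chunk choice by scored project characteristics is the same.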

  4. Network Lifetime Extension Based On Network Coding Technique In Underwater Acoustic Sensor Networks

    Directory of Open Access Journals (Sweden)

    Padmavathy.T.V

    2012-06-01

    Full Text Available Underwater acoustic sensor networks (UWASNs) are attracting a lot of interest for ocean applications such as ocean pollution monitoring, ocean animal surveillance, oceanographic data collection, assisted navigation, and offshore exploration. A UWASN is composed of underwater sensors that use sound to transmit information collected in the ocean. The reason to use sound is that the radio frequency (RF) signals used by terrestrial sensor networks (TWSNs) can travel only a few meters in water. Unfortunately, the efficiency of UWASNs is inferior to that of terrestrial sensor networks. Some of the challenges in underwater communication are propagation delay, high bit error rate, and limited bandwidth. Our aim is to minimize power consumption and improve the reliability of data transmission by finding the optimum number of clusters based on energy consumption.
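The idea of finding an optimum cluster count from an energy model can be sketched by brute-force minimization of a simple per-round energy expression. The radio-energy constants and geometry below are generic textbook-style assumptions (free-space intra-cluster, multipath head-to-sink), not the paper's underwater model.

```python
# Toy sketch: choose the number of clusters k that minimises total energy
# per round in a clustered sensor network. Constants are illustrative
# assumptions, not values from the paper.

import math

N = 100           # sensor nodes
AREA = 100.0      # side of deployment square, m
D_SINK = 80.0     # mean cluster-head -> sink distance, m
E_ELEC = 50e-9    # electronics energy, J/bit
EPS_FS = 10e-12   # free-space amplifier, J/bit/m^2
EPS_MP = 1.3e-15  # multipath amplifier, J/bit/m^4

def round_energy(k, bits=4000):
    # Expected squared member-to-head distance for k clusters in the area.
    d_intra2 = AREA**2 / (2 * math.pi * k)
    # Members transmit to their cluster head (free-space model).
    member = (N - k) * bits * (E_ELEC + EPS_FS * d_intra2)
    # Heads receive from members and forward to the sink (multipath model).
    head = k * bits * (E_ELEC * (N / k) + E_ELEC + EPS_MP * D_SINK**4)
    return member + head

best_k = min(range(1, N), key=round_energy)
print(best_k, round_energy(best_k))
```

Too few clusters wastes energy on long member-to-head hops; too many wastes it on head-to-sink transmissions, so the total has an interior minimum.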

  5. Investigating Upper Bounds on Network Lifetime Extension for Cell-Based Energy Conservation Techniques in Stationary Ad Hoc Networks

    OpenAIRE

    Santi, Paolo

    2002-01-01

    Cooperative cell-based strategies have been recently proposed as a technique for extending the lifetime of wireless ad hoc networks, while only slightly impacting network performance. The effectiveness of this approach depends heavily on the node density: the higher it is, the more consistent energy savings can potentially be achieved. However, no general analyses of network lifetime have been done either for a base network (one without any energy conservation technique) or for one using cooperat...

  6. Synchrotron radiation techniques. Extension to magnetism research

    International Nuclear Information System (INIS)

    Recently developed techniques using synchrotron radiation for the study of magnetism are reviewed. These techniques are based on X-ray absorption spectroscopy (XAS), and they exhibit significant advantages in element specificity. This is very important since the most attractive magnetic materials contain many magnetic elements, and those with small magnetic moments often play an essential role in the magnetic properties. Circularly polarized X-rays emitted from bending magnets or helical undulators allow us to perform magnetic circular dichroism measurements to reveal microscopic magnetic properties of various kinds of magnetic materials. X-ray absorption magnetic circular dichroism (XMCD) is discussed in detail. This technique provides unique information on orbital magnetic moments as well as spin magnetic moments, which are useful for the study of magnetic anisotropy. X-ray magnetic linear dichroism (XMLD) and X-ray resonant magnetic reflectometry (XRMR) techniques are also described. (author)

  7. Two Extension Block Kirschner Wires' Technique for Bony Mallet Thumb

    Science.gov (United States)

    Takase, Fumiaki; Ueda, Yasuhiro; Shinohara, Issei; Kuroda, Ryosuke; Kokubu, Takeshi

    2016-01-01

    Mallet finger with an avulsion fracture of the distal phalanx or rupture of the terminal tendon of the extensor mechanism is known as a common injury, while mallet thumb is very rare. In this paper, the case of a 19-year-old woman with a sprained left thumb sustained while playing basketball is presented. Plain radiographs and computed tomography revealed an avulsion fracture involving more than half of the articular surface at the base of the distal phalanx. Closed reduction and percutaneous fixation were performed using the two extension block Kirschner wires' technique under digital block anesthesia. At 4 months postoperatively, the patient had achieved excellent results according to Crawford's evaluation criteria and had no difficulty working or playing basketball. Various conservative and operative treatment strategies have been reported for the management of mallet thumb. We chose the two extension block Kirschner wires' technique to minimize invasion of the extensor mechanism and nail bed and to stabilize the large fracture fragment.

  8. Extension of the preceding birth technique.

    Science.gov (United States)

    Aguirre, A

    1994-01-01

    The Brass-inspired preceding birth technique (PBT) is an indirect estimation technique with low administration costs. PBT involves asking women at a time close to delivery about the survival of their preceding births. The proportion dead is close to the probability of dying between birth and the second birthday, an index of early childhood mortality (II or Q). Brass and Macrae determined that II is an estimate of mortality between birth and an age lower than the birth interval, around 4/5 of the birth interval. Hospital and clinic data are likely to include a concentration of women with lower risks of disease because of higher educational levels and socioeconomic status. A simulation of PBT data from the World Fertility Survey for Mexico and Peru found that the proportions of preceding children who had died were 0.156 in Peru and 0.092 in Mexico for home deliveries; maternity clinic proportions were 0.088 in Peru and 0.066 in Mexico. Use of clinic and hospital data collection underestimated mortality by 32% in Peru and 15% in Mexico. Another alternative was proposed: interviewing women at some time other than delivery. If the interview took place during a child or infant health intervention after delivery, the subsample would still be subject to bias, but this problem could be overcome by computing the weighted average of the actual probability of the older child being dead and the conditional probability of the younger child being dead, or of both younger and older children being dead. Correction factors could be applied using the general standard of the Brass logit life table system. Calculation of a simple average of the ages of the younger children could provide enough information to help decide which tables to use. Five surveys were selected for testing the factors of dependence between the probabilities of death of successive siblings: Bangladesh, Lesotho, Kenya, Ghana, and Guyana.
Higher mortality was related to lower dependency factors between the probabilities of death
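The basic PBT computation, and the weighted-average correction for the clinic-selection bias discussed above, amount to simple arithmetic. The counts and the clinic/home shares below are invented for the example.

```python
# Sketch of the preceding birth technique (PBT) index: the proportion of
# preceding siblings reported dead at the time of the next delivery
# approximates early childhood mortality. Numbers are illustrative only.

def pbt_index(preceding_dead, mothers_interviewed):
    """Proportion of preceding births reported dead."""
    return preceding_dead / mothers_interviewed

# Clinic-based samples understate mortality (mothers there are lower-risk);
# a weighted average over delivery settings reduces that selection bias.
clinic = pbt_index(66, 1000)   # 0.066
home = pbt_index(92, 1000)     # 0.092
share_clinic = 0.6             # assumed share of clinic deliveries
combined = share_clinic * clinic + (1 - share_clinic) * home
print(round(combined, 4))  # 0.0764
```

The combined estimate lies between the clinic-only and home-only figures, illustrating why clinic-only data understated mortality in the Peru and Mexico simulations.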

  9. Two Extension Block Kirschner Wires’ Technique for Bony Mallet Thumb

    Directory of Open Access Journals (Sweden)

    Yutaka Mifune

    2016-01-01

    Full Text Available Mallet finger with an avulsion fracture of the distal phalanx or rupture of the terminal tendon of the extensor mechanism is known as a common injury, while mallet thumb is very rare. In this paper, the case of a 19-year-old woman with a sprained left thumb sustained while playing basketball is presented. Plain radiographs and computed tomography revealed an avulsion fracture involving more than half of the articular surface at the base of the distal phalanx. Closed reduction and percutaneous fixation were performed using the two extension block Kirschner wires' technique under digital block anesthesia. At 4 months postoperatively, the patient had achieved excellent results according to Crawford's evaluation criteria and had no difficulty working or playing basketball. Various conservative and operative treatment strategies have been reported for the management of mallet thumb. We chose the two extension block Kirschner wires' technique to minimize invasion of the extensor mechanism and nail bed and to stabilize the large fracture fragment.

  10. Wavelet Based Image Denoising Technique

    Directory of Open Access Journals (Sweden)

    Sachin D Ruikar

    2011-03-01

    Full Text Available This paper proposes different approaches to wavelet-based image denoising. The search for efficient image denoising methods is still a valid challenge at the crossing of functional analysis and statistics. In spite of the sophistication of recently proposed methods, most algorithms have not yet attained a desirable level of applicability. Wavelet algorithms are a useful tool for signal processing tasks such as image compression and denoising. Multiwavelets can be considered an extension of scalar wavelets. The main aim is to modify the wavelet coefficients in the new basis so that the noise can be removed from the data. In this paper, we extend the existing technique and provide a comprehensive evaluation of the proposed method. Results are based on different noise types, such as Gaussian, Poisson, salt-and-pepper, and speckle noise. The signal-to-noise ratio was preferred as a measure of denoising quality.
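The principle of modifying wavelet coefficients to remove noise can be shown on a 1-D signal with a single-level Haar transform and soft thresholding. Real image pipelines (and the paper) use 2-D multilevel transforms; this is only the core idea, with an invented toy signal.

```python
# Minimal sketch of wavelet-domain denoising: transform, shrink the detail
# coefficients (where noise concentrates), transform back.

def haar_forward(x):
    """Single-level Haar transform of an even-length sequence."""
    approx = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

def soft_threshold(coeffs, t):
    # Shrink detail coefficients toward zero: small (noisy) ones vanish,
    # large (edge-carrying) ones survive, so edges are preserved.
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

signal = [4.0, 4.2, 4.1, 3.9, 8.0, 8.1, 7.9, 8.2]  # step edge + small noise
approx, detail = haar_forward(signal)
denoised = haar_inverse(approx, soft_threshold(detail, t=0.15))
print(denoised)
```

The small within-pair fluctuations are suppressed while the large step from ~4 to ~8 survives, which is exactly the edge-preserving behavior that makes wavelet shrinkage attractive for images.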

  11. Improvements and extensions of the item count technique

    OpenAIRE

    Groenitz, Heiko

    2014-01-01

    The item count technique (ICT) is a helpful tool for conducting surveys on sensitive characteristics such as tax evasion, corruption, insurance fraud, social fraud, or drug consumption. The ICT secures the cooperation of respondents by protecting their privacy. There have been several interesting developments of the ICT in recent years. However, some approaches are incomplete, while some research questions cannot yet be tackled by the ICT. For these reasons, we broaden the existing literature in...
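The basic ICT estimator behind these developments is simple: respondents report only how many items on a list apply to them, and the difference in mean counts between a list containing the sensitive item and a control list without it estimates the prevalence of the sensitive characteristic. The counts below are invented.

```python
# Sketch of the basic item count technique (list experiment) estimator.
# Each respondent reports a count, never which items apply, so privacy
# is protected. Data are illustrative only.

def ict_prevalence(treatment_counts, control_counts):
    """Difference in means estimates prevalence of the sensitive item."""
    mean_t = sum(treatment_counts) / len(treatment_counts)
    mean_c = sum(control_counts) / len(control_counts)
    return mean_t - mean_c

treatment = [2, 3, 1, 4, 2, 3, 2, 3]  # list includes the sensitive item
control = [2, 3, 1, 3, 2, 2, 2, 3]    # innocuous items only
print(ict_prevalence(treatment, control))  # 0.25
```

With random assignment of respondents to the two lists, the innocuous items cancel in expectation, leaving an estimated 25% prevalence in this toy sample.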

  12. Flexible use and technique extension of logistics management

    Science.gov (United States)

    Xiong, Furong

    2011-10-01

    As we all know, modern logistics originated in the United States, developed in Japan, matured in Europe, and is expanding in China; this is the recognized track of modern logistics' historical development. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port, China's modern logistics industry will develop at a strong, leap-forward pace and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of China's modern logistics management techniques, which has certain practical and guiding significance.

  13. Should structure-based virtual screening techniques be used more extensively in modern drug discovery?%基于结构的虚拟筛选技术能更广泛应用于现代药物发现?

    Institute of Scientific and Technical Information of China (English)

    V. Leroux; B. Maigret

    2007-01-01

    The drug discovery processes used by academic and industrial scientists are nowadays being questioned. The approaches of the pharmaceutical industry that were successful 20 years ago are simply not suitable anymore for the increasing complexity of available biological targets and the rising standards for medical safety. While the current scientific context resulting from significant developments in genomics, proteomics, organic synthesis and biochemistry seems particularly favorable, the efficiency of drug research does not appear to be following the trend. In particular, the in silico approaches, often considered as potential enhancements for classic drug discovery, are an interesting case. Techniques such as virtual screening did undergo many significant advances in the past 5-10 years and have proven their usefulness in hit discovery for those who want to avoid carrying out too many expensive experimental tests while exploring an important molecular diversity. However, reliability is still disappointing despite constant enhancements, and results are unpredictable. What are the origins of such issues? In this short review, we will first summarize the current status of computer-aided drug design, then we will focus on the structure-based class of virtual screening approaches, for which docking programs constitute the main part. Can such methods give something more than cost savings in the early banks-to-hit phases of the drug discovery process? We will try to answer this question by exploring the highlights and pitfalls of the great variety of docking approaches. It will appear that while the structure-based drug design field is not yet ready to fulfill all of its early promises, it should still be investigated extensively and used with caution. Most interestingly, structure-based methods are best used when combined with other complementary drug design approaches such as the ligand-based ones. 
In this regard, they will have an increasing role to play in modern drug

  14. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint.

    Science.gov (United States)

    Du, Hong-Gen; Liao, Sheng-Hui; Jiang, Zhong; Huang, Huan-Ming; Ning, Xi-Tao; Jiang, Neng-Yi; Pei, Jian-Wei; Huang, Qin; Wei, Hui

    2016-05-01

    This study investigates the effect of a new Chinese massage technique named "press-extension" on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of this massage technique. Self-developed biomechanical software was used to establish a normal L1-S1 lumbar 3D finite element (FE) model, which integrated the anatomical structure based on spine CT and MRI data. Graphic techniques were then used to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. According to the actual press-extension experiments, mechanical parameters were collected to set the boundary conditions for FE analysis. The results demonstrated that the press-extension technique produces a marked induction effect on the annulus fibrosus, pressing the central nucleus pulposus forward and increasing the pressure in the anterior part. The study concludes that finite element modelling of the lumbar spine is suitable for analysing the impact of the press-extension technique on lumbar intervertebral disc biomechanics, providing a basis for understanding the disease mechanism of intervertebral disc herniation treated with the press-extension technique. PMID:27275119

  15. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint

    Directory of Open Access Journals (Sweden)

    Hong-gen Du

    2016-05-01

    Full Text Available This study investigates the effect of a new Chinese massage technique named "press-extension" on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of this massage technique. Self-developed biomechanical software was used to establish a normal L1-S1 lumbar 3D finite element (FE) model, which integrated the anatomical structure based on spine CT and MRI data. Graphic techniques were then used to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. According to the actual press-extension experiments, mechanical parameters were collected to set the boundary conditions for FE analysis. The results demonstrated that the press-extension technique produces a marked induction effect on the annulus fibrosus, pressing the central nucleus pulposus forward and increasing the pressure in the anterior part. The study concludes that finite element modelling of the lumbar spine is suitable for analysing the impact of the press-extension technique on lumbar intervertebral disc biomechanics, providing a basis for understanding the disease mechanism of intervertebral disc herniation treated with the press-extension technique.

  16. A Novel Active Network Architecture Based on Extensible Services Router

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Active networks are a new kind of packet-switched network in which packets carry code fragments that are executed on intermediary nodes (routers). The code can extend or modify the foundation architecture of a network. In this paper, the authors present a novel active network architecture based on an extensible services router, combining the advantages of the two major active network technologies. The architecture consists of an extensible services router, an active extensible components server, and a key distribution center (KDC). Users can write extensible service components with a programming interface. At present, we have finished the extensible services router prototype system based on the Highly Efficient Router Operating System (HEROS), and the active extensible components server and KDC prototype systems based on Linux.

  17. Space-based observation of extensive air showers

    Directory of Open Access Journals (Sweden)

    Ebisuzaki T.

    2013-06-01

    Full Text Available Space-based observations of extensive air showers constitute the next experimental challenge for the study of the universe at extreme energies. Space observation will allow a "quantum jump" in the observational area available to detect the UV light tracks produced by particles with energies higher than 10^20 eV. These are thought to reach the Earth almost undeflected by cosmic magnetic fields. This new technique will contribute to establishing the new field of astronomy and astrophysics performed with charged particles and neutrinos at the highest energies. This idea was created by the incredible efforts of three outstanding cosmic ray physicists: John Linsley, Livio Scarsi, and Yoshiyuki Takahashi. This challenging technique has four significant merits in comparison with ground-based observations: (1) a very large observational area; (2) well-constrained distances of the showers; (3) clear and stable atmospheric transmission in the upper troposphere; (4) uniform exposure across both the northern and southern skies. Four proposed and planned missions constitute the roadmap of the community: TUS, JEM-EUSO, KLPVE, and Super-EUSO will contribute step by step to establishing this challenging field of research.

  18. Source extension based on ε-entropy

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian; YU Sheng-sheng; ZHOU Jing-li; ZHENG Xin-wei

    2005-01-01

    It is known from entropy theory that an image is a source correlated with a certain probability characteristic. The entropy rate of the source and the ε-entropy (rate-distortion function) are the information content that identifies the characteristics of video images, and hence are essentially related to video image compression. They are fundamental theories of great significance to image compression, though impossible to turn directly into a compression method. Based on entropy theory and image compression theory, by applying the rate-distortion mathematical model and Lagrange multipliers to some theoretical problems in the H.264 standard, this paper presents a new rate-distortion coding algorithm model. This model was put through a complete capability test on the JM61e test model (JVT test model). The results show that the coding speed increases without significant reduction of the rate-distortion performance of the coder.
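The Lagrange-multiplier machinery mentioned above is used in H.264-style encoders for rate-distortion optimized mode decision: each candidate mode is charged a cost J = D + λR, and the cheapest mode wins. The candidate modes and their (distortion, rate) numbers below are invented for illustration, not taken from JM61e.

```python
# Sketch of Lagrangian rate-distortion mode selection: minimise
# J = D + lambda * R over candidate coding modes for a macroblock.
# Mode names and numbers are illustrative assumptions.

def best_mode(candidates, lam):
    """candidates: {mode_name: (distortion, rate_bits)}."""
    return min(candidates,
               key=lambda m: candidates[m][0] + lam * candidates[m][1])

modes = {
    "intra16x16": (120.0, 40),  # low rate, higher distortion
    "intra4x4": (35.0, 160),    # high rate, low distortion
    "skip": (400.0, 1),
}
print(best_mode(modes, lam=0.5))  # rate is cheap -> intra4x4
print(best_mode(modes, lam=5.0))  # rate is expensive -> intra16x16
```

Sweeping λ traces out the operational rate-distortion curve of the coder, which is how such models are evaluated against a reference implementation.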

  19. Technique for anisotropic extension of organic crystals: Application to temperature dependence of electrical resistance

    Science.gov (United States)

    Yamamoto, Takashi; Kato, Reizo; Yamamoto, Hiroshi M.; Fukaya, Atsuko; Yamasawa, Kenji; Takahashi, Ichiro; Akutsu, Hiroki; Akutsu-Sato, Akane; Day, Peter

    2007-08-01

    We have developed a technique for the anisotropic extension of fragile molecular crystals. The pressure medium and the instrument, which extends the pressure medium, are both made from epoxy resin. Since the thermal contraction of our instrument is identical to that of the pressure medium, the strain applied to the pressure medium has no temperature dependence down to 2 K. Therefore, the degree of extension applied to the single crystal at low temperatures is uniquely determined from the degree of extension in the pressure medium and the thermal contractions of the epoxy resin and the single crystal at ambient pressure. Using this novel instrument, we have measured the temperature dependence of the electrical resistance of metallic, superconducting, and insulating materials. The experimental results are discussed from the viewpoint of the extension (compression) of the lattice constants along the parallel (perpendicular) direction.

  20. Epidural volume extension: A novel technique and its efficacy in high risk cases.

    Science.gov (United States)

    Tiwari, Akhilesh Kumar; Singh, Rajeev Ratan; Anupam, Rudra Pratap; Ganguly, S; Tomar, Gaurav Singh

    2012-01-01

    We present a unique case series, restricted to high-risk cases from different specialities, of patients who underwent successful surgery at our institute using the epidural volume extension technique with 1 mL of 0.5% ropivacaine and 25 μg of fentanyl. PMID:25885627

  1. A Knowledge-based and Extensible Aircraft Conceptual Design Environment

    Institute of Scientific and Technical Information of China (English)

    FENG Haocheng; LUO Mingqiang; LIU Hu; WU Zhe

    2011-01-01

    Design knowledge and experience are the basis for carrying out aircraft conceptual design tasks, owing to the high complexity and integration of tasks during this phase. When carrying out the same task, different designers may need individual strategies to fulfill their own demands. A knowledge-based and extensible method for building aircraft conceptual design systems is studied with the above requirements in mind. Based on this theory, a knowledge-based aircraft conceptual design environment with an open architecture, called the knowledge-based and extensible aircraft conceptual design environment (KEACDE), is built to enable designers to wrap add-on extensions and make their own aircraft conceptual design systems. The architecture, characteristics, and other design and development aspects of KEACDE are discussed. A civil airplane conceptual design system (CACDS) is achieved using KEACDE. Finally, a civil airplane design case is presented to demonstrate the usability and effectiveness of this environment.

  2. Designing and application of SAN extension interface based on CWDM

    Science.gov (United States)

    Qin, Leihua; Yu, Shengsheng; Zhou, Jingli

    2005-11-01

    As Fibre Channel (FC) becomes the protocol of choice within corporate data centers, enterprises are increasingly deploying SANs. In order to mitigate the risk of losing data and improve data availability, more and more enterprises are adopting storage extension technologies to replicate their business-critical data to a secondary site. Transmitting this information over distance requires a carrier-grade environment with zero data loss, scalable throughput, low jitter, high security, and the ability to travel long distances. To address these business requirements, there are three basic architectures for storage extension: storage over Internet Protocol, storage over Synchronous Optical Network/Synchronous Digital Hierarchy (SONET/SDH), and storage over Dense Wavelength Division Multiplexing (DWDM). Each approach varies in functionality, complexity, cost, scalability, security, availability, predictable behavior (bandwidth, jitter, latency), and multiple-carrier limitations. Compared with these connectivity technologies, Coarse Wavelength Division Multiplexing (CWDM) is a simplified, low-cost, and high-performance connectivity solution for enterprises deploying storage extension. In this paper, we design a storage extension connectivity over CWDM and test its electrical characteristics and the random read and write performance of a disk array through the CWDM connectivity; the test results show that the performance of the connectivity over CWDM is acceptable. Furthermore, we propose three kinds of network architecture for SAN extension based on the CWDM interface. Finally, the credit-based flow control mechanism of FC and the relationship between credits and extension distance are analyzed.
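The relationship between FC buffer-to-buffer credits and extension distance mentioned at the end of the abstract can be estimated with a back-of-envelope calculation: enough credits must be granted to keep full-size frames in flight over the link round trip. The parameter values below are typical assumptions (2 Gbps line rate, ~2148-byte maximum frame, light at roughly 2/3 c in fibre), not vendor specifications.

```python
# Back-of-envelope sketch: FC buffer-to-buffer (BB) credits needed to keep
# a long CWDM span fully utilised. All parameters are generic assumptions.

import math

def bb_credits_needed(distance_km, line_rate_gbps=2.0,
                      frame_bits=2148 * 8,
                      light_speed_km_s=200_000.0):
    round_trip_s = 2 * distance_km / light_speed_km_s  # frame out + R_RDY back
    frame_time_s = frame_bits / (line_rate_gbps * 1e9)  # serialisation time
    frames_in_flight = round_trip_s / frame_time_s
    return math.ceil(frames_in_flight)

print(bb_credits_needed(50))   # credits for a 50 km span
print(bb_credits_needed(100))  # roughly doubles with distance
```

With too few credits the sender stalls waiting for acknowledgements, so achievable throughput drops linearly with distance, which is why credit budgets bound practical SAN extension spans.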

  3. On the extension of the complex-step derivative technique to pseudospectral algorithms

    International Nuclear Information System (INIS)

    The complex-step derivative (CSD) technique is a convenient and highly accurate strategy to perform a linearized 'perturbation' analysis to determine a 'directional derivative' via a minor modification of an existing nonlinear simulation code. The technique has previously been applied to nonlinear simulation codes (such as finite-element codes) which employ real arithmetic only. The present note examines the suitability of this technique for extension to efficient pseudospectral simulation codes which nominally use the fast Fourier transform (FFT) to convert back and forth between the physical and transformed representations of the system. It is found that, if used carefully, this extension retains the remarkable accuracy of the CSD approach. However, to perform this extension without sacrificing this accuracy, particular care must be exercised; specifically, the state (real) and perturbation (imaginary) components of the complexified system must be transformed separately and arranged in such a manner that they are kept distinct during the process of differentiation in the transformed space in order to avoid the linear combination of the large and small quantities in the analysis. It is shown that this is relatively straightforward to implement even in complicated nonlinear simulation codes, thus positioning the CSD approach as an attractive and relatively simple alternative to hand coding a perturbation (a.k.a. 'tangent linear') code for determining the directional derivative even when pseudospectral algorithms are employed
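The CSD idea itself is compact enough to show in a few lines: perturb the input along the imaginary axis and read the directional derivative off the imaginary part, with no subtractive cancellation. This scalar illustration uses an invented test function and omits the note's pseudospectral caveat (keeping the real state and imaginary perturbation separate through the FFT), which only arises in transform-based codes.

```python
# Sketch of the complex-step derivative: f'(x) ~ Im(f(x + i*h)) / h.
# Because no difference of nearly equal numbers is formed, h can be tiny
# (e.g. 1e-30) and the result is accurate to machine precision.

import cmath
import math

def f(x):
    # Any analytic function implemented with complex-capable operations.
    return cmath.exp(x) * cmath.sin(x)

def csd(f, x, h=1e-30):
    return f(x + 1j * h).imag / h

x0 = 0.7
exact = math.exp(x0) * (math.sin(x0) + math.cos(x0))  # analytic derivative
print(abs(csd(f, x0) - exact))  # error at machine-precision level
```

Contrast with a forward finite difference, where shrinking h below ~1e-8 makes the result worse due to cancellation; that robustness is what makes CSD attractive to retrofit onto existing simulation codes.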

  4. Extensive techniques of TIPS in Budd-Chiari syndrome with occlusive hepatic veins

    International Nuclear Information System (INIS)

    Objective: To elaborate the improved technical steps of the transjugular intrahepatic portosystemic shunt (TIPS) and to evaluate its therapeutic effects in Budd-Chiari syndrome with occlusive hepatic veins. Methods: Eleven patients diagnosed with Budd-Chiari syndrome with widespread stenosis or occlusive lesions of the hepatic veins, verified by imaging, were treated with the improved TIPS methods. The key point of the extended TIPS technique was to design and build the access to an artificial hepatic vein. The changes in portal vein pressure, shunt blood flow, and stent patency after the procedure were assessed and followed up for 24 months. Results: The intrahepatic shunt between the portal vein and inferior vena cava was successfully built, and good clinical responses were obtained in all patients. The main portal pressure decreased from (4.62 ± 0.52) kPa (x-bar ± s) to (2.16 ± 0.21) kPa after the shunt. The maximum velocity of shunt blood flow was (56.2 ± 3.50) cm/s, and stent patency was 7/11 at 24 months after the procedure. Conclusion: The extended TIPS technique has a high success rate and is worthy of consideration as a new therapeutic means for Budd-Chiari syndrome with occlusive hepatic veins.

  5. Decomposition Techniques and Effective Algorithms in Reliability-Based Optimization

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1995-01-01

    The common problem of an extensive number of limit state function calculations in the various formulations and applications of reliability-based optimization is treated. It is suggested to use a formulation based on decomposition techniques so the nested two-level optimization problem can be solved...

  6. Support vector machines optimization based theory, algorithms, and extensions

    CERN Document Server

    Deng, Naiyang; Zhang, Chunhua

    2013-01-01

    Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs): classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which SVMs are built. The authors share insight on many of their research achievements. They give a precise interpretation of statistical learning theory for C-support vector classification. They also discuss regularized twi

  7. THE PRIMER EXTENSION TECHNIQUE FOR THE POLYMORPHISM DETECTION AT OVINE PRN-P LOCUS

    Directory of Open Access Journals (Sweden)

    VIORICA COSIER

    2013-12-01

    Full Text Available Scrapie is a prion disease with an endemic character in many parts of the globe, and control measures are difficult to apply because of the long incubation period, the lack of preclinical manifestations, and the limits of existing diagnostic tests for living animals. The Prn-p locus is polymorphic, with known variability at codons 136, 154, and 171, which are associated with different sensitivities in experimental and natural spongiform encephalopathies. In general, the possible combinations of the 5 amino acids encoded by the 3 different codons determine the existence of 15 possible genotypes. To reveal these polymorphisms at the ovine Prn-p locus, several methods have been developed, but the most accurate assays are direct sequencing of the gene and the primer extension technique. The purpose of this study was to determine the genotypes at the Prp locus in 123 males of the Tsurcana breed, Hateg ecotype, using the primer extension technique (ABI 3130xl Genetic Analyzer) and to establish risk groups for susceptibility to scrapie.

  8. THE PRIMER EXTENSION TECHNIQUE FOR THE POLYMORPHISM DETECTION AT OVINE PRN-P LOCUS

    Directory of Open Access Journals (Sweden)

    COSIER VIORICA

    2008-01-01

Full Text Available Scrapie is a prion disease with an endemic character in many parts of the globe, and control measures are difficult to apply because of the long incubation period, the lack of preclinical manifestations and the limitations of existing diagnostic tests for living animals. The Prn-p locus is polymorphic, with known variability at codons 136, 154 and 171, which is associated with different susceptibility in experimental and natural spongiform encephalopathies. In general, the possible combinations of the 5 amino acids encoded by the 3 different codons determine 15 possible genotypes. To detect these polymorphisms at the ovine Prn-p locus, several methods have been developed, but the most accurate assays are direct sequencing of the gene and the primer extension technique. The purpose of this study was to determine the genotypes at the PrP locus in 123 males of the Tsurcana breed, Hateg ecotype, using the primer extension technique (ABI 3130xl Genetic Analyzer) and to establish risk groups for susceptibility to scrapie disease.

  9. A useful technique for adjusting nasal tip projection in Asian rhinoplasty: Trapezoidal caudal extension cartilage grafting.

    Science.gov (United States)

    Liu, Shao-Cheng; Lin, Deng-Shan; Wang, Hsing-Won; Kao, Chuan-Hsiang

    2016-01-01

The purpose of this article is to present our experience with Asian patients in (1) using a trapezoidal caudal extension cartilage graft to adjust tip projection in tip refinement for augmentation rhinoplasty, especially for the correction of a short nose, and (2) avoiding the complications of augmentation rhinoplasty with alloplastic implants. We conducted a retrospective chart review of 358 rhinoplasties performed by the corresponding author from January 2004 through July 2009. Patients were included in this study if they had undergone open rhinoplasty with a trapezoidal caudal extension cartilage graft as the only tip-modifying procedure. Patients in whom any additional grafting was performed that might have altered the nasal tip position were excluded. The surgical results were analyzed in terms of the degree of satisfaction judged separately by investigators and by patients. A total of 84 patients (46 males and 38 females, all Asian, aged 13 to 61 years; mean: 29.3) met our eligibility criteria. Postoperative follow-up for 24 months was achieved in 62 patients. At the 24-month follow-up, the surgeons judged the results to be good or very good in 57 of the 62 patients (91.9%); at the same time, 56 patients (90.3%) said they were satisfied or very satisfied with their aesthetic outcome. Good nasal tip projection, a natural columellar appearance, and improvement in the nasolabial angle were achieved for most patients. Two patients required revision rhinoplasty to correct an insufficient augmentation and migration of the onlay graft. No severe complications were observed during the 2-year follow-up. We have found that trapezoidal caudal extension cartilage grafting in nasal tip refinement is an easy technique to learn and execute, its results are predictable, and it has been associated with no major complications. We recommend trapezoidal caudal extension cartilage grafting for Asian patients as a good and reliable alternative for managing tip projection.

  10. The Search for Extension: 7 Steps to Help People Find Research-Based Information on the Internet

    Science.gov (United States)

    Hill, Paul; Rader, Heidi B.; Hino, Jeff

    2012-01-01

    For Extension's unbiased, research-based content to be found by people searching the Internet, it needs to be organized in a way conducive to the ranking criteria of a search engine. With proper web design and search engine optimization techniques, Extension's content can be found, recognized, and properly indexed by search engines and…

  11. Project Milestone. Analysis of Range Extension Techniques for Battery Electric Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, Jeremy [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Pesaran, Ahmad [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2013-07-01

    This report documents completion of the July 2013 milestone as part of NREL’s Vehicle Technologies Annual Operating Plan with the U.S. Department of Energy. The objective was to perform analysis on range extension techniques for battery electric vehicles (BEVs). This work represents a significant advancement over previous thru-life BEV analyses using NREL’s Battery Ownership Model, FastSim,* and DRIVE.* Herein, the ability of different charging infrastructure to increase achievable travel of BEVs in response to real-world, year-long travel histories is assessed. Effects of battery and cabin thermal response to local climate, battery degradation, and vehicle auxiliary loads are captured. The results reveal the conditions under which different public infrastructure options are most effective, and encourage continued study of fast charging and electric roadway scenarios.

  12. CUSTOMER BASED BRAND EQUITY DETERMINANTS ON BRAND EXTENSION IN TELEVISION INDUSTRY

    OpenAIRE

    Vetrivel, V.; A.N.Solayappan; Jothi.Jayakrishnan

    2015-01-01

The aim of this paper is to find out the relationship between customer based brand equity and brand extension, and also to compare the selected customer based brand equity determinants of brand awareness, brand association, brand trust, customer satisfaction and CBBE. The structured questionnaire was distributed among the respondents following the simple random sampling technique. Questionnaires were distributed among 550 respondents, out of which only 517 were properly filled. Data was analyzed thr...

  13. Multihop Relay Techniques for Communication Range Extension in Near-Field Magnetic Induction Communication Systems

    Directory of Open Access Journals (Sweden)

    Mehrnoush Masihpour

    2013-05-01

Full Text Available In this paper, multihop relaying in RF-based communications and near field magnetic induction communication (NFMIC) is discussed. Three multihop relay strategies for NFMIC are proposed: Non Line of Sight Magnetic Induction Relay (NLoS-MI Relay), Non Line of Sight Master/Assistant Magnetic Induction Relay1 (NLoS-MAMI Relay1) and Non Line of Sight Master/Assistant Magnetic Induction Relay2 (NLoS-MAMI Relay2). In the first approach only one node contributes to the communication, while in the other two techniques (which are based on a master-assistant strategy), two relaying nodes are employed. This paper shows that these three techniques can be used to overcome the problem of dead spots within a body area network and extend the communication range without increasing the transmission power and the antenna size or decreasing receiver sensitivity. The impact of the separation distance between the nodes on the achievable RSS and channel data rate is evaluated for the three techniques. It is demonstrated that the technique which is most effective depends on the specific network topology. Optimum selection of nodes as relay master and assistant based on the location of the nodes is discussed. The paper also studies the impact of the quality factor on the achievable data rate. It is shown that to obtain the highest data rate, the optimum quality factor needs to be determined for each proposed cooperative communication method.

  14. A graph-based approach for designing extensible pipelines

    Directory of Open Access Journals (Sweden)

    Rodrigues Maíra R

    2012-07-01

    Full Text Available Abstract Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development such as workflow management systems focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism on the pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where the use of multiple software is necessary to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http
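The composition idea in this abstract (tools as directed graph edges, data formats as nodes, and any path through the graph as a pipeline) can be illustrated with a minimal sketch. The tool and format names below are hypothetical placeholders, not taken from the paper:

```python
from collections import deque

def find_pipeline(edges, source, target):
    """Breadth-first search over a conversion graph.

    edges: list of (tool_name, input_format, output_format) triples.
    Formats are graph nodes, tools are directed edges, and a path
    from source to target is a pipeline, compiled on demand."""
    adjacency = {}
    for tool, src, dst in edges:
        adjacency.setdefault(src, []).append((tool, dst))
    queue = deque([(source, [])])
    visited = {source}
    while queue:
        fmt, path = queue.popleft()
        if fmt == target:
            return path  # ordered list of tools to run
        for tool, nxt in adjacency.get(fmt, []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append((nxt, path + [tool]))
    return None  # no pipeline exists for this request

# Hypothetical format-conversion tools for illustration only.
tools = [
    ("vcf2plink", "vcf", "plink"),
    ("plink2eigen", "plink", "eigenstrat"),
    ("vcf2fasta", "vcf", "fasta"),
]
print(find_pipeline(tools, "vcf", "eigenstrat"))  # → ['vcf2plink', 'plink2eigen']
```

Adding a new tool is then just appending one triple; no existing pipeline definition has to be edited, which is the low-maintenance property the abstract emphasizes.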

  15. Nontraditional manufacturing technique-Nano machining technique based on SPM

    Institute of Scientific and Technical Information of China (English)

    DONG Shen; YAN Yongda; SUN Tao; LIANG Yingchun; CHENG Kai

    2004-01-01

Nano machining based on SPM is a novel, nontraditional advanced manufacturing technique. There are three main machining methods based on SPM, i.e. single atom manipulation, surface modification using physical or chemical actions, and mechanical scratching. The current development of this technique is summarized. Based on an analysis of the mechanical scratching mechanism, a 5 μm micro inflation hole is fabricated on the surface of an inertial confinement fusion (ICF) target. The processing technique is optimized. The machining properties of a brittle material, single crystal Ge, are investigated. A micro machining system combining SPM and a high accuracy stage is developed. Some 2D and 3D microstructures are fabricated using the system. This method has broad applications in the field of nano machining.

  16. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike;

    2012-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\\mathbb{F}_{q}$ “in the exponent” of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R...... obtain, in fact, an infinite hierarchy of progressively weaker assumptions whose complexities lie “between” DDH and CDH. This leads to a large number of new schemes because virtually all known DDH-based constructions can very easily be upgraded to be based on d-DDH. We use the same construction...... and security proof but get better security and moreover, the amortized complexity (e.g, computation per encrypted bit) is the same as when using DDH. We also show that d-DDH, just like DDH, is easy in bilinear groups. We therefore suggest a different type of assumption, the d-vector DDH problems (d...
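As a reminder of the baseline the abstract builds on, the standard DDH problem and the "extension ring in the exponent" idea can be sketched as follows. This is a reconstruction from the abstract's description, not the paper's exact statement of d-DDH:

```latex
\textbf{DDH.} In a group $G = \langle g \rangle$ of prime order $q$, distinguish
$(g, g^a, g^b, g^{ab})$ from $(g, g^a, g^b, g^c)$ for uniform $a, b, c \in \mathbb{F}_q$.

\textbf{$d$-DDH (sketch).} Replace the exponents by elements of a degree-$d$
extension ring $R = \mathbb{F}_q[X]/(f(X))$ with $\deg f = d$. An element
$r = r_0 + r_1 X + \cdots + r_{d-1} X^{d-1} \in R$ is encoded ``in the exponent''
coordinate-wise as $(g^{r_0}, \ldots, g^{r_{d-1}})$, and the task is again to
distinguish encodings of $(a, b, ab)$ from encodings of $(a, b, c)$.
```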

  17. The MIDAS Experiment: A New Technique for the Detection of Extensive Air Showers

    CERN Document Server

    Williams, C; Berlin, A; Bohacova, M; Facal, P; Genat, J F; Mills, E; Monasor, M; Privitera, P; Reyes, L C; d'Orfeuil, B Rouille; Wayne, S; Alekotte, I; Bertou, X; Bonifazi, C; Neto, J R T de Mello; Santos, E M; Alvarez-Muñiz, J; Carvalho, W; Zas, E

    2010-01-01

    Recent measurements suggest free electrons created in ultra-high energy cosmic ray extensive air showers (EAS) can interact with neutral air molecules producing Bremsstrahlung radiation in the microwave regime. The microwave radiation produced is expected to scale with the number of free electrons in the shower, which itself is a function of the energy of the primary particle and atmospheric depth. Using these properties a calorimetric measurement of the EAS is possible. This technique is analogous to fluorescence detection with the added benefit of a nearly 100% duty cycle and practically no atmospheric attenuation. The Microwave Detection of Air Showers (MIDAS) prototype is currently being developed at the University of Chicago. MIDAS consists of a 53 feed receiver operating in the 3.4 to 4.2 GHz band. The camera is deployed on a 4.5 meter parabolic reflector and is instrumented with high speed power detectors and autonomous FPGA trigger electronics. We present the current status of the MIDAS instrument and...

  18. An extension to the dynamic plane source technique for measuring thermal conductivity, thermal diffusivity, and specific heat of dielectric solids

    Science.gov (United States)

    Karawacki, Ernest; Suleiman, Bashir M.; ul-Haq, Izhar; Nhi, Bui-Thi

    1992-10-01

The recently developed dynamic plane source (DPS) technique for simultaneous determination of the thermal properties of fast thermally conducting materials with thermal conductivities between 200 and 2 W/mK has now been extended for studying relatively slow conducting materials with thermal conductivities equal to or below 2 W/mK. The method is self-checking since the thermal conductivity, thermal diffusivity, specific heat, and effusivity of the material are obtained independently of each other. The theory of the technique and the experimental arrangement are given in detail. The data evaluation procedure is simple and makes it possible to reveal the distortions due to nonideal experimental conditions. The extension to the DPS technique has been implemented at room temperature to study samples of cordierite-based ceramic Cecorite 130P (thermal conductivity equal to 1.48 W/mK), rubber (0.403 W/mK), and polycarbonate (0.245 W/mK). The accuracy of the method is within ±5%.

  19. Extensibility in Model-Based Business Process Engines

    Science.gov (United States)

    Sánchez, Mario; Jiménez, Camilo; Villalobos, Jorge; Deridder, Dirk

    An organization’s ability to embrace change, greatly depends on systems that support their operation. Specifically, process engines might facilitate or hinder changes, depending on their flexibility, their extensibility and the changes required: current workflow engine characteristics create difficulties in organizations that need to incorporate some types of modifications. In this paper we present Cumbia, an extensible MDE platform to support the development of flexible and extensible process engines. In a Cumbia process, models represent participating concerns (control, resources, etc.), which are described with concern-specific languages. Cumbia models are executed in a coordinated way, using extensible engines specialized for each concern.

  20. Extension Clientele Preferences: Accessing Research-Based Information Online

    Science.gov (United States)

    Davis, Jamie M.

    2014-01-01

    Research has indicated there are a number of benefits to Extension educators in delivering educational program and content through distance technology methods. However, Extension educators are commonly apprehensive about this transition due to assumptions made about their clientele, because little research has been conducted to examine…

  1. The Liquisolid Technique: Based Drug Delivery System

    OpenAIRE

    Izhar Ahmed Syed; E. Pavani

    2012-01-01

The “Liquisolid” technique is a novel and promising approach to solubility enhancement and dissolution improvement, thereby increasing bioavailability. It contains liquid medications in powdered form. This technique is an efficient method for formulating water insoluble and water soluble drugs. It is based upon the admixture of drug-loaded solutions with appropriate carrier and coating materials. The use of a non-volatile solvent causes improved wettability...

  2. Tandem Rhomboid Flap Repair: A New Technique in Treatment of Extensive Pilonidal Disease of the Natal Cleft

    Science.gov (United States)

    Kumar M, Kamal; Babu K, Ramesh; Dhanraj, Prema

    2014-01-01

Pilonidal sinus is an annoying chronic benign disease causing disability in young adults, mainly affecting the intergluteal furrow. Treatment of this condition remains controversial, and a myriad of techniques is available. Most of the techniques are judged against open excision and secondary healing in terms of minimizing disease recurrence and patient discomfort. More recently, the superiority of flap reconstruction over non-flap techniques has been accepted. An ideal operation should be simple, associated with minimal pain and wound care after surgery, minimize hospital stay and have a low recurrence rate. We hereby present a new type of rhomboid flap technique for extensive pilonidal sinus disease. This technique has given good results in our hands considering the aforementioned factors of an ideal operation. The following case report is of our first stint with the procedure. PMID:25386481

  3. A Conformal Extension Theorem based on Null Conformal Geodesics

    CERN Document Server

    Lübbe, Christian

    2008-01-01

In this article we describe the formulation of null geodesics as null conformal geodesics and their description in the tractor formalism. A conformal extension theorem through an isotropic singularity is proven by requiring the boundedness of the tractor curvature and its derivatives to sufficient order along a congruence of null conformal geodesics. This article extends earlier work by Tod and Lübbe.

  4. A graph-based approach for designing extensible pipelines

    OpenAIRE

    Rodrigues Maíra R; Magalhães Wagner CS; Machado Moara; Tarazona-Santos Eduardo

    2012-01-01

    Abstract Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline d...

  5. Extensive Characterisation of Copper-clad Plates, Bonded by the Explosive Technique, for ITER Electrical Joints

    CERN Document Server

    Langeslag, S A E; Libeyre, P; Gung, C Y

    2015-01-01

Cable-in-conduit conductors will be extensively implemented in the large superconducting magnet coils foreseen to confine the plasma in the ITER experiment. The design of the various magnet systems imposes the use of electrical joints to connect unit lengths of superconducting coils by inter-pancake coupling. These twin-box lap-type joints, produced by compacting each cable end into a copper - stainless steel bimetallic box, are required to perform highly in terms of electrical and mechanical properties. To ascertain the suitability of the first copper-clad plates, recently produced, the performance of several plates is studied. Validation of the bonded interface is carried out by determining microstructural, tensile and shear characteristics. These measurements confirm the suitability of explosion-bonded copper-clad plates for an overall joint application. Additionally, an extensive study is conducted on the suitability of certain copper purity grades for the various joint types.

  6. Flow Modeling Based Wall Element Technique

    Directory of Open Access Journals (Sweden)

    Sabah Tamimi

    2012-08-01

Full Text Available Two types of flow were examined: pressure-driven flow, and a combination of pressure-driven and Couette flow, for confined turbulent flow with a one-equation model used to depict the turbulent viscosity of confined flow in a smooth straight channel. A finite element technique based on a zone close to a solid wall was adopted for predicting the distribution of the pertinent variables in this zone, and was examined even in the case when the near-wall zone was extended away from the wall. The imposed technique has been validated and compares well with other techniques.

  7. Extension of the Viscous Collision Limiting Direct Simulation Monte Carlo Technique to Multiple Species

    Science.gov (United States)

    Liechty, Derek S.; Burt, Jonathan M.

    2016-01-01

There are many flow fields that span a wide range of length scales, where regions of both rarefied and continuum flow exist and neither direct simulation Monte Carlo (DSMC) nor computational fluid dynamics (CFD) provides the appropriate solution everywhere. Recently, a new viscous collision limited (VCL) DSMC technique was proposed to incorporate effects of physical diffusion into collision limiter calculations, to make the low Knudsen number regime normally limited to CFD more tractable for an all-particle technique. This original work was derived for a single-species gas. The current work extends the VCL-DSMC technique to gases with multiple species. Similar derivations were performed to equate numerical and physical transport coefficients, but a more rigorous treatment of determining the mixture viscosity is applied. In the original work, consideration was given to internal energy non-equilibrium, and this is also extended in the current work to chemical non-equilibrium.

  8. A Shape Based Image Search Technique

    Directory of Open Access Journals (Sweden)

    Aratrika Sarkar

    2014-08-01

Full Text Available This paper describes an interactive application we have developed based on a shape-based image retrieval technique. The key concepts described in the project are (i) matching of images based on contour matching; (ii) matching of images based on edge matching; (iii) matching of images based on pixel matching of colours. Further, the application facilitates matching of images invariant to transformations such as (i) translation; (ii) rotation; (iii) scaling. A key feature of the system is that it graphically shows the percentage unmatched of the uploaded image with respect to the images already existing in the database, while the integrity of the system lies in the unique matching techniques used for optimum results. This increases the accuracy of the system. For example, when a user uploads an image, say an image of a mango leaf, the application shows all mango leaves present in the database as well as other leaves matching the colour and shape of the uploaded mango leaf.
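The "percentage unmatched" readout described above can be sketched as a pixel-wise colour comparison between two equal-sized images. This is an illustrative reconstruction under that assumption, not the application's actual matching algorithm:

```python
def percent_unmatched(query, reference):
    """Pixel-wise colour comparison between two equal-sized images,
    each represented as a flat list of (r, g, b) tuples; returns the
    percentage of pixels whose colours differ."""
    if len(query) != len(reference):
        raise ValueError("images must have the same dimensions")
    mismatches = sum(1 for q, r in zip(query, reference) if q != r)
    return 100.0 * mismatches / len(query)

# Toy example: a stored "mango leaf" vs. a probe differing in 2 of 8 pixels.
leaf = [(34, 139, 34)] * 8
probe = [(34, 139, 34)] * 6 + [(139, 69, 19)] * 2
print(percent_unmatched(probe, leaf))  # → 25.0
```

A real system would compare in a transformation-invariant way (after normalizing translation, rotation and scale, as the abstract notes); the sketch only shows the final per-pixel scoring step.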

  9. Some Novel Solidification Processing Techniques Being Investigated at MSFC: Their Extension for Study Aboard the ISS

    Science.gov (United States)

    Grugel, R. N.; Anilkumar, A. V.; Fedoseyev, A. I.; Mazuruk, K.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

The float-zone and the Bridgman techniques are two classical directional solidification processing methods that are used to improve materials properties. Unfortunately, buoyancy effects and gravity-driven convection due to unstable temperature and/or composition gradients still produce solidified products that exhibit segregation and, consequently, degraded properties. This presentation will briefly introduce how some novel processing applications can minimize detrimental gravitational effects and enhance microstructural uniformity. A discussion follows of how fully understanding and modeling these procedures requires utilizing, in conjunction with a novel mixing technique, the facilities and quiescent microgravity environment available on the ISS.

  10. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seyong [ORNL; Vetter, Jeffrey S [ORNL

    2014-01-01

Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-source OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.

  11. Dataset Quality Assessment: An extension for analogy based effort estimation

    Directory of Open Access Journals (Sweden)

    Mohammad Azzeh

    2013-03-01

Full Text Available Estimation by Analogy (EBA) is an increasingly active research method in the area of software engineering. The fundamental assumption of this method is that projects that are similar in terms of attribute values will also be similar in terms of effort values. It is well recognized that the quality of software datasets has a considerable impact on the reliability and accuracy of such a method. Therefore, if the software dataset does not satisfy the aforementioned assumption then it is not particularly useful for the EBA method. This paper presents a new method based on Kendall's row-wise rank correlation that enables data quality evaluation and provides a data pre-processing stage for EBA. The proposed method provides a sound statistical basis and justification for the process of data quality evaluation. Unlike Analogy-X, our method has the ability to deal with categorical attributes individually without the need for partitioning the dataset. Experimental results showed that the proposed method could form a useful extension for EBA as it enables: dataset quality evaluation, attribute selection and identifying abnormal observations.
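The statistical core the abstract relies on, Kendall's rank correlation, can be computed directly. The sketch below is the standard tie-free tau over two value sequences, a simplified stand-in for the paper's row-wise procedure:

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall rank correlation (tau-a, no tie correction) between two
    equal-length sequences, e.g. project attribute values vs. effort."""
    if len(x) != len(y) or len(x) < 2:
        raise ValueError("need two equal-length sequences of length >= 2")
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])  # same ordering in both sequences?
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n_pairs = len(x) * (len(x) - 1) // 2
    return (concordant - discordant) / n_pairs

# Rankings that agree perfectly give tau = 1.0; reversed rankings give -1.0.
print(kendall_tau([3, 1, 4, 2], [30, 10, 40, 20]))  # → 1.0
```

Under the EBA assumption, projects whose attribute rankings track their effort rankings should yield tau near 1; rows that break this pattern are candidates for the abnormal observations the abstract mentions.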

  12. Design and Implementation of Agro-technical Extension Information System Based on Cloud Storage

    OpenAIRE

Guo, Leifeng; Wang, Wensheng; Yang, Yong; Sun, Zhiguo

    2013-01-01

International audience In order to solve the problems of low efficiency and backward methods in agro-technical extension activities, this paper designs an agro-technical extension information system based on cloud storage technology. The paper studies key technologies such as the cloud storage service engine, the cloud storage management node and the cloud storage data node, and designs the overall architecture of the agro-technical extension information system based on cloud storage techno...

  13. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

A method that incorporates an edge detection technique, Markov Random Field (MRF) modelling, watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. The region process is then modelled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is used. The DIS is calculated for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about likely region boundaries for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
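The Difference In Strength map described above can be approximated with a crude local-contrast sketch: score each interior pixel by its largest intensity difference to a 4-neighbour. The neighbourhood rule here is an assumption for illustration, not the paper's exact DIS definition:

```python
def dis_map(image):
    """Crude 'Difference In Strength' sketch: for each interior pixel,
    the maximum absolute intensity difference to its four neighbours.
    image is a list of equal-length rows of grey levels; border pixels
    are left at 0."""
    rows, cols = len(image), len(image[0])
    out = [[0] * cols for _ in range(rows)]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            centre = image[i][j]
            out[i][j] = max(abs(centre - image[i + di][j + dj])
                            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
    return out

# A bright square on a dark background: high DIS along the boundary,
# zero DIS inside uniform regions.
img = [[0, 0, 0, 0],
       [0, 0, 255, 255],
       [0, 0, 255, 255],
       [0, 0, 0, 0]]
print(dis_map(img)[1][1])  # → 255
```

Thresholding such a map into "weak" and "strong" responses is what lets the later MRF stage treat high-DIS pixels as likely region boundaries.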

  14. Rapid Tooling Technique Based on Stereolithograph Prototype

    Institute of Scientific and Technical Information of China (English)

    丁浩; 狄平; 顾伟生; 朱世根

    2001-01-01

Rapid tooling technique based on the stereolithograph prototype is investigated. The epoxy tooling technological process is elucidated. The epoxy resin formula (ease of casting, the curing process, and release agents) is analyzed in detail. The transitional plaster model is also proposed. A mold to encapsulate mutual inductors in epoxy and a mold to injection-mold plastic soapboxes were made with the technique. The tooling needs very little time and cost, for the process is only to achieve a nice replica of the prototype. It is beneficial for trial and small-batch production.

  15. Development of CDMS-II Surface Event Rejection Techniques and Their Extensions to Lower Energy Thresholds

    Energy Technology Data Exchange (ETDEWEB)

    Hofer, Thomas James [Univ. of Minnesota, Minneapolis, MN (United States)

    2014-12-01

The CDMS-II phase of the Cryogenic Dark Matter Search, a dark matter direct-detection experiment, was operated at the Soudan Underground Laboratory from 2003 to 2008. The full payload consisted of 30 ZIP detectors, totaling approximately 1.1 kg of Si and 4.8 kg of Ge, operated at temperatures of 50 mK. The ZIP detectors read out both ionization and phonon pulses from scatters within the crystals; channel segmentation and analysis of pulse timing parameters allowed effective fiducialization of the crystal volumes and background rejection sufficient to set world-leading limits at the times of their publications. A full re-analysis of the CDMS-II data was motivated by an improvement in the event reconstruction algorithms which improved the resolution of ionization energy and timing information. The Ge data were re-analyzed using three distinct background-rejection techniques; the Si data from runs 125 - 128 were analyzed for the first time using the most successful of the techniques from the Ge re-analysis. The results of these analyses prompted a novel "mid-threshold" analysis, wherein energy thresholds were lowered but background rejection using phonon timing information was still maintained. This technique proved to have significant discrimination power, maintaining adequate signal acceptance and minimizing background leakage. The primary background for CDMS-II analyses comes from surface events, whose poor ionization collection makes them difficult to distinguish from true nuclear recoil events. The novel detector technology of SuperCDMS, the successor to CDMS-II, uses interleaved electrodes to achieve full ionization collection for events occurring at the top and bottom detector surfaces. This, along with dual-sided ionization and phonon instrumentation, allows for excellent fiducialization and relegates the surface-event rejection techniques of CDMS-II to a secondary level of background discrimination.
Current and future SuperCDMS results hold great promise for mid- to low

  16. Extension and application of a scaling technique for duplication of in-flight aerodynamic heat flux in ground test facilities

    NARCIS (Netherlands)

    Veraar, R.G.

    2009-01-01

    To enable direct experimental duplication of the in-flight heat flux distribution on supersonic and hypersonic vehicles, an aerodynamic heating scaling technique has been developed. The scaling technique is based on the analytical equations for convective heat transfer for laminar and turbulent boundary layers.

  17. An Extensible, Kinematically-Based Gesture Annotation Scheme

    OpenAIRE

    Martell, Craig H.

    2002-01-01

    Chapter 1 in the book: Advances in Natural Multimodal Dialogue Systems. Annotated corpora have played a critical role in speech and natural language research, and there is an increasing interest in corpora-based research in sign language and gesture as well. We present a non-semantic, geometrically-based annotation scheme, FORM, which allows an annotator to capture the kinematic information in a gesture just from videos of speakers. In addition, FORM stores this gestural in...

  18. The Liquisolid Technique: Based Drug Delivery System

    Directory of Open Access Journals (Sweden)

    Izhar Ahmed Syed

    2012-04-01

    Full Text Available The “Liquisolid” technique is a novel and promising approach to solubility enhancement and dissolution improvement, thereby increasing bioavailability. It contains liquid medications in powdered form. The technique is an efficient method for formulating both water-insoluble and water-soluble drugs, and is based upon the admixture of drug-loaded solutions with appropriate carrier and coating materials. The use of a non-volatile solvent improves wettability, ensures molecular dispersion of the drug in the formulation, and enhances solubility. By using hydrophobic carriers (non-volatile solvents), drug release can be modified (sustained release). Liquisolid systems are characterized by flow behavior, wettability, powder bed hydrophilicity, saturation solubility, drug content, differential scanning calorimetry, Fourier transform infrared spectroscopy, powder X-ray diffraction, scanning electron microscopy, in-vitro release and in-vivo evaluation. With this technique, solubility and dissolution rate can be improved, and sustained drug delivery systems can be developed for water-soluble drugs.

  19. Extensive Taguchi's Quality Loss Function Based On Asymmetric tolerances

    Institute of Scientific and Technical Information of China (English)

    ZHU Wei; LI Yuan-sheng; LIU Feng

    2004-01-01

    If the specification interval is asymmetric, the basic specification is the target value of the quality characteristic. In this paper, Taguchi's quality loss function is applied to describe quality loss based on asymmetric tolerances. A measurement of the quality loss caused by the deviation of the quality characteristic from the basic specification is further presented.
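
The asymmetric loss the abstract describes can be sketched in a few lines. In this hedged example, the target m and the two coefficients k1, k2 (penalizing deviations below and above the basic specification differently) are invented for illustration, not taken from the paper:

```python
def taguchi_loss(y, m, k1, k2):
    """Asymmetric Taguchi quality loss: a different coefficient on
    each side of the target (basic specification) m."""
    dev = y - m
    return k1 * dev * dev if dev < 0 else k2 * dev * dev

# Hypothetical target and coefficients: undershooting is penalized
# more heavily than overshooting.
m, k1, k2 = 10.0, 4.0, 1.0
print(taguchi_loss(9.5, m, k1, k2))   # below target: 4 * 0.25 = 1.0
print(taguchi_loss(10.5, m, k1, k2))  # above target: 1 * 0.25 = 0.25
print(taguchi_loss(10.0, m, k1, k2))  # on target: no loss -> 0.0
```

With symmetric tolerances the two coefficients coincide and the classical quadratic loss is recovered.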

  20. Interactive early warning technique based on SVDD

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    After reviewing current research on early warning, it was found that "bad" data for some systems are not easy to obtain, which makes the methods proposed in that research unsuitable for the monitored systems. An interactive early warning technique based on SVDD (support vector data description) is proposed that adopts "good" data as samples, overcoming the difficulty of obtaining "bad" data. The process consists of two parts: (1) a hypersphere is fitted to the "good" data using SVDD; if a data object falls outside the hypersphere, it is taken as "suspicious"; (2) a group of experts decides whether the suspicious data is "bad" or "good", and early warning messages are issued according to the decisions. The detailed implementation process is also proposed. Finally, an experiment based on data from a macroeconomic system is conducted to verify the proposed technique.
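
The two-part process can be sketched with a drastically simplified stand-in for SVDD: a hypersphere whose center is the centroid of the "good" samples and whose radius is the largest center-to-sample distance (real SVDD solves a quadratic program for a soft minimal enclosing sphere). All data below are invented:

```python
import math

def fit_hypersphere(good):
    """Crude stand-in for SVDD: center at the centroid of the 'good'
    samples, radius equal to the largest center-to-sample distance."""
    dim = len(good[0])
    center = [sum(p[i] for p in good) / len(good) for i in range(dim)]
    radius = max(math.dist(p, center) for p in good)
    return center, radius

def is_suspicious(x, center, radius):
    """Step (1): flag points falling outside the fitted hypersphere.
    Step (2) - expert review of the flagged points - happens offline."""
    return math.dist(x, center) > radius

good = [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9), (1.0, 0.95)]
center, radius = fit_hypersphere(good)
print(is_suspicious((1.0, 1.0), center, radius))  # inside -> False
print(is_suspicious((5.0, 5.0), center, radius))  # far outside -> True
```

The sketch only illustrates the "flag, then ask experts" flow; it has none of SVDD's robustness to outliers in the training set.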

  1. MATRIX BASED INDEXING TECHNIQUE FOR VIDEO DATA

    Directory of Open Access Journals (Sweden)

    Devarj Saravanan

    2013-01-01

    Full Text Available As media usage increases, video plays a central role because it supports various applications. Video is a particularly complex medium, containing collections of objects such as audio, motion, text, color and pictures. Due to the rapid growth of this information, a video indexing process is mandatory for fast and effective retrieval. Many current indexing techniques fail to extract the needed image from the stored data set based on the user's query, so urgent attention in the field of video indexing and image retrieval is the need of the hour. Here a new matrix-based indexing technique for image retrieval is proposed. The proposed method provides better results, as the experiments prove.

  2. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

    Tag clouds are a navigation aid for exploring documents; they also link documents through user-defined terms. We explore various graph-based techniques to improve tag cloud generation. Moreover, we introduce relevance measures based on underlying data, such as ratings or citation counts, for improved measurement of the relevance of tag clouds. We show that, on the given data sets, our approach outperforms the state-of-the-art baseline methods with respect to such relevance by 41% on the Movielens dataset and by 11% on the Bibsonomy data set.
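
The idea of relevance measures based on underlying data can be sketched as a score that blends a tag's frequency with the ratings of the documents it tags. The data and the 50/50 blend below are invented, not the paper's measures:

```python
# Each tagged document carries a rating; a tag's relevance blends how
# often it is used with the mean rating of the documents it tags.
docs = [
    {"tags": {"python", "tutorial"}, "rating": 4.5},
    {"tags": {"python", "web"}, "rating": 3.0},
    {"tags": {"web"}, "rating": 2.0},
]

def tag_scores(docs, alpha=0.5):
    freq, rating_sum = {}, {}
    for d in docs:
        for t in d["tags"]:
            freq[t] = freq.get(t, 0) + 1
            rating_sum[t] = rating_sum.get(t, 0.0) + d["rating"]
    max_freq = max(freq.values())
    max_rating = max(d["rating"] for d in docs)
    # Normalized frequency blended with normalized mean rating.
    return {
        t: alpha * freq[t] / max_freq
           + (1 - alpha) * (rating_sum[t] / freq[t]) / max_rating
        for t in freq
    }

scores = tag_scores(docs)
cloud = sorted(scores, key=scores.get, reverse=True)
print(cloud)  # ['python', 'web', 'tutorial']
```

A frequency-only cloud would rank "python" and "web" equally; the rating signal breaks the tie in favor of the better-rated tag.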

  3. Laser Remote Sensing: Velocimetry Based Techniques

    Science.gov (United States)

    Molebny, Vasyl; Steinvall, Ove

    Laser-based velocity measurement is an area of the field of remote sensing where the coherent properties of laser radiation are the most exposed. Much of the published literature deals with the theory and techniques of remote sensing. We restrict our discussion to current trends in this area, gathered from recent conferences and professional journals. Remote wind sensing and vibrometry are promising in their new scientific, industrial, military, and biomedical applications, including improving flight safety, precise weapon correction, non-contact mine detection, optimization of wind farm operation, object identification based on its vibration signature, fluid flow studies, and vibrometry-associated diagnosis.

  4. Implementation of the three-field electron wraparound technique for extensive recurrent chest wall carcinoma: dosimetric and clinical considerations.

    Science.gov (United States)

    Norris, M

    1991-09-01

    Treatment of extensive recurrent chest wall carcinoma is a challenge for the radiation oncologist as well as the physics team responsible for setup, computer planning, and daily reproducibility. While electron arc therapy is desirable, unfortunately, most sites do not have this capability. The alternative method of treatment discussed here involves the use of a three-field electron wraparound technique for the chest wall when electron arc therapy is not available. This technique yields an excellent alternative treatment modality with flexibility to accommodate multiple electron energies to compensate for varying chest wall thickness. An additional anterior photon beam is used when skin lesions extend superiorly to the clavicle and along the proximal aspect of the arm. Computerized tomography (CT) interfaced radiotherapy computer planning is used to precisely calculate the sequential gantry angles, skin gaps for adjacent electron fields, and the appropriate junction moves to create a feathering effect of all overlap areas. Treatment aids include extensive shaping of electron and photon fields and the application of bolus material on all four fields. A Smithers Medical Products' Alpha Cradle is used to make this intricate setup possible, providing patient comfort and daily reproducibility for a more efficient treatment. PMID:1910473

  5. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    Science.gov (United States)

    Gaydos, Leonard

    1978-01-01

    Computer-aided techniques for interpreting multispectral data acquired by Landsat offer economies in the mapping of land cover. Even so, the actual establishment of the statistical classes, or "signatures," is one of the relatively more costly operations involved. Analysts have therefore been seeking cost-saving signature extension techniques that would accept training data acquired for one time or place and apply them to another. Opportunities to extend signatures occur in preprocessing steps and in the classification steps that follow. In the present example, land cover classes were derived by the simplest and most direct form of signature extension: Classes statistically derived from a Landsat scene for the Puget Sound area, Wash., were applied to the Portland area, Oreg., using data for the next Landsat scene acquired less than 25 seconds down orbit. Many features can be recognized on the reduced-scale version of the Portland land cover map shown in this report, although no statistical assessment of its accuracy is available.
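
In its simplest form, signature extension amounts to reusing class statistics from one scene to classify another. A hedged sketch with invented band values, using a nearest-centroid rule as a stand-in for the statistical classifier:

```python
import math

# "Signatures": per-class mean band values derived from a training
# scene (numbers are invented for illustration).
signatures = {
    "water":  (10.0, 8.0, 5.0),
    "forest": (30.0, 45.0, 20.0),
    "urban":  (60.0, 55.0, 50.0),
}

def classify(pixel, signatures):
    """Assign a pixel of a *different* scene to the nearest signature
    centroid - the simplest form of signature extension."""
    return min(signatures, key=lambda c: math.dist(pixel, signatures[c]))

# Pixels from the "next scene down orbit":
next_scene = [(11.0, 9.0, 6.0), (58.0, 50.0, 47.0)]
print([classify(p, signatures) for p in next_scene])  # ['water', 'urban']
```

The approach works only insofar as atmospheric and illumination conditions match between the two scenes, which is why the article stresses scenes acquired seconds apart on the same orbit.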

  6. Performance Evaluation of Extension Education Centers in Universities Based on the Balanced Scorecard

    Science.gov (United States)

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-01-01

    This study aims at developing a set of appropriate performance evaluation indices mainly based on balanced scorecard (BSC) for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts who have real practical experiences in extension education, adequate performance…

  7. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike;

    2011-01-01

    …generalized to use instead d-DDH, and we show in the generic group model that d-DDH is harder than DDH. This means that virtually any application of DDH can now be realized with the same (amortized) efficiency, but under a potentially weaker assumption. On the negative side, we also show that d-DDH, just like DDH, is easy in bilinear groups. This motivates our suggestion of a different type of assumption, the d-vector DDH problems (VDDH), which are based on f(X) = X^d, but with a twist to avoid the problems with reducible polynomials. We show in the generic group model that VDDH is hard in bilinear groups and that in fact the problems become harder with increasing d and hence form an infinite hierarchy. We show that hardness of VDDH implies CCA-secure encryption, efficient Naor-Reingold style pseudorandom functions, and auxiliary input secure encryption, a strong form of leakage resilience. …

  8. Resection of giant ethmoid osteoma with orbital and skull base extension followed by duraplasty

    Directory of Open Access Journals (Sweden)

    Ferekidou Eliza

    2008-10-01

    Full Text Available Abstract Background Osteomas of the ethmoid sinus are rare, especially when they involve the anterior skull base and orbit and lead to ophthalmologic and neurological symptoms. Case presentation The present case describes a giant ethmoid osteoma. The patient's symptoms and signs were exophthalmos and proptosis of the left eye, with progressive visual acuity impairment and visual field defects. CT/MRI scanning demonstrated a huge osseous lesion of the left ethmoid sinus (6.5 cm × 5 cm × 2.2 cm), extending laterally into the orbit and cranially up to the anterior skull base. Bilateral extensive polyposis was also found. Endoscopic and external techniques were combined to remove the lesion. Bilateral endoscopic polypectomy, anterior and posterior ethmoidectomy and middle meatus antrostomy were performed. Finally, the remaining part of the tumor was reached and dissected from the surrounding tissue via a minimally invasive Lynch incision around the left middle canthus. During surgery, CSF rhinorrhea was observed; the leak was grafted with fascia lata and coated with bio-glue. Postoperatively, the symptoms disappeared. Eighteen months after surgery, the patient is still free of symptoms. Conclusion Before management of ethmoid osteomas with intraorbital and skull base extension, a thorough neurological, ophthalmological and imaging evaluation is required, in order to define the borders of the tumor, carefully survey the severity of symptoms and signs, and precisely plan the optimal treatment. The endoscopic procedure can constitute an important part of surgery undertaken for giant ethmoidal osteomas. In addition, surgeons always have to take into account a possible CSF leak and have to be prepared to resolve it.

  9. Language Based Techniques for Systems Biology

    DEFF Research Database (Denmark)

    Pilegaard, Henrik

    Process calculus is the common denominator for a class of compact, idealised, domain-specific formalisms normally associated with the study of reactive concurrent systems within Computer Science. With the rise of the interaction-centred science of Systems Biology, a number of bio-inspired process calculi have similarly been used for the study of bio-chemical reactive systems. In this dissertation it is argued that techniques rooted in the theory and practice of programming languages, language based techniques if you will, constitute a strong basis for the investigation of models of biological systems as formalised in a process calculus. In particular it is argued that Static Program Analysis provides a useful approach to the study of qualitative properties of such models. In support of this claim, a number of static program analyses are developed for Regev's BioAmbients, a bio-inspired variant…

  10. XSemantic: An Extension of LCA Based XML Semantic Search

    Science.gov (United States)

    Supasitthimethee, Umaporn; Shimizu, Toshiyuki; Yoshikawa, Masatoshi; Porkaew, Kriengkrai

    One of the most convenient ways to query XML data is keyword search, because it does not require any knowledge of the XML structure or learning a new user interface. However, keyword search is ambiguous: users may use different terms to search for the same information, and it is difficult for a system to decide which node should be chosen as a return node and how much information should be included in the result. To address these challenges, we propose an XML semantic search based on keywords, called XSemantic. On the one hand, we give three definitions to complete the search in terms of semantics. First, with semantic term expansion, our system is robust to ambiguous keywords by using a domain ontology. Second, to return semantically meaningful answers, we automatically infer the return information from the user queries and take advantage of the shortest path to return meaningful connections between keywords. Third, we present a semantic ranking that reflects the degree of similarity as well as the semantic relationship, so that search results with higher relevance are presented to the users first. On the other hand, as in the LCA and proximity search approaches, we investigated the problem of the information included in the search results. We therefore introduce the notion of the Lowest Common Element Ancestor (LCEA) and define a simple rule without any requirement for schema information such as a DTD or XML Schema. The first experiment indicated that XSemantic not only properly infers the return information but also generates compact, meaningful results. The benefits of our proposed semantics are demonstrated by the second experiment.
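
The LCA notion that XSemantic builds on can be illustrated with a few lines of Python (the sample document is invented, and the paper's refined LCEA rule is not reproduced here):

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<bib><book><title>XML Search</title><author>Kim</author></book>"
    "<book><title>Databases</title><author>Lee</author></book></bib>"
)

def path_to(root, target):
    """Return the list of elements from root down to target."""
    if root is target:
        return [root]
    for child in root:
        sub = path_to(child, target)
        if sub:
            return [root] + sub
    return []

def lca(root, a, b):
    """Lowest common ancestor: last shared element of the two root paths."""
    pa, pb = path_to(root, a), path_to(root, b)
    common = None
    for x, y in zip(pa, pb):
        if x is y:
            common = x
    return common

# Keyword matches "XML Search" (a title) and "Kim" (an author):
title = doc.find("book/title")
author = doc.find("book/author")
print(lca(doc, title, author).tag)  # both live under the first <book>
```

For keywords matching nodes in *different* books, the LCA climbs to `<bib>`, which is exactly the over-broad answer that LCA refinements such as the paper's LCEA aim to tighten.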

  11. Knowledge-based techniques in software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Jairam, B.N.; Agarwal, A.; Emrich, M.L.

    1988-05-04

    Recent trends in software engineering research focus on the incorporation of AI techniques. The feasibility of an overlap between AI and software engineering is examined. The benefits of merging the two fields are highlighted. The long-term goal is to automate the software development process. Some projects being undertaken towards the attainment of this goal are presented as examples. Finally, research on the Oak Ridge Reservation aimed at developing a knowledge-based software project management aid is presented. 25 refs., 1 tab.

  12. On HTML and XML based web design and implementation techniques

    International Nuclear Information System (INIS)

    Web implementation is truly a multidisciplinary field, with influences from programming, the choice of scripting languages, graphic design, user interface design, and database design. The challenge for a Web designer/implementer is the ability to create an attractive and informative Web site. To work with the universal framework and link diagrams from the design process, as well as the Web specifications and domain information, it is essential to create Hypertext Markup Language (HTML) or other software and multimedia to accomplish the Web's objective. In this article we discuss Web design standards and the techniques involved in Web implementation based on HTML and Extensible Markup Language (XML). We also discuss the advantages and disadvantages of HTML relative to its successor XML in designing and implementing a Web site. We have developed two Web pages, one utilizing the features of HTML and the other based on the features of XML, to carry out the present investigation. (author)
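
The HTML-versus-XML contrast the article investigates often comes down to separating content from presentation. A toy sketch (document structure and names invented): the data lives in XML and an HTML view is generated from it, whereas a pure-HTML page would interleave the two:

```python
import xml.etree.ElementTree as ET

# Content kept in XML: structure and data only, no presentation.
xml_doc = "<articles><article><title>On HTML and XML</title></article></articles>"
root = ET.fromstring(xml_doc)

# Presentation generated separately as HTML from the XML content.
items = "".join(
    f"<li>{a.findtext('title')}</li>" for a in root.iter("article")
)
html_page = f"<html><body><ul>{items}</ul></body></html>"
print(html_page)
```

Changing the rendering (say, a table instead of a list) touches only the generation step, never the stored XML, which is the maintainability argument usually made for XML-backed sites.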

  13. Artificial Intelligence based technique for BTS placement

    Science.gov (United States)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase in base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique objectively takes neighbour and regulatory considerations into account while determining the cell site. Its application will lead to a quantitatively unbiased, evaluated decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm; the results obtained show a 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a GA with a neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.
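
The GA-with-neighbourhood-constraint idea can be sketched as a toy evolutionary search: choose K of the candidate sites to maximize covered demand points while penalizing pairs of chosen sites that violate a minimum separation. Everything below (area size, counts, radius, separation, penalty weight, and the mutation-only loop) is invented for illustration and is not the authors' algorithm:

```python
import math, random

random.seed(1)

# Demand points and candidate sites in a 2 km x 3 km area.
demand = [(random.uniform(0, 2), random.uniform(0, 3)) for _ in range(60)]
candidates = [(random.uniform(0, 2), random.uniform(0, 3)) for _ in range(12)]
K, RADIUS, MIN_SEP = 3, 0.8, 0.5

def fitness(sites):
    covered = sum(
        any(math.dist(d, candidates[s]) <= RADIUS for s in sites)
        for d in demand
    )
    # Penalize neighbour-constraint violations instead of discarding.
    violations = sum(
        math.dist(candidates[a], candidates[b]) < MIN_SEP
        for i, a in enumerate(sites) for b in sites[i + 1:]
    )
    return covered - 20 * violations

def mutate(sites):
    child = list(sites)
    child[random.randrange(K)] = random.randrange(len(candidates))
    return child if len(set(child)) == K else list(sites)

# Tiny (mu + lambda)-style loop: keep the best, mutate to refill.
pop = [random.sample(range(len(candidates)), K) for _ in range(20)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:5] + [mutate(random.choice(pop[:5])) for _ in range(15)]

best = max(pop, key=fitness)
print(sorted(best), fitness(best))  # best chosen sites and their fitness
```

A production planner would add crossover, real propagation models, and hard regulatory constraints; the sketch only shows how neighbour rules enter the objective.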

  15. Performance Based Novel Techniques for Semantic Web Mining

    Directory of Open Access Journals (Sweden)

    Mahendra Thakur

    2012-01-01

    Full Text Available The explosive growth in the size and use of the World Wide Web continuously creates great new challenges and needs. The need to predict users' preferences in order to expedite and improve browsing through a site can be addressed by personalizing websites. Most research efforts in web personalization correspond to the evolution of extensive research in web usage mining, i.e. the exploitation of the navigational patterns of a web site's visitors. When a personalization system relies solely on usage-based results, however, valuable information conceptually related to what is finally recommended may be missed. Moreover, the structural properties of the web site are often disregarded. In this paper, we propose novel techniques that use content semantics and the structural properties of a web site in order to improve the effectiveness of web personalization. In the first part of our work we present a Semantic Web Personalization system that integrates usage data with content semantics, expressed in ontology terms, in order to compute semantically enhanced navigational patterns and effectively generate useful recommendations. To the best of our knowledge, our proposed technique is the only semantic web personalization system that may be used by non-semantic web sites. In the second part of our work, we present a novel approach for enhancing the quality of recommendations based on the underlying structure of a web site. We introduce UPR (Usage-based PageRank), a PageRank-style algorithm that relies on the recorded usage data and link analysis techniques. Overall, we demonstrate that our proposed hybrid personalization framework results in more objective and representative predictions than existing techniques.
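
The UPR idea, a PageRank-style score driven by recorded usage, can be sketched as a weighted PageRank in which each link's share of a page's rank is proportional to how often visitors actually followed it. The site graph and usage counts below are invented, and this generic weighted PageRank is only an approximation of the authors' UPR:

```python
# Pages and hyperlinks, with weights taken from recorded usage
# (how often visitors followed each link).
links = {
    "home":     {"products": 90, "about": 10},
    "products": {"item": 80, "home": 20},
    "item":     {"home": 50},
    "about":    {"home": 5},
}

def usage_pagerank(links, d=0.85, iters=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            total = sum(outs.values())
            for q, w in outs.items():
                new[q] += d * rank[p] * w / total  # usage-weighted share
        rank = new
    return rank

rank = usage_pagerank(links)
print(max(rank, key=rank.get))  # heavily visited pages rank highest
```

Unweighted PageRank would split "home"'s rank evenly between "products" and "about"; the usage weights concentrate it on the path visitors actually take.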

  16. "YFlag"--a single-base extension primer based method for gender determination.

    Science.gov (United States)

    Allwood, Julia S; Harbison, Sally Ann

    2015-01-01

    Assigning the gender of a DNA contributor in forensic analysis is typically achieved using the amelogenin test. Occasionally, this test produces false-positive results due to deletions occurring on the Y chromosome. Here, a four-marker "YFlag" method is presented to infer gender using single-base extension primers to flag the presence (or absence) of Y-chromosome DNA within a sample to supplement forensic STR profiling. This method offers built-in redundancy, with a single marker being sufficient to detect the presence of male DNA. In a study using 30 male and 30 female individuals, detection of male DNA was achieved with c. 0.03 ng of male DNA. All four markers were present in male/female mixture samples despite the presence of excessive female DNA. In summary, the YFlag system offers a method that is reproducible, specific, and sensitive, making it suitable for forensic use to detect male DNA. PMID:25354446

  17. The Knowledge Base as an Extension of Distance Learning Reference Service

    Science.gov (United States)

    Casey, Anne Marie

    2012-01-01

    This study explores knowledge bases as extension of reference services for distance learners. Through a survey and follow-up interviews with distance learning librarians, this paper discusses their interest in creating and maintaining a knowledge base as a resource for reference services to distance learners. It also investigates their perceptions…

  18. Type extension trees

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We introduce type extension trees as a formal representation language for complex combinatorial features of relational data. Based on a very simple syntax, this language provides a unified framework for expressing features as diverse as embedded subgraphs on the one hand, and marginal counts of attribute values on the other. We show by various examples how many existing relational data mining techniques can be expressed as the problem of constructing a type extension tree and a discriminant function.

  19. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim

    2015-05-21

    Base oils, blended for finished lubricant formulations, are classified by the American Petroleum Institute into five groups, viz., groups I-V. Groups I-III consist of petroleum-based hydrocarbons, whereas groups IV and V are made of synthetic polymers. In the present study, five base oil samples belonging to groups I and III were extensively characterized using high performance liquid chromatography (HPLC), comprehensive two-dimensional gas chromatography (GC×GC), and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated, and then the information obtained was combined to reveal compositional details of the base oil samples studied. HPLC showed the overwhelming predominance of saturated over aromatic compounds in all five base oils. A similar trend was further corroborated using GC×GC, which yielded semiquantitative information on the compound classes present in the samples and provided further details on the carbon number distributions within these classes. In addition to the chromatography methods, FT-ICR MS supplemented the compositional information on the base oil samples by resolving the aromatic compounds into alkyl- and naphtheno-substituted families. APCI proved more effective than APPI for the ionization of the highly saturated base oil components. Beyond the detailed information on hydrocarbon molecules, FT-ICR MS also revealed the presence of saturated and aromatic sulfur species in all base oil samples. The results presented herein offer a unique perspective into the detailed molecular structure of base oils typically used to formulate lubricants. © 2015 American Chemical Society.

  20. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    Science.gov (United States)

    Vallianatos, Filippos

    2015-04-01

    Despite the extreme complexity that characterizes the earthquake generation process, simple phenomenology seems to apply to the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short- and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity, providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how it can be derived from first principles, one may wonder how the collective properties of the set formed by all earthquakes in a given region can be derived, and how the structure of seismicity depends on its elementary constituents - the earthquakes. What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, making the use of statistical physics necessary to understand the collective properties of earthquakes. Then a natural question arises: what type of statistical physics is appropriate to describe effects ranging from the microscale and crack-opening level to the level of large earthquakes? An answer could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi)fractal distributions of their elements, where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon, introducing the new topic of non-extensive statistical seismology. 
This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project
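
The central object of the non-extensive formalism reviewed above is the Tsallis entropy, S_q = (1 - Σ_i p_i^q)/(q - 1) (with Boltzmann's constant set to 1), which recovers the Boltzmann-Gibbs-Shannon entropy as q → 1. A minimal numerical check with an invented probability vector:

```python
import math

def tsallis_entropy(p, q):
    """S_q = (1 - sum p_i^q) / (q - 1), with k_B set to 1.
    As q -> 1 this recovers the Boltzmann-Gibbs-Shannon entropy."""
    if q == 1.0:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
shannon = tsallis_entropy(p, 1.0)
near_one = tsallis_entropy(p, 1.0001)
print(shannon)                         # Boltzmann-Gibbs-Shannon value
print(abs(shannon - near_one) < 1e-3)  # True: S_q -> S as q -> 1
```

For q ≠ 1 the entropy is non-additive for independent systems, which is exactly the property exploited when long-range interactions dominate, as in seismicity.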

  1. A New Extension Theory-based Production Operation Method in Industrial Process

    Institute of Scientific and Technical Information of China (English)

    XU Yuan; ZHU Qunxiong

    2013-01-01

    To explore the problems of dynamic change in production demand and operating contradiction in production process,a new extension theory-based production operation method is proposed.The core is the demand requisition,contradiction resolution and operation classification.For the demand requisition,the deep and comprehensive demand elements are collected by the conjugating analysis.For the contradiction resolution,the conflict between the demand and operating elements are solved by the extension reasoning,extension transformation and consistency judgment.For the operating classification,the operating importance among the operating elements is calculated by the extension clustering so as to guide the production operation and ensure the production safety.Through the actual application in the cascade reaction process of high-density polyethylene (HDPE) of a chemicalplant,cases study and comparison show that the proposed extension theory-based production operation method is significantly better than the traditional experience-based operation method in actual production process,which exploits a new way to the research on the production operating methods for industrial process.

  2. A Comparative Study of Three Vibration Based Damage Assessment Techniques

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Rytter, A.

    Three different vibration-based damage assessment techniques have been compared. The first technique uses the ratios between changes in experimentally and theoretically estimated natural frequencies, respectively, to locate a damage. The second technique relies on updating an FEM based on experimentally estimated natural frequencies, where the stiffness matrix is given as a function of damage size and location. The last technique is based on neural networks trained with the relative changes in natural frequencies. It has been found that all the techniques seem to be useful. In particular, the neural network based technique seems very promising.
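
The first technique, locating damage from ratios of measured to predicted natural-frequency changes, can be sketched as follows. All frequency-change numbers below are invented; the idea is that at the true damage location the measured/predicted ratios across modes are nearly constant:

```python
# Predicted relative natural-frequency changes (modes 1-3) for damage
# at each candidate location, e.g. from a finite element model.
predicted = {
    "mid-span": [0.08, 0.01, 0.05],
    "support":  [0.02, 0.07, 0.03],
}
measured = [0.078, 0.012, 0.047]

def locate(measured, predicted):
    """Pick the location whose predicted change *pattern* best matches
    the measurement: the per-mode ratios measured/predicted should be
    roughly constant at the true damage site."""
    def spread(loc):
        ratios = [m / p for m, p in zip(measured, predicted[loc])]
        mean = sum(ratios) / len(ratios)
        return sum((r - mean) ** 2 for r in ratios)
    return min(predicted, key=spread)

print(locate(measured, predicted))  # 'mid-span'
```

Because only ratios matter, the damage *severity* cancels out, which is why this family of methods can localize damage without first estimating its size.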

  3. The community-based Health Extension Program significantly improved contraceptive utilization in West Gojjam Zone, Ethiopia

    Directory of Open Access Journals (Sweden)

    Yitayal M

    2014-05-01

    Full Text Available Mezgebu Yitayal (University of Gondar, Gondar, Ethiopia), Yemane Berhane (Addis Continental Institute of Public Health, Addis Ababa, Ethiopia), Alemayehu Worku (Addis Ababa University, Addis Ababa, Ethiopia), Yigzaw Kebede (University of Gondar, Gondar, Ethiopia). Background: Ethiopia has implemented a nationwide grassroots-level primary health program (known as the Health Extension Program) since 2003 to increase public access to basic health services. This study was conducted to assess whether households that fully implemented the Health Extension Program have improved current contraceptive use. Methods: A cross-sectional community-based survey was conducted to collect data from 1,320 mothers using a structured questionnaire. A multivariate logistic regression was used to identify the predictors of current contraceptive utilization, and a propensity score analysis was used to determine the contribution of the Health Extension Program "model households" to current contraceptive utilization. Results: Mothers from households which fully benefited from the Health Extension Program ("model households") were 3.97 (adjusted odds ratio, 3.97; 95% confidence interval, 3.01-5.23) times more likely to use contraceptives compared with mothers from non-model households. Model household status contributed to 29.3% (t=7.08) of the increase in current contraceptive utilization. Conclusion: The Health Extension Program, when implemented fully, could help to increase the utilization of contraceptives in the rural community and improve family planning. Keywords: Health Extension Program, current contraceptive utilization

  4. FLOWCER - a flowmeter based on radiotracer techniques

    International Nuclear Information System (INIS)

    One of the most difficult problems in the field of flow measurement is the lack of a portable, clamp-on type of flowmeter of good accuracy. This is a serious restriction in non-continuous flow measurements and on-site calibrations of flow meters. One possibility for constructing a meter capable of these measurements is to use tracer techniques, particularly radioisotope tracers. A flow measurement instrument, FLOWCER, has been developed in the Reactor Laboratory of the Technical Research Centre of Finland (VTT). The instrument is based on the radioisotope transit time method. The device can be used for the accurate instantaneous measurement of volume flow rate in ducts. The tracer used is 137mBa produced in a portable isotope generator. Because of the short half-life (2.6 min) of 137mBa, the measurement is radiologically very safe. The device consists of the isotope generator, an injection device for the tracer, radiation detectors, a data logger unit and a micro-computer. Transducers for quantities other than flow may also be connected to the analog input channels of the FLOWCER, and the measurement program can be modified for different types of measurement. The FLOWCER has been used for measurements of energy and material balances, for on-site calibrations of flow meters and for pump efficiency analysis. The most frequent application has been the on-site calibration of flow meters. According to present experience (over 100 calibrated flow meters), the accuracy of flow measurements can be increased by a factor of ten or more by using the transit time method for on-site calibration.

  5. Phase difference estimation method based on data extension and Hilbert transform

    International Nuclear Information System (INIS)

    To improve the precision and anti-interference performance of phase difference estimation for non-integer periods of sampled signals, a phase difference estimation method based on data extension and the Hilbert transform is proposed. The estimated phase difference is obtained by means of data extension, Hilbert transform, cross-correlation, auto-correlation, and weighted phase averaging. Theoretical analysis shows that the proposed method effectively suppresses the end effects of the Hilbert transform. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of phase difference estimation and performs better than the correlation, Hilbert transform, and data extension-based correlation methods, which contributes to improving the measurement precision of the Coriolis mass flowmeter. (paper)
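A minimal sketch of the Hilbert-transform core of such an estimator is shown below. It uses simple end trimming instead of the paper's data-extension scheme to sidestep end effects, and all signal parameters are illustrative:

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0            # sampling rate (Hz), illustrative
f0 = 50.0              # common signal frequency (Hz)
true_phase = 0.6       # radians; phase lead of y over x
t = np.arange(0, 0.4, 1 / fs)

x = np.sin(2 * np.pi * f0 * t)
y = np.sin(2 * np.pi * f0 * t + true_phase)

def phase_difference(x, y, trim=50):
    """Estimate the phase difference between two equal-frequency signals
    from their analytic representations. `trim` discards samples at both
    ends, a crude guard against Hilbert-transform end effects (the paper
    instead extends the data before transforming)."""
    ax, ay = hilbert(x), hilbert(y)
    dphi = np.angle(ay * np.conj(ax))        # instantaneous phase difference
    return float(np.mean(dphi[trim:-trim]))  # average over the interior

est = phase_difference(x, y)   # close to true_phase = 0.6 rad
```

In a Coriolis flowmeter this phase difference between the two pickoff signals is proportional to the mass flow rate, which is why its estimation precision matters.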

  6. Light based techniques for improving health care: studies at RRCAT

    International Nuclear Information System (INIS)

    The invention of lasers in 1960, the phenomenal advances in photonics, and the information processing capability of computers have given a major boost to R and D activity on the use of light for high-resolution biomedical imaging, sensitive non-invasive diagnosis, and precision therapy. The effort has resulted in remarkable progress, and it is widely believed that light based techniques hold great potential to offer simpler, portable systems which can help provide diagnostics and therapy in low-resource settings. At Raja Ramanna Centre for Advanced Technology (RRCAT), extensive studies have been carried out on fluorescence spectroscopy of native tissue. This work led to two important outcomes: first, a better understanding of tissue fluorescence and insights into the possible use of fluorescence spectroscopy for screening of cancer; and second, the development of diagnostic systems that can serve as standalone tools for non-invasive screening of cancer of the oral cavity. Optical coherence tomography setups and their functional extensions (polarization sensitive, Doppler) have also been developed and used for high-resolution (∼10 µm) biomedical imaging applications, in particular for non-invasive monitoring of the healing of wounds. Chlorophyll based photo-sensitisers and their derivatives have been synthesized in-house and used for photodynamic therapy of tumors in animal models and for antimicrobial applications. Various variants of optical tweezers (holographic, Raman etc.) have also been developed and utilised for different applications, notably Raman spectroscopy of optically trapped red blood cells. An overview of these activities carried out at RRCAT is presented in this article. (author)

  7. DCT-based cyber defense techniques

    Science.gov (United States)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content in order to attack the end user. Most attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
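The general DCT-domain thresholding idea can be illustrated on a toy 8×8 block: small-magnitude coefficients, where fragile hidden payloads tend to live, are zeroed before reconstruction. The threshold value and the choice to always keep the DC coefficient are assumptions for this sketch, not the paper's exact "smart threshold" rule:

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
# Stand-in 8x8 image block with pixel values in [0, 255].
block = rng.integers(0, 256, size=(8, 8)).astype(float)

def threshold_block(block, thr=4.0):
    """Zero small-magnitude DCT coefficients of an 8x8 block (except DC)
    and reconstruct. Zeroed coefficients carry little perceptual energy,
    so the visual impact stays small while embedded low-energy payloads
    are disrupted."""
    c = dctn(block, norm='ortho')
    mask = np.abs(c) >= thr
    mask[0, 0] = True               # always keep the DC (mean) coefficient
    return idctn(c * mask, norm='ortho')

cleaned = threshold_block(block)
err = float(np.max(np.abs(cleaned - block)))   # small perceptual change
```

Because each zeroed orthonormal DCT coefficient has magnitude below the threshold, the per-pixel reconstruction error is bounded, which is what lets a defense like this trade a little PSNR for payload disruption.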

  8. Strategic Partnerships that Strengthen Extension's Community-Based Entrepreneurship Programs: An Example from Maine

    Science.gov (United States)

    Bassano, Louis V.; McConnon, James C., Jr.

    2011-01-01

    This article explains how Extension can enhance and expand its nationwide community-based entrepreneurship programs by developing strategic partnerships with other organizations to create highly effective educational programs for rural entrepreneurs. The activities and impacts of the Down East Micro-Enterprise Network (DEMN), an alliance of three…

  9. Flood alert system based on bayesian techniques

    Science.gov (United States)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium-sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects on the city of Malaga (600,000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision-making process. The joint use of Artificial Neural Networks (ANN) and BN has served to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit, but additionally contribute uncertainty information to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also

  10. Study and Application of Case-based Extension Fault Diagnosis for Chemical Process

    Institute of Scientific and Technical Information of China (English)

    PENG Di; XU Yuan; ZHU Qunxiong

    2013-01-01

    In chemical processes, fault diagnosis is relatively difficult due to incomplete prior knowledge and unpredictable production changes. To solve this problem, a case-based extension fault diagnosis (CEFD) method is proposed that combines case-based reasoning with extension theory, in which the basic-element model is used for unified and deep fault description, the distance concept is applied to quantify the correlation degree between a new fault and the original fault cases, and the extension transformation is used to expand and obtain the solution of unknown faults. Application to the Tennessee Eastman process indicates that the CEFD method has flexible fault representation, objective fault retrieval performance, and a good fault-learning ability, providing a new way to diagnose production faults accurately.

  11. Chaotic Extension Neural Network-Based Fault Diagnosis Method for Solar Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Kuo-Nan Yu

    2014-01-01

    Full Text Available At present, solar photovoltaic systems are extensively used. However, once a fault occurs, they are inspected manually, which is not economical. To remedy a limitation of the chaos synchronization based intelligent fault diagnosis for photovoltaic systems proposed by Hsieh et al., namely that fault diagnosis is not available at arbitrary irradiance and temperature, this study proposes a chaotic extension fault diagnosis method combined with an error back-propagation neural network. The neural network toolbox of MATLAB 2010 was used for simulation and comparison; the method measured the current irradiance and temperature and used maximum power point tracking (MPPT) for chaotic extraction of eigenvalues. The range of the extension field was determined by the neural network. Finally, the voltage eigenvalue obtained from the current temperature and irradiance was used for the fault diagnosis. Compared with the diagnostic rates reported by Hsieh et al., this scheme obtains better diagnostic rates when the irradiance or temperature changes.

  12. Risk-Based Allowed Outage Time and Surveillance Test Interval Extensions for Angra 1

    Directory of Open Access Journals (Sweden)

    Sonia M. Orlando Gibelli

    2012-01-01

    Full Text Available In this work, Probabilistic Safety Assessment (PSA) is used to evaluate Allowed Outage Time (AOT) and Surveillance Test Interval (STI) extensions for three Angra 1 nuclear power plant safety systems. The interest in such an analysis lies in the fact that PSA comprises a risk-based tool for safety evaluation and has been increasingly applied to support both the regulatory and the operational decision-making processes. Regarding Angra 1, among other applications, PSA is meant to be an additional method that can be used by the utility to justify Technical Specification relaxation to the Brazilian regulatory body. The risk measure used in this work is the Core Damage Frequency, obtained from the Angra 1 Level 1 PSA study. AOT and STI extensions are evaluated for the Safety Injection, Service Water and Auxiliary Feedwater Systems using the SAPHIRE code. In order to compensate for the risk increase caused by the extensions, compensatory measures such as (1) testing of the redundant train prior to entering maintenance and (2) a staggered test strategy are proposed. Results have shown that the proposed AOT extensions are acceptable for two of the systems with the implementation of compensatory measures, whereas STI extensions are acceptable for all three systems.
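The single-outage risk metric commonly used in such AOT evaluations is the incremental conditional core damage probability, ICCDP = (conditional CDF with the component out − baseline CDF) × outage duration. A sketch with illustrative numbers (not Angra 1 values):

```python
# Risk accrued by one outage of a safety train, per the standard PSA
# measure ICCDP. All frequencies below are illustrative placeholders.

HOURS_PER_YEAR = 8760.0

def iccdp(cdf_baseline, cdf_component_out, aot_hours):
    """Incremental conditional core damage probability for a single
    outage: the CDF increase while the component is out, integrated
    over the outage duration (converted from hours to years)."""
    return (cdf_component_out - cdf_baseline) * aot_hours / HOURS_PER_YEAR

baseline = 5.0e-5     # core damage frequency, per year (illustrative)
train_out = 2.0e-4    # conditional CDF with one safety train unavailable

risk_72h = iccdp(baseline, train_out, 72.0)    # e.g. a current AOT
risk_7d = iccdp(baseline, train_out, 168.0)    # e.g. a proposed extension
```

The compensatory measures mentioned above work by lowering the conditional CDF term (e.g. by confirming the redundant train is operable first), which offsets the longer duration term in this product.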

  13. Segmentation of Color Images Based on Different Segmentation Techniques

    OpenAIRE

    Purnashti Bhosale; Aniket Gokhale

    2013-01-01

    In this paper, we propose a color image segmentation algorithm based on different segmentation techniques. We recognize background objects such as the sky, ground, and trees based on color and texture information using various methods of segmentation. Segmentation techniques using different threshold methods, such as global and local techniques, are studied and compared with one another so as to choose the best technique for threshold segmentation. Further segmentation...
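A global threshold of the kind compared in such studies can be computed with Otsu's method, which picks the gray level maximizing the between-class variance. A self-contained sketch on a synthetic bimodal image (the image itself is a stand-in):

```python
import numpy as np

def otsu_threshold(gray):
    """Global (Otsu) threshold: choose the gray level that maximizes the
    between-class variance of the foreground/background split."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()   # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0   # class means
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2        # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Two well-separated intensity populations -> threshold lands between them.
img = np.concatenate([np.full(500, 40), np.full(500, 200)]).astype(np.uint8)
t = otsu_threshold(img)
```

A local technique would instead compute such a threshold per window, which handles uneven illumination at extra cost; that trade-off is exactly what global-vs-local comparisons examine.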

  14. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    Science.gov (United States)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
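The lexical-approximation step can be sketched as a nearest-neighbour search over the training vocabulary by edit distance; the vocabulary, the distance cap, and the helper names here are illustrative, not the paper's exact procedure:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via a single-row dynamic program."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            # new dp[j] = min(delete, insert, substitute/match)
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                     prev + (ca != cb))
    return dp[-1]

def approximate(oov, vocabulary, max_dist=2):
    """Replace an OOV word with its closest spelling variant seen in the
    training data, if any variant is within `max_dist` edits; otherwise
    leave the OOV word untouched."""
    best = min(vocabulary, key=lambda w: edit_distance(oov, w))
    return best if edit_distance(oov, best) <= max_dist else oov

vocab = {"colour", "translate", "translation", "system"}
word = approximate("translaton", vocab)   # missing 'i' -> "translation"
```

Substituting the variant before decoding lets the SMT system reuse the variant's existing phrase-table entries instead of passing the OOV word through untranslated.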

  15. Combined surgical and catheter-based treatment of extensive thoracic aortic aneurysm and aortic valve stenosis

    DEFF Research Database (Denmark)

    De Backer, Ole; Lönn, Lars; Søndergaard, Lars

    2015-01-01

    An extensive thoracic aortic aneurysm (TAA) is a potentially life-threatening condition and remains a technical challenge to surgeons. Over the past decade, repair of aortic arch aneurysms has been accomplished using both hybrid (open and endovascular) and totally endovascular techniques. Thoracic endovascular aneurysm repair (TEVAR) has changed and extended management options in thoracic aorta disease, including in those patients deemed unfit or unsuitable for open surgery. Accordingly, transcatheter aortic valve replacement (TAVR) is increasingly used to treat patients with symptomatic severe aortic valve stenosis...

  16. Downward Price-Based Brand Line Extensions Effects on Luxury Brands

    Directory of Open Access Journals (Sweden)

    Marcelo Royo-Vela

    2015-07-01

    Full Text Available This study examines brand concept consistency, self-concept congruence, and the resulting loyalty status of consumers in order to evaluate whether a downward price-based line extension in the luxury goods market has any negative or positive effect on them. Focus groups and in-depth interviews were conducted to filter out how brand concepts of luxury brands are perceived before and after a line extension. Results revealed that a crucial aspect for the evaluation of downward price-based line extensions is the exclusivity variable. Additionally, the research showed different modifications to brand concept consistency after an extension, depending on whether the brand is bought for purely hedonic or emotional reasons or for functional reasons. As a practical implication, brands appealing to hedonic/emotional motivations need to be crucially differentiated from those appealing to functional/rational motivations. In the case of a mixed concept, an in-depth segmentation of the target markets is needed in order to successfully reach consumers’ needs.

  17. Autofluorescence based diagnostic techniques for oral cancer

    OpenAIRE

    Balasubramaniam, A. Murali; Sriraman, Rajkumari; Sindhuja, P; Mohideen, Khadijah; Parameswar, R. Arjun; Muhamed Haris, K. T.

    2015-01-01

    Oral cancer is one of the most common cancers worldwide. Despite various advancements in treatment modalities, oral cancer mortality is high, particularly in developing countries like India. This is mainly due to delay in the diagnosis of oral cancer. Delay in diagnosis greatly reduces the prognosis of treatment and also causes increased morbidity and mortality rates. Early diagnosis plays a key role in the effective management of oral cancer. A rapid diagnostic technique can greatly aid...

  18. Association of Anterior and Lateral Extraprostatic Extensions with Base-Positive Resection Margins in Prostate Cancer

    Science.gov (United States)

    Abalajon, Mark Joseph; Jang, Won Sik; Kwon, Jong Kyou; Yoon, Cheol Yong; Lee, Joo Yong; Cho, Kang Su; Ham, Won Sik

    2016-01-01

    Introduction Positive surgical margins (PSM) detected in the radical prostatectomy specimen increase the risk of biochemical recurrence (BCR). Still, with a formidable number of patients never experiencing BCR in their lives, this inconsistency has been attributed to artifacts and to spontaneous regression of micrometastatic sites. To investigate the origin of margin-positive cancers, we have looked into the influence of extraprostatic extension location on the resection-margin-positive site and its implications for BCR risk. Materials & Methods The clinical information and follow-up data of 612 patients who had extraprostatic extension and a positive surgical margin at the time of robot-assisted radical prostatectomy (RARP) in a single center between 2005 and 2014 were modeled using Fine and Gray’s competing risk regression analysis for BCR. Extraprostatic extensions were divided into categories according to location as apex, base, anterior, posterior, lateral, and posterolateral. Extraprostatic extensions were defined as the presence of tumor beyond the borders of the gland in the posterior and posterolateral regions. Tumor admixed with periprostatic fat was additionally considered as extraprostatic extension if the capsule was vague in the anterior, apex, and base regions. Positive surgical margins were defined as the presence of tumor cells at the inked margin on inspection under microscopy. Association of these classifications with the site of PSM was evaluated by Cohen’s kappa analysis for concordance and logistic regression for the odds of apical and base PSMs. Results Median follow-up duration was 36.5 months (interquartile range [IQR] 20.1–36.5). Apex involvement was found in 158 (25.8%) patients and base involvement in 110 (18.0%) patients. PSMs were generally found to be associated with an increased risk of BCR regardless of location, with BCR risk highest for base PSM (HR 1.94, 95% CI 1.40–2.68, p<0.001) after adjusting for age, initial

  19. Physics based modeling of a series parallel battery pack for asymmetry analysis, predictive control and life extension

    Science.gov (United States)

    Ganesan, Nandhini; Basu, Suman; Hariharan, Krishnan S.; Kolake, Subramanya Mayya; Song, Taewon; Yeo, Taejung; Sohn, Dong Kee; Doo, Seokgwang

    2016-08-01

    Lithium-Ion batteries used for electric vehicle applications are subject to large currents and various operation conditions, making battery pack design and life extension a challenging problem. With increase in complexity, modeling and simulation can lead to insights that ensure optimal performance and life extension. In this manuscript, an electrochemical-thermal (ECT) coupled model for a 6 series × 5 parallel pack is developed for Li ion cells with NCA/C electrodes and validated against experimental data. Contribution of the cathode to overall degradation at various operating conditions is assessed. Pack asymmetry is analyzed from a design and an operational perspective. Design based asymmetry leads to a new approach of obtaining the individual cell responses of the pack from an average ECT output. Operational asymmetry is demonstrated in terms of effects of thermal gradients on cycle life, and an efficient model predictive control technique is developed. Concept of reconfigurable battery pack is studied using detailed simulations that can be used for effective monitoring and extension of battery pack life.

  20. Path Based Mapping Technique for Robots

    Directory of Open Access Journals (Sweden)

    Amiraj Dhawan

    2013-05-01

    Full Text Available The purpose of this paper is to explore a new way of autonomous mapping. Current systems using perception techniques like LASER or SONAR use probabilistic methods and have the drawback of allowing considerable uncertainty in the mapping process. Our approach is to break down the environment, specifically indoor environments, into reachable areas and objects, separated by boundaries, and to identify their shapes in order to render various navigable paths around them. This is a novel method that does away with uncertainties, as far as possible, at the cost of temporal efficiency. The system also demands only minimal and cheap hardware, as it relies only on infrared sensors to do the job.

  1. Using Satellite Based Techniques to Combine Volcanic Ash Detection Methods

    Science.gov (United States)

    Hendrickson, B. T.; Kessinger, C.; Herzegh, P.; Blackburn, G.; Cowie, J.; Williams, E.

    2006-12-01

    Volcanic ash poses a serious threat to aircraft avionics due to the corrosive nature of the silicate particles. Aircraft encounters with ash have resulted in millions of dollars in damage and loss of power to aircraft engines. Accurate detection of volcanic ash for the purpose of avoiding these hazardous areas is of the utmost importance to ensure aviation safety as well as to minimize economic loss. Satellite-based detection of volcanic ash has been used extensively to warn the aviation community of its presence through the use of multi-band detection algorithms. However, these algorithms are generally used individually rather than in combination and require the intervention of a human analyst. Automation of the detection and warning of the presence of volcanic ash for the aviation community is a long term goal of the Federal Aviation Administration Oceanic Weather Product Development Team. We are exploring the use of data fusion techniques within a fuzzy logic framework to perform a weighted combination of several multi-band detection algorithms. Our purpose is to improve the overall performance of volcanic ash detection and to test whether automation is feasible. Our initial focus is on deep, stratospheric eruptions.
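A weighted combination of detection-algorithm outputs inside a fuzzy-logic framework might look like the sketch below; the algorithm names, weights, and decision threshold are hypothetical, not the team's actual configuration:

```python
def fuzzy_ash_score(scores, weights):
    """Weighted, fuzzy-logic style combination of per-algorithm ash
    'interest' memberships in [0, 1]. Each detector contributes in
    proportion to its assigned confidence weight."""
    num = sum(weights[k] * scores[k] for k in scores)
    den = sum(weights[k] for k in scores)
    return num / den

# Hypothetical memberships from three multi-band detection algorithms
# for one satellite pixel, with hypothetical per-algorithm weights.
scores = {"split_window": 0.9, "three_band": 0.7, "visible_ratio": 0.4}
weights = {"split_window": 0.5, "three_band": 0.3, "visible_ratio": 0.2}

combined = fuzzy_ash_score(scores, weights)   # 0.74
is_ash = combined >= 0.6    # crisp decision after combining memberships
```

Combining soft memberships rather than hard per-algorithm flags is what lets an automated system tolerate a single detector's failure mode, which individual-algorithm use with a human analyst currently handles manually.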

  2. Risk-Based Allowed Outage Time and Surveillance Test Interval Extensions for Angra 1

    OpenAIRE

    Orlando Gibelli, Sonia M.; e Melo, P. F. Frutuoso; Bogado Leite, Sérgio Q.

    2012-01-01

    In this work, Probabilistic Safety Assessment (PSA) is used to evaluate Allowed Outage Times (AOT) and Surveillance Test Intervals (STI) extensions for three Angra 1 nuclear power plant safety systems. The interest in such an analysis lies on the fact that PSA comprises a risk-based tool for safety evaluation and has been increasingly applied to support both the regulatory and the operational decision-making processes. Regarding Angra 1, among other applications, PSA is meant to be an additio...

  3. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    OpenAIRE

    Taverner, Tom; Karpievitch, Yuliya V.; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-01-01

    Motivation: The size and complex nature of mass spectrometry-based proteomics datasets motivate development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing func...

  4. Comparison of Vibration-Based Damage Assessment Techniques

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Rytter, A.

    1995-01-01

    Three different vibration-based damage assessment techniques have been compared. One of the techniques uses the ratios between changes in experimentally and theoretically estimated natural frequencies, respectively, to locate a damage. The second technique relies on updating of a finite element...

  5. Service-Based Extensions to an OAIS Archive for Science Data Management

    Science.gov (United States)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes for Health and the National Science Foundation, architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released their first version of a Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  6. 加快推进基层农业技术推广体系改革的对策及建议%Countermeasures to Push on Reform of Grass-roots Agricultural Technique Extension System

    Institute of Scientific and Technical Information of China (English)

    杨忠娜; 陈曦; 张淑云; 陶佩君

    2009-01-01

    The grass-roots agricultural technique extension system, based at the county and township levels, is an organization that provides farmers with scientific achievements and practical technical services in farming, animal husbandry, fishery, forestry, agricultural machinery, water conservancy, and other fields, and is an important carrier for implementing the strategy of agricultural revitalization through science and education. Taking the current status of the national grass-roots agricultural extension system as a starting point, the paper analyzes the bottleneck problems constraining the construction of grass-roots agricultural technique extension organizations in China and proposes to push forward the reform through horizontal connection and vertical improvement of the grass-roots extension system.%基层农业技术推广体系设立在县、乡两级,是为农民提供种植业、畜牧业、渔业、林业、农业机械、水利等科研成果和实用技术服务的组织,是实施科教兴农战略的重要载体.以全国基层农业推广体系建设现状为切入点,分析了制约基层农业技术推广机构建设过程中存在的瓶颈问题,提出通过对基层农村推广体系的横向衔接和纵向完善来推进改革.

  7. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

    We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark is also different from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are considered as classes, and we use a classification algorithm to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.

  8. Work-based learning and role extension: A match made in heaven?

    Energy Technology Data Exchange (ETDEWEB)

    Eddy, Angela [Robert Winston Building, 11-15 Collegiate Crescent Campus, Faculty of Health and Well Being, Sheffield Hallam University, Sheffield S10 2BP (United Kingdom)], E-mail: a.eddy@shu.ac.uk

    2010-05-15

    This paper presents and discusses the findings from an exploratory study that examined a cohort of postgraduate therapeutic radiographer students' experiences of undertaking work-based learning to support role extension. The findings showed that three themes emerged which impacted on individual experiences: organisational issues, role and practice issues related to competence development, and the individual's background and experience. The conclusions are that new models must emerge, and be evaluated, to offer appropriate support to those individuals who demonstrate the skills and ability to progress to advanced and consultant levels. Departments need to consider how they can effectively introduce and support role extension, giving specific consideration to study time and the number of higher-level practitioners in training, as well as how to offer effective clinical supervision. Collaboration between higher education institutes and departments should enable the development of tripartite agreements to facilitate effective support for learners.

  9. An Extensible Dialogue Script for a Robot Based on Unification of State-Transition Models

    Directory of Open Access Journals (Sweden)

    Yosuke Matsusaka

    2010-01-01

    development of the communication function of the robot. Compared to the extension-by-connection method used in previous behavior-based communication robot developments, the extension-by-unification method has the ability to decompose the script into components. The decomposed components can be recomposed to build a new application easily. In this paper, we first explain a reformulation we have applied to the conventional state-transition model. Second, we explain a set of algorithms to decompose and recompose components and to detect conflicts between them. Third, we explain a dialogue engine and a script management server we have developed. The script management server can propose reusable components to the developer in real time by implementing the conflict detection algorithm. The dialogue engine SEAT (Speech Event-Action Translator) has a flexible adapter mechanism to enable quick integration into robotic systems. Through application to three robots, we have confirmed that development efficiency improved by 30%.

  10. Extension Activity Support System (EASY: A Web-Based Prototype for Facilitating Farm Management

    Directory of Open Access Journals (Sweden)

    Christopher Pettit

    2012-01-01

    Full Text Available In response to disparate advances in delivering spatial information to support agricultural extension activities, the Extension Activity Support System (EASY) project was established to develop a vision statement and conceptual design for such a system based on a national needs assessment. Personnel from across Australia were consulted and a review of existing farm information/management software undertaken to ensure that any system eventually produced from the EASY vision will build on the strengths of existing efforts. This paper reports on the collaborative consultative process undertaken to create the EASY vision as well as the conceptual technical design and business models that could support a fully functional spatially enabled online system.

  11. Enhanced mechanical performance of biocompatible hemicelluloses-based hydrogel via chain extension.

    Science.gov (United States)

    Qi, Xian-Ming; Chen, Ge-Gu; Gong, Xiao-Dong; Fu, Gen-Que; Niu, Ya-Shuai; Bian, Jing; Peng, Feng; Sun, Run-Cang

    2016-01-01

    Hemicelluloses are widely used to prepare gel materials because of their renewability, biodegradability, and biocompatibility. Here, molecular chain extension of hemicelluloses was obtained in a two-step process. Composite hydrogels were prepared via free radical graft copolymerization of crosslinked quaternized hemicelluloses (CQH) and acrylic acid (AA) in the presence of crosslinking agent N,N'-methylenebisacrylamide (MBA). This chain extension strategy significantly improved the mechanical performance of the resulting hydrogels. The crosslinking density, compression modulus, and swelling capacities of hydrogels were tuned by changing the AA/CQH and MBA/CQH contents. Moreover, the biocompatibility test suggests that the hemicelluloses-based hydrogels exhibited no toxicity to cells and allowed cell growth. Taken together, these properties demonstrated that the composite hydrogels have potential applications in the fields of water absorbents, cell culture, and other functional biomaterials. PMID:27634095

  13. A repository based on a dynamically extensible data model supporting multidisciplinary research in neuroscience

    Directory of Open Access Journals (Sweden)

    Corradi Luca

    2012-10-01

    Background: Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is an increasing request from the neuroscience community for software tools addressing technical challenges such as: (i) supporting researchers in the medical field to carry out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. Methods: A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility from two points of view. First, data flexibility has been improved through a methodology for the dynamic creation and use of data types and related metadata, based on the definition of a "meta" data model. This way, users are not constrained to a set of predefined data types, and the model is easily extensible and applicable to different contexts. Second, users can easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed of two generic main objects: events and processes. A repository has then been built based on this data model and structure, and deployed on distributed resources through a Grid-based approach.
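
The process-event structure described above lends itself to a very small sketch. The class and field names below are hypothetical illustrations, not the repository's actual schema: events carry free-form metadata so new data types can be defined at runtime, and processes chain events to track each acquisition or analysis step.

```python
class Event:
    """A generic acquisition/analysis step carrying free-form metadata."""
    def __init__(self, name, **metadata):
        self.name = name
        self.metadata = dict(metadata)   # dynamically extensible, no fixed schema

class Process:
    """An ordered chain of events, e.g. one experimental procedure."""
    def __init__(self, name):
        self.name = name
        self.events = []

    def add_event(self, event):
        self.events.append(event)
        return self

# Example: an MRI acquisition followed by a custom analysis step.
p = Process("subject-042-session-1")
p.add_event(Event("acquisition", modality="MRI", resolution_mm=1.0))
p.add_event(Event("analysis", tool="custom-pipeline", version="0.3"))

print([e.name for e in p.events])        # → ['acquisition', 'analysis']
print(p.events[0].metadata["modality"])  # → MRI
```

New experiment types are added by instantiating events with new metadata keys rather than migrating a fixed schema, which is the extensibility property the abstract emphasizes.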

  14. Field of view extension and truncation correction for MR-based human attenuation correction in simultaneous MR/PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Blumhagen, Jan O., E-mail: janole.blumhagen@siemens.com; Ladebeck, Ralf; Fenchel, Matthias [Magnetic Resonance, Siemens AG Healthcare Sector, Erlangen 91052 (Germany); Braun, Harald; Quick, Harald H. [Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nürnberg, Erlangen 91052 (Germany); Faul, David [Siemens Medical Solutions, New York, New York 10015 (United States); Scheffler, Klaus [MRC Department, Max Planck Institute for Biological Cybernetics, Tübingen 72076, Germany and Department of Biomedical Magnetic Resonance, University Hospital Tübingen, Tübingen 72076 (Germany)

    2014-02-15

    Purpose: In quantitative PET imaging, it is critical to accurately measure and compensate for the attenuation of the photons absorbed in the tissue. While in PET/CT the linear attenuation coefficients can be easily determined from a low-dose CT-based transmission scan, in whole-body MR/PET the computation of the linear attenuation coefficients is based on the MR data. However, a constraint of MR-based attenuation correction (AC) is the MR-inherent field-of-view (FoV) limitation due to static magnetic field (B₀) inhomogeneities and gradient nonlinearities. Therefore, the MR-based human AC map may be truncated or geometrically distorted toward the edges of the FoV and, consequently, the PET reconstruction with MR-based AC may be biased. The impact is especially pronounced laterally, where the patient's arms rest beside the body and are not fully captured. Methods: A method is proposed to extend the MR FoV by determining an optimal readout gradient field which locally compensates B₀ inhomogeneities and gradient nonlinearities. This technique was used to reduce truncation in AC maps of 12 patients, and the impact on the PET quantification was analyzed and compared to truncated data without applying the FoV extension and additionally to an established approach of PET-based FoV extension. Results: The truncation artifacts in the MR-based AC maps were successfully reduced in all patients, and the mean body volume was thereby increased by 5.4%. In some cases large patient-dependent changes in SUV of up to 30% were observed in individual lesions when compared to the standard truncated attenuation map. Conclusions: The proposed technique successfully extends the MR FoV in MR-based attenuation correction and shows an improvement of PET quantification in whole-body MR/PET hybrid imaging. In comparison to the PET-based completion of the truncated body contour, the proposed method is also applicable to specialized PET tracers with little uptake in the arms and might …

  15. FDI and Accommodation Using NN Based Techniques

    Science.gov (United States)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by a trained backpropagation NN, which is then applied to residual generation. Process supervision is applied to discriminate faults in process sensors and process plant parameters. A rule-based expert system implements the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system which could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA when applied to a heat exchanger.
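
The residual-generation scheme described here can be summarized in a few lines. In this sketch, a plain function stands in for the trained backpropagation NN (training is out of scope), and the one-rule "expert system" is an illustrative stand-in for the paper's rule base; all names and thresholds are invented:

```python
def nn_model(u):
    # Stand-in for the trained NN mapping process input u to the expected output.
    return 2.0 * u + 1.0

def residual(u, y_measured):
    # Residual = measured output - model prediction; near zero when healthy.
    return y_measured - nn_model(u)

def diagnose(r, threshold=0.5):
    # Rule-based decision: a residual beyond the threshold indicates a fault.
    if abs(r) <= threshold:
        return "nominal"
    return "sensor-or-parameter fault"

print(diagnose(residual(1.0, 3.1)))   # → nominal
print(diagnose(residual(1.0, 5.0)))   # → sensor-or-parameter fault
```

In a real system the rule base would also localize the fault (which sensor, which parameter) from the pattern of several residuals, not just one.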

  16. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    Science.gov (United States)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

    In the present work, we propose a frequency-domain interferometric imaging (FII) technique for better knowledge of the vertical distribution of the atmospheric scatterers detected by MST radars. It is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies, with the objective of reducing the ambiguity inherent in the FDI technique, which results from the use of only two adjacent frequencies. Different methods, commonly used in antenna array processing, are first described within the context of application to the FII technique: Fourier-based imaging, Capon's method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems very promising.
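
Capon's method, one of the high-resolution estimators mentioned above, can be sketched for the multi-frequency interferometry geometry: a steering vector over candidate scatterer ranges is tested against the inverse covariance of the signals received on several closely spaced carriers. The radar parameters below are illustrative (loosely MU-radar-like), not taken from the paper:

```python
import numpy as np

M = 5
c = 3e8
freqs = 46.0e6 + 0.5e6 * np.arange(M)          # 5 closely spaced carriers (Hz)
r_true = 150.0                                  # scatterer range (m)

# Simulated single-scatterer snapshot across frequencies (two-way phase 2kr).
x = np.exp(1j * 4 * np.pi * freqs * r_true / c)
R = np.outer(x, x.conj()) + 0.01 * np.eye(M)    # covariance + diagonal loading

Rinv = np.linalg.inv(R)
ranges = np.arange(0.0, 300.0, 1.0)             # one ambiguity interval c/(2*df)
power = []
for r in ranges:
    a = np.exp(1j * 4 * np.pi * freqs * r / c)  # steering vector at range r
    # Capon power estimate: 1 / (a^H R^-1 a), peaks where a matches the data.
    power.append(1.0 / np.real(a.conj() @ Rinv @ a))
power = np.array(power)

print(ranges[np.argmax(power)])                 # → 150.0
```

With only two frequencies (the FDI case) the profile would be highly ambiguous; adding carriers narrows the peak, which is the motivation for FII.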

  17. Life extension techniques for aircraft structures-Extending durability and promoting damage tolerance through bonded crack retarders

    OpenAIRE

    Irving, Phil E.; Zhang, Xiang; Doucet, J; Figueroa-Gordon, Douglas J.; Boscolo, M.; Heinimann, M.; Shepherd, G.; Fitzpatrick, M. E.; D. Liljedahl

    2011-01-01

    This paper explores the viability of the bonded crack retarder concept as a device for life extension of damage tolerant aircraft structures. Fatigue crack growth behaviour in metallic substrates with bonded straps has been determined. SENT and M(T) test coupons and large scale skin-stringer panels were tested at constant and variable amplitude loads. The strap materials were glass fibre polymer composites, GLARE, AA7085 and Ti-6Al-4V. Comprehensive measurements were made of...

  18. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit the required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new Combined Compression Technique that improves the final compression ratio by taking advantage of both. The Instruction Splitting Technique is instruction set architecture (ISA)-independent: it splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding; the average compression ratios achieved are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent: it investigates the benefits of re-encoding unused bits (re-encodable bits) in the instruction format for a specific application. Re-encoding those bits can reduce the size of the decoding tables by up to 40%. Using this technique, we improve the final compression ratios to 46% and 45% for ARM and MIPS, respectively (including all overhead that incurs). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our evaluation, we used a representative set of applications and applied each technique to two major embedded processor architectures.
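
The generic Huffman step underlying these techniques can be sketched in a few lines; the ISA-specific splitting and re-encoding are beyond a toy example, and the "instruction patterns" below are invented:

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Return {symbol: bitstring} for a frequency table."""
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(symbols.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least frequent subtrees, prefixing their codes.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

# Toy "instruction pattern" stream (stand-in for slices of a real binary).
stream = ["mov", "add", "mov", "ldr", "mov", "add", "str", "mov"]
code = huffman_code(Counter(stream))
compressed_bits = sum(len(code[s]) for s in stream)
raw_bits = len(stream) * 2                      # 2 bits/symbol for 4 symbols
print(compressed_bits, raw_bits)                # → 14 16
```

The paper's contribution is not this standard step but shrinking the decoding table that a hardware decompressor must store, by choosing what to feed into the Huffman coder.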

  19. Personnel neutron monitoring based on albedo technique

    International Nuclear Information System (INIS)

    This work deals with the study, design and test of a personal neutron monitor based on the detection of albedo neutrons scattered back from the body, which are then related to the incident flux. By this method, neutrons with energies below about 100 keV can be efficiently detected, providing good information in the region where the biological effectiveness of neutron radiation starts to rise. The system consists of a pair of thermoluminescent detectors (⁶LiF/⁷LiF) inside a polyethylene moderating body, in order to increase the sensitivity. The surface of the dosimeter facing away from the body is covered by a layer of borated resin to assure appropriate shielding against incident low-energy neutrons. The response of the dosimeter to monoenergetic neutrons from a 3 MeV Van de Graaff accelerator, to Am-Be neutrons and to neutrons from a thermal column was investigated. The directional sensitivity and the effects of beam divergence and of changes in dosimeter-to-body distance were also studied. (author)

  20. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on, and a brief review of, techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  1. Non-Destructive Techniques Based on Eddy Current Testing

    Directory of Open Access Journals (Sweden)

    Ernesto Vázquez-Sánchez

    2011-02-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future.

  2. Non-destructive techniques based on eddy current testing.

    Science.gov (United States)

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future.

  3. Array-based techniques for fingerprinting medicinal herbs

    Directory of Open Access Journals (Sweden)

    Xue Charlie

    2011-05-01

    Poor quality control of medicinal herbs has led to instances of toxicity, poisoning and even deaths. The fundamental step in quality control of herbal medicine is accurate identification of herbs. Array-based techniques have recently been adapted to authenticate or identify herbal plants. This article reviews the current array-based techniques, e.g. oligonucleotide microarrays, gene-based probe microarrays, Suppression Subtractive Hybridization (SSH)-based arrays, Diversity Array Technology (DArT) and Subtracted Diversity Array (SDA). We further compare these techniques according to important parameters such as markers, polymorphism rates, restriction enzymes and sample type. The applicability of the array-based methods for fingerprinting depends on the availability of genomic and genetic information for the species to be fingerprinted. For species with little genome sequence information but high polymorphism rates, SDA techniques are particularly recommended because they require less labour and lower material cost.

  4. Fusion Based Neutron Sources for Security Applications: Neutron Techniques

    OpenAIRE

    Albright, S.; Seviour, Rebecca

    2014-01-01

    The current reliance on X-rays and intelligence for national security is insufficient to combat the current risks of smuggling and terrorism seen on an international level. There is a range of neutron-based security techniques which have the potential to dramatically improve national security. Neutron techniques can be broadly grouped into neutron in/neutron out and neutron in/photon out techniques. The use of accelerator-based fusion devices will potentially enable the widespread applic...

  5. The detection of bulk explosives using nuclear-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  6. Application of glyph-based techniques for multivariate engineering visualization

    Science.gov (United States)

    Glazar, Vladimir; Marunic, Gordana; Percic, Marko; Butkovic, Zlatko

    2016-01-01

    This article presents a review of glyph-based techniques for engineering visualization, as well as their practical application to the multivariate visualization process. Two glyph techniques, Chernoff faces and star glyphs, uncommonly used in engineering practice, are described, applied to the selected data set, run through the chosen optimization methods and evaluated by users. As an example of how these techniques function, a set of data for the optimization of a heat exchanger with a microchannel coil is adopted for visualization. The results acquired by the chosen visualization techniques are related to the results of optimization carried out by the response surface method and compared with the results of the user evaluation. Based on the data set from engineering research and practice, the advantages and disadvantages of these techniques for engineering visualization are identified and discussed.
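
The geometry of a star glyph is easy to sketch: each of the n variables of one record becomes a ray at angle 2πi/n whose length is the normalized value, and the ray endpoints are joined to form the glyph outline. Plotting is omitted below, and the variable interpretation in the example is invented for illustration:

```python
import math

def star_glyph(values):
    """Return (x, y) ray endpoints for one record; values normalized to [0, 1]."""
    n = len(values)
    points = []
    for i, v in enumerate(values):
        angle = 2 * math.pi * i / n          # evenly spaced rays around the center
        points.append((v * math.cos(angle), v * math.sin(angle)))
    return points

# One (invented) heat-exchanger design point: normalized pressure drop,
# heat flux, mass flow, cost.
record = [0.8, 0.5, 1.0, 0.3]
pts = star_glyph(record)
print(len(pts))          # → 4
print(pts[0])            # → (0.8, 0.0)
```

Several records drawn side by side as such glyphs let an engineer compare design points at a glance, which is the use case the article evaluates.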

  7. Classification of acute pancreatitis based on retroperitoneal extension: Application of the concept of interfascial planes

    Energy Technology Data Exchange (ETDEWEB)

    Ishikawa, Kazuo [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: ishikawa@sccmc.izumisano.osaka.jp; Idoguchi, Koji [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: idoguchi@sccmc.izumisano.osaka.jp; Tanaka, Hiroshi [Department of Traumatology and Acute Critical Care Medicine, Osaka University Hospital, 2-15 Yamada-Oka, Suita-shi, Osaka 565-0871 (Japan)]. E-mail: tanaka@hp-emerg.med.osaka-u.ac.jp; Tohma, Yoshiki [Osaka Prefectural Nakakawachi Medical Center of Acute Medicine, 3-4-13 Nishi-Iwata, Higashiosaka-shi, Osaka 578-0947 (Japan)]. E-mail: tohma@nmcam.jp; Ukai, Isao [Department of Traumatology and Acute Critical Care Medicine, Osaka University Hospital, 2-15 Yamada-Oka, Suita-shi, Osaka 565-0871 (Japan)]. E-mail: isaoukai@nifty.com; Watanabe, Hiroaki [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: hiwatana@sccmc.izumisano.osaka.jp; Matsuoka, Tetsuya [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: matsuoka@sccmc.izumisano.osaka.jp; Yokota, Jyunichiro [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: jyokota@sccmc.izumisano.osaka.jp; Sugimoto, Tsuyoshi [Ryokufukai Hospital, 1-16-13 Setoguchi, Hirano-ku, Osaka-shi, Osaka 547-0034 (Japan)]. E-mail: ts-sugi@ryokufukai.or.jp

    2006-12-15

    Objective: This study aimed to provide a classification system for acute pancreatitis by applying the principle that the disease spreads along the retroperitoneal interfascial planes. Materials and methods: Medical records and computed tomography (CT) images of 58 patients with acute pancreatitis treated between 2000 and 2005 were reviewed. The retroperitoneum was subdivided into 10 components according to the concept of interfascial planes. Severity of acute pancreatitis was graded according to retroperitoneal extension into these components. Clinical courses and outcomes were compared with the grades. The prognostic value of our classification system was compared with that of Balthazar's CT severity index (CTSI). Results: Retroperitoneal extension of acute fluid collection was classified into five grades: Grade I, fluid confined to the anterior pararenal space or retromesenteric plane (8 patients); Grade II, fluid spreading into the lateroconal or retrorenal plane (16 patients); Grade III, fluid spreading into the combined interfascial plane (8 patients); Grade IV, fluid spreading into the subfascial plane beyond the interfascial planes (15 patients); and Grade V, fluid intruding into the posterior pararenal space (11 patients). Morbidity and mortality were 92.3% and 38.5% in the 26 patients with Grade IV or V disease, and 21.9% and 0% in the 32 patients with Grade I, II, or III disease. Morbidity and mortality were 86.7% and 33.3% in patients with disease classified 'severe' according to the CTSI, and 37.5% and 9.4% in patients with disease classified 'mild' or 'moderate'. Conclusion: Classification of acute pancreatitis based on CT-determined retroperitoneal extension is a useful indicator of the disease severity and prognosis without the need for contrast-medium enhanced CT.
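
The five grades map the deepest retroperitoneal component reached by fluid to a severity level, which is naturally expressed as a lookup. This is a direct, illustrative transcription of the grading described above (component names abbreviated from the abstract):

```python
# Deepest component involved -> grade (per the classification above).
GRADE_OF = {
    "anterior pararenal space": 1, "retromesenteric plane": 1,
    "lateroconal plane": 2, "retrorenal plane": 2,
    "combined interfascial plane": 3,
    "subfascial plane": 4,
    "posterior pararenal space": 5,
}

def grade(components):
    """Severity grade = deepest (highest-graded) component with fluid."""
    return max(GRADE_OF[c] for c in components)

print(grade({"retromesenteric plane", "retrorenal plane"}))  # → 2
print(grade({"posterior pararenal space"}))                  # → 5
```

The study's outcome split (high morbidity/mortality for Grades IV-V versus I-III) then reduces to a simple threshold on this value.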

  8. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  9. A Review On Segmentation Based Image Compression Techniques

    Directory of Open Access Journals (Sweden)

    S.Thayammal

    2013-11-01

    The storage and transmission of imagery become more challenging tasks in the current scenario of multimedia applications. Hence, an efficient compression scheme is highly essential for imagery, as it reduces the requirements for storage media and transmission bandwidth. Besides improved performance, compression techniques must also converge quickly in order to be applied to real-time applications. Various algorithms have been developed for image compression, but each has its own pros and cons. Here, an extensive analysis of existing methods is performed, and their use in developing novel techniques that face the challenging tasks of image storage and transmission in multimedia applications is highlighted.

  10. Extensive aqueous deposits at the base of the dichotomy boundary in Nilosyrtis Mensae, Mars

    Science.gov (United States)

    Bandfield, Joshua L.; Amador, Elena S.

    2016-09-01

    Thermal emission imaging system (THEMIS) and Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) spectral datasets were used to identify high bulk SiO2 and hydrated compositions throughout the Nilosyrtis Mensae region. Four isolated locations were identified across the region showing short wavelength silicate absorptions within the 8-12 μm spectral region, indicating surfaces dominated by high Si phases. Much more extensive exposures of hydrated compositions are present throughout the region, indicated by a spectral absorption near 1.9 μm in CRISM data. Although limited in spatial coverage, detailed spectral observations indicate that the hydrated materials contain Fe/Mg-smectites and hydrated silica along with minor exposures of Mg-carbonates and an unidentified hydrated phase. The high SiO2 and hydrated materials are present in layered sediments near the base of topographic scarps at the hemispheric dichotomy boundary, typically near or within low albedo sand deposits. The source of the high SiO2 and hydrated materials appears to be from groundwater discharge from Nili Fossae and Syrtis Major to the south, where there is evidence for extensive aqueous alteration of the subsurface. Although discontinuous, the exposures of high SiO2 and hydrated materials span a wide area and are present in a similar geomorphological context to previously identified deposits in western Hellas Basin. These regional deposits may reflect aqueous conditions and alteration within the adjacent crust of the martian highlands.

  11. Extension of car-to-X-communication by radiolocation techniques; Erweiterung der Fahrzeug-zu-Fahrzeug-Kommunikation mit Funkortungstechniken

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, Daniel [BMW Group, Muenchen (Germany). Bereich Konzepte ' ' Aktive und Integrale Sicherheit' '

    2012-10-15

    A transponder, carried by vulnerable road users, e.g. pedestrians or bicyclists, and integrated in a smartphone, enables localisation from the vehicle side. A connected driver assistant system can detect movements, predict a possible collision and take preventive actions like informing the driver, braking or steering. In the research project Ko-TAG (Cooperative Transponder) within the research initiative Ko-FAS (Cooperative Vehicle Safety) new sensor technologies are developed. BMW, as Ko-TAG project manager, explains the cooperative sensor technology and the automatic control technique of the preventive safety system. (orig.)

  12. Local Community Detection in Complex Networks Based on Maximum Cliques Extension

    Directory of Open Access Journals (Sweden)

    Meng Fanrong

    2014-01-01

    Detecting local community structure in complex networks is an appealing problem that has attracted increasing attention in various domains. However, most current local community detection algorithms are, on the one hand, influenced by the state of the source node and, on the other hand, unable to effectively identify the multiple communities linked by overlapping nodes. We propose a novel local community detection algorithm based on maximum clique extension, called LCD-MC. The proposed method first finds the set of all maximum cliques containing the source node and initializes them as the starting local communities; then, it extends each unclassified local community by greedy optimization until a certain objective is satisfied; finally, the expected local communities are obtained once all maximum cliques have been assigned to a community. An empirical evaluation using both synthetic and real datasets demonstrates that our algorithm outperforms some of the state-of-the-art approaches.
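
A simplified sketch of the clique-extension flow described above: enumerate the maximal cliques containing the source node (brute force, adequate for a toy graph), then greedily absorb neighbors. The greedy criterion used here (attached to at least half of the community) is an illustrative stand-in for the paper's objective function:

```python
from itertools import combinations

def maximal_cliques_with(adj, source):
    """Brute-force maximal cliques containing `source` (fine for toy graphs)."""
    cand = sorted(adj[source] | {source})
    cliques = []
    for r in range(len(cand), 0, -1):
        for sub in combinations(cand, r):
            if source not in sub:
                continue
            if all(b in adj[a] for a, b in combinations(sub, 2)):
                s = set(sub)
                if not any(s < c for c in cliques):   # keep only maximal ones
                    cliques.append(s)
    return cliques

def extend(adj, community):
    """Greedy extension: absorb neighbors linked to >= half of the community."""
    community, grown = set(community), True
    while grown:
        grown = False
        frontier = set().union(*(adj[v] for v in community)) - community
        for v in sorted(frontier):
            if len(adj[v] & community) * 2 >= len(community):
                community.add(v)
                grown = True
    return community

# Toy graph: triangle {0,1,2}, node 3 attached to 1 and 2, node 4 only to 3.
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {1, 2, 4}, 4: {3}}
start = maximal_cliques_with(adj, 0)[0]       # the triangle {0, 1, 2}
print(sorted(extend(adj, start)))             # → [0, 1, 2, 3]
```

Starting from a clique rather than the bare source node is what makes the result less sensitive to the source node's own state, per the motivation above.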

  13. Structural Fatigue Reliability Based on Extension of Random Loads into Interval Variables

    Directory of Open Access Journals (Sweden)

    Qiangfeng Wang

    2013-01-01

    For a structure under random loads, the structural fatigue life cannot be directly calculated from S-N curves and the linear Miner cumulative damage rule, owing to the uncertainty of the loads; moreover, the deviation between data measured in projects and the real data makes the calculated structural reliability index inaccurate. A research method for structural fatigue reliability based on the extension of random loads into interval variables is therefore proposed. Its innovation is that the interval of the structural fatigue life and the reliability index of a structure can be accurately calculated from the probability density function of the stress level of the random loads and the coefficient of variation of the measured loads. A practical calculation example shows that this method suits practical engineering better than traditional methods. It provides a sound research approach for reliability analysis of structures under random loads.
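
The core computation behind the interval idea can be sketched with the linear Miner rule and an S-N curve, treating each measured stress level as an interval and propagating it to a damage interval. The S-N constants and load spectrum below are invented illustrative values, not from the paper:

```python
C, m = 1.0e12, 3.0                 # assumed S-N curve constants: N = C / S**m

def cycles_to_failure(S):
    return C / S**m

def miner_damage(loads):
    """loads: list of (n_i, S_lo, S_hi); returns the damage interval (D_lo, D_hi)."""
    # Damage grows monotonically with stress, so the interval endpoints of D
    # come from evaluating Miner's sum at the stress interval endpoints.
    d_lo = sum(n / cycles_to_failure(s_lo) for n, s_lo, _ in loads)
    d_hi = sum(n / cycles_to_failure(s_hi) for n, _, s_hi in loads)
    return d_lo, d_hi

# Two measured load blocks, each stress amplitude known only as an interval (MPa).
loads = [(2.0e5, 80.0, 90.0), (5.0e4, 120.0, 140.0)]
d_lo, d_hi = miner_damage(loads)
print(d_lo < d_hi)                  # → True
print(d_hi < 1.0)                   # → True (failure criterion D = 1 not reached)
```

The fatigue-life interval then follows by scaling the load spectrum until the damage interval brackets D = 1, and a reliability index can be assessed against that interval.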

  14. Typing of 49 autosomal SNPs by single base extension and capillary electrophoresis for forensic genetic testing.

    Science.gov (United States)

    Børsting, Claus; Tomas, Carmen; Morling, Niels

    2012-01-01

    We describe a method for simultaneous amplification of 49 autosomal single nucleotide polymorphisms (SNPs) by multiplex PCR and detection of the SNP alleles by single base extension (SBE) and capillary electrophoresis. All the SNPs may be amplified from only 100 pg of genomic DNA and the length of the amplicons range from 65 to 115 bp. The high sensitivity and the short amplicon sizes make the assay very suitable for typing of degraded DNA samples, and the low mutation rate of SNPs makes the assay very useful for relationship testing. Combined, these advantages make the assay well suited for disaster victim identifications, where the DNA from the victims may be highly degraded and the victims are identified via investigation of their relatives. The assay was validated according to the ISO 17025 standard and used for routine case work in our laboratory. PMID:22139655

  15. An Extension on Logic of Semantic Web Based on OIL and RDF%基于OIL和RDFS的语义化Web逻辑扩展

    Institute of Scientific and Technical Information of China (English)

    姚绍文; 宗勇; 刘爱莲; 周明天

    2002-01-01

    RDF(S) is the metadata standard for the Web that aims to turn the current Web into the foundation of a machine-understandable knowledge system. By employing the full-blown techniques of knowledge engineering (KE), the Web-NG is targeted to provide semantic interoperability for data and knowledge exchange. As a hot spot in KE, ontology can be well integrated with the Web, and such integration supports knowledge modeling and description in Web-based applications. This paper introduces RDF(S), Web-NG and a Web-based ontology modeling language, OIL. On this basis, the paper defines an extension of OIL/RDFS for propositional logic and presents an approach for modeling propositional formulas and inference rules.

  16. An Improved Particle Swarm Optimization Algorithm Based on Ensemble Technique

    Institute of Scientific and Technical Information of China (English)

    SHI Yan; HUANG Cong-ming

    2006-01-01

    An improved particle swarm optimization (PSO) algorithm based on ensemble technique is presented. The algorithm combines some previous best positions (pbest) of the particles to get an ensemble position (Epbest), which is used to replace the global best position (gbest). It is compared with the standard PSO algorithm invented by Kennedy and Eberhart and some improved PSO algorithms based on three different benchmark functions. The simulation results show that the improved PSO based on ensemble technique can get better solutions than the standard PSO and some other improved algorithms under all test cases.
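    The ensemble idea above can be sketched in a few lines: the gbest term of the velocity update is replaced by an ensemble position (Epbest) built from several personal bests. The combination rule (averaging the k fittest pbests) and all constants below are illustrative assumptions, since the abstract does not specify them.

    ```python
    # Hedged sketch of ensemble PSO: Epbest, the average of the k best
    # personal-best positions, stands in for the usual gbest attractor.
    import random

    def sphere(x):
        return sum(v * v for v in x)

    def ensemble_pso(f, dim=5, n_particles=20, iters=200, k=5, seed=1):
        rng = random.Random(seed)
        pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [f(p) for p in pos]
        w, c1, c2 = 0.7, 1.5, 1.5          # assumed inertia / acceleration
        for _ in range(iters):
            # Epbest: average of the k best personal-best positions.
            ranked = sorted(range(n_particles), key=lambda i: pbest_val[i])[:k]
            epbest = [sum(pbest[i][d] for i in ranked) / k for d in range(dim)]
            for i in range(n_particles):
                for d in range(dim):
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * rng.random() * (epbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                val = f(pos[i])
                if val < pbest_val[i]:
                    pbest_val[i], pbest[i] = val, pos[i][:]
        return min(pbest_val)

    best = ensemble_pso(sphere)   # best objective value found on the sphere
    ```

    On the sphere benchmark the swarm should drive the objective close to zero; the particular benchmark functions used in the paper are not named in the abstract.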

  17. Power system stabilizers based on modern control techniques

    Energy Technology Data Exchange (ETDEWEB)

    Malik, O.P.; Chen, G.P.; Zhang, Y.; El-Metwally, K. [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering

    1994-12-31

    Developments in digital technology have made it feasible to develop and implement improved controllers based on sophisticated control techniques. Power system stabilizers based on adaptive control, fuzzy logic and artificial neural networks are being developed. Each of these control techniques possesses unique features and strengths. In this paper, the relative performance of power system stabilizers based on adaptive control, fuzzy logic and neural networks, both in simulation studies and in real-time tests on a physical model of a power system, is presented and compared to that of a fixed-parameter conventional power system stabilizer. (author) 16 refs., 45 figs., 3 tabs.

  18. An Agent Communication Framework Based on XML and SOAP Technique

    Institute of Scientific and Technical Information of China (English)

    李晓瑜

    2009-01-01

    This thesis introduces XML technology and SOAP technology, presents an agent communication framework based on XML and SOAP techniques, and analyzes its principle, architecture, function and benefits. It concludes with a discussion based on KQML communication primitive languages.

  19. Data Mining and Neural Network Techniques in Case Based System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper first puts forward a case-based system framework based on data mining techniques. It then examines the possibility of using neural networks as a method of retrieval in such a case-based system. In this system we propose data mining algorithms to discover case knowledge, along with other supporting algorithms.

  20. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  1. Activities of colistin- and minocycline-based combinations against extensive drug resistant Acinetobacter baumannii isolates from intensive care unit patients

    OpenAIRE

    Li Jian; Zhu De-mei; Huang Jun; Liu Xiao-fang; Liang Wang; Zhang Jing

    2011-01-01

    Abstract Background Extensive drug resistance of Acinetobacter baumannii is a serious problem in the clinical setting. It is therefore important to find active antibiotic combinations that could be effective in the treatment of infections caused by this problematic 'superbug'. In this study, we analyzed the in vitro activities of three colistin-based combinations and a minocycline-based combination against clinically isolated extensive drug resistant Acinetobacter baumannii (XDR-AB) strains. ...

  2. Efficiency of Integrated Geophysical techniques in delineating the extension of Bauxites ore in north Riyadh, Saudi Arabia

    Science.gov (United States)

    Almutairi, Yasir; Alanazi, Abdulrahman; Almutairi, Muteb; Alsama, Ali; Alhenaki, Bander; Almalki, Awadh

    2014-05-01

    We exploit the integration of Ground Penetrating Radar (GPR) techniques, magnetic gradiometry, resistivity measurements and seismic tomography for a high-resolution, non-invasive study delineating the subsurface bauxite layer in the Zabira locality, north of Riyadh. Integrated GPR, magnetic gradiometry, resistivity and seismic refraction are used in the case of high-contrast targets and provide an accurate subsurface reconstruction of foundations in sediments. Resistivity pseudo-sections are particularly useful for the areal identification of contacts between soils and foundations, while GPR and magnetic gradiometry provide detailed information about the location and depth of the structures. Results obtained by GPR, magnetics and resistivity show very good agreement in mapping the bauxite layer at depths of 5 m to 10 m, while the depth obtained by seismic refraction was 10 m to 15 m due to the lack of velocity information.

  3. A Hough Transform based Technique for Text Segmentation

    CERN Document Server

    Saha, Satadal; Nasipuri, Mita; Basu, Dipak Kr

    2010-01-01

    Text segmentation is an inherent part of an OCR system, irrespective of its application domain. The OCR system contains a segmentation module in which the text lines, words and ultimately the characters must be segmented properly for successful recognition. The present work implements a Hough transform based technique for line and word segmentation from digitized images. The proposed technique is applied not only on the document image dataset but also on datasets for a business card reader system and a license plate recognition system. For standardization of the performance of the system, the technique is also applied on the public domain dataset published on the website of CMATER, Jadavpur University. The document images consist of multi-script printed and handwritten text lines, with variety in script and line spacing within a single document image. The technique performs quite satisfactorily when applied on mobile-camera-captured business card images with low resolution. The usefulness of the technique is verifie...
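    The core voting step such a technique relies on can be sketched as a generic straight-line Hough transform; the authors' full segmentation pipeline (word splitting, multi-script handling) is not reproduced here.

    ```python
    # Minimal Hough transform sketch: foreground pixels vote in (rho, theta)
    # space, so a text line shows up as a peak in the accumulator.
    import numpy as np

    def hough_lines(binary, n_theta=180):
        h, w = binary.shape
        diag = int(np.ceil(np.hypot(h, w)))
        thetas = np.deg2rad(np.arange(n_theta))         # 0..179 degrees
        acc = np.zeros((2 * diag, n_theta), dtype=int)  # rho offset by +diag
        ys, xs = np.nonzero(binary)
        for x, y in zip(xs, ys):
            rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
            acc[rhos + diag, np.arange(n_theta)] += 1   # one vote per theta
        return acc, diag

    # Synthetic "text line": a horizontal row of foreground pixels at y = 10.
    img = np.zeros((30, 40), dtype=bool)
    img[10, 5:35] = True
    acc, diag = hough_lines(img)
    rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
    rho, theta_deg = rho_idx - diag, theta_idx
    ```

    For the horizontal row the accumulator peaks at theta = 90 degrees and rho = 10, i.e. the line y = 10; in a segmentation system each such peak would seed one text line.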

  4. Optical fiber hydrogen sensor based on photothermal reflectance detection technique

    Energy Technology Data Exchange (ETDEWEB)

    Yarai, A; Nakanishi, T, E-mail: yarai@osaka-sandai.ac.j [Department of Electronics, Information and Communication Engineering Osaka Sangyo University, 3-1-1 Nakagaito, Daito, Osaka 574-8530 (Japan)

    2010-03-01

    This article proposes an optical fiber hydrogen (H{sub 2}) sensor based on the photothermal reflectance [hereinafter modulated optical reflectance (MOR)] technique. Our H{sub 2} sensor is based on a technique that detects the changes of MOR signals in palladium film, which is widely known to absorb H{sub 2} gas. The sensor element is a palladium film deposited on a 2.5-mm-diameter FC-ferrule made from zirconium to realize the optical fiber sensor. Our recently developed 'laptop' MOR instrument, assembled with optical fiber components, is applied to this technique. Thus, an extremely compact photothermal H{sub 2} gas sensor system can be constructed. We verified that our technique can sense H{sub 2} gas at concentrations below 1%, and also demonstrated that the response time is approximately 5 seconds when the sensor head is filled with H{sub 2} gas.

  5. Face Recognition Approach Based on Wavelet - Curvelet Technique

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2012-04-01

    Full Text Available In this paper, a novel face recognition approach based on the wavelet-curvelet technique is proposed. The algorithm exploits the similarities embedded in the images, utilizing the wavelet-curvelet technique to extract facial features. The implemented technique can overcome a drawback of other mathematical image analysis approaches, which may suffer from a high-dimensional feature space; it therefore aims to reduce the dimensionality, which lowers the required computational power and memory size. The Nearest Mean Classifier (NMC) is then adopted to recognize different faces. In this work, three major experiments were done on two face databases (MAFD & ORL), and a higher recognition rate is obtained by the implementation of this technique.

  6. MPPT Technique Based on Current and Temperature Measurements

    Directory of Open Access Journals (Sweden)

    Eduardo Moreira Vicente

    2015-01-01

    Full Text Available This paper presents a new maximum power point tracking (MPPT) method based on the measurement of temperature and short-circuit current, in a simple and efficient approach. These measurements, which can precisely define the maximum power point (MPP), have not been used together in other existing techniques. The temperature is measured with a low cost sensor, and the solar irradiance is estimated through the relationship of the measured short-circuit current and its reference. Fast tracking speed and stable steady-state operation are advantages of this technique, which presents higher performance when compared to other well-known techniques.
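    The measurement-based estimation step can be sketched as follows: irradiance is inferred from the ratio of the measured short-circuit current to its reference value, and the MPP voltage is corrected linearly with temperature. All panel coefficients below are typical-datasheet assumptions, not values from the paper.

    ```python
    # Hedged sketch of temperature/short-circuit-current MPP estimation.

    G_REF = 1000.0    # W/m^2, STC irradiance (assumed)
    I_SC_REF = 8.0    # A, short-circuit current at STC (assumed)
    V_MP_REF = 30.0   # V, MPP voltage at STC (assumed)
    T_REF = 25.0      # deg C, STC temperature
    K_V = -0.11       # V/K, assumed MPP-voltage temperature coefficient

    def estimate_irradiance(i_sc):
        """Isc scales almost linearly with irradiance."""
        return G_REF * i_sc / I_SC_REF

    def estimate_vmp(temp_c):
        """Linear temperature correction of the MPP voltage."""
        return V_MP_REF + K_V * (temp_c - T_REF)

    g = estimate_irradiance(6.0)   # measured short-circuit current: 6 A
    v_ref = estimate_vmp(45.0)     # measured cell temperature: 45 deg C
    ```

    With these assumed coefficients, a 6 A short-circuit reading maps to 750 W/m^2 and a 45 degC cell to a 27.8 V operating reference, which a converter controller would then track.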

  7. Protein-Protein Interactions Prediction Based on Iterative Clique Extension with Gene Ontology Filtering

    Directory of Open Access Journals (Sweden)

    Lei Yang

    2014-01-01

    Full Text Available Cliques (maximal complete subnets) in a protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPI compensate for data deficiencies in biological experiments. However, clique-based prediction methods depend only on the topology of the network, and the false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based prediction and gene ontology (GO) annotations to overcome this shortcoming and improve the accuracy of predictions. According to different GO correcting rules, we generate two predicted interaction sets which guarantee the quality and quantity of predicted protein interactions. The proposed method is applied to the PPI network from the Database of Interacting Proteins (DIP), and most of the predicted interactions are verified by another biological database, BioGRID. The predicted protein interactions are appended to the original protein network, which leads to clique extension and shows the significance of biological meaning.
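    The purely topological part of such a clique-based prediction can be sketched as below: enumerate maximal cliques (Bron-Kerbosch) and predict an interaction wherever a protein is linked to every member of a clique except one. The GO filtering step described in the abstract is omitted, and the completion rule is an illustrative simplification.

    ```python
    # Hedged sketch: maximal-clique enumeration plus near-clique completion
    # on a toy PPI adjacency map.

    def bron_kerbosch(r, p, x, adj, out):
        """Classic Bron-Kerbosch recursion collecting maximal cliques."""
        if not p and not x:
            out.append(frozenset(r))
            return
        for v in list(p):
            bron_kerbosch(r | {v}, p & adj[v], x & adj[v], adj, out)
            p.remove(v)
            x.add(v)

    def predict_edges(adj):
        cliques = []
        bron_kerbosch(set(), set(adj), set(), adj, cliques)
        predicted = set()
        for clique in (c for c in cliques if len(c) >= 3):
            for u in clique:
                for v in adj:
                    # v sees every clique member except u: predict u-v.
                    if v not in clique and (clique - {u}) <= adj[v]:
                        predicted.add(frozenset({u, v}))
        # Keep only pairs not already present in the network.
        result = set()
        for e in predicted:
            a, b = tuple(e)
            if b not in adj[a]:
                result.add(e)
        return result

    # Toy network: {A,B,C,D} would be a clique but the A-D edge is missing.
    adj = {"A": {"B", "C"}, "B": {"A", "C", "D"},
           "C": {"A", "B", "D"}, "D": {"B", "C"}}
    new_edges = predict_edges(adj)
    ```

    On the toy network the only prediction is the missing A-D edge, which is exactly the kind of candidate the paper's GO rules would then accept or reject.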

  8. Protein-protein interactions prediction based on iterative clique extension with gene ontology filtering.

    Science.gov (United States)

    Yang, Lei; Tang, Xianglong

    2014-01-01

    Cliques (maximal complete subnets) in a protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPI compensate for data deficiencies in biological experiments. However, clique-based prediction methods depend only on the topology of the network, and the false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based prediction and gene ontology (GO) annotations to overcome this shortcoming and improve the accuracy of predictions. According to different GO correcting rules, we generate two predicted interaction sets which guarantee the quality and quantity of predicted protein interactions. The proposed method is applied to the PPI network from the Database of Interacting Proteins (DIP), and most of the predicted interactions are verified by another biological database, BioGRID. The predicted protein interactions are appended to the original protein network, which leads to clique extension and shows the significance of biological meaning. PMID:24578640

  9. Community-based Ontology Development, Annotation and Discussion with MediaWiki extension Ontokiwi and Ontokiwi-based Ontobedia

    Science.gov (United States)

    Ong, Edison; He, Yongqun

    2016-01-01

    Hundreds of biological and biomedical ontologies have been developed to support data standardization, integration and analysis. Although ontologies are typically developed for community usage, community efforts in ontology development are limited. To support ontology visualization, distribution, and community-based annotation and development, we have developed Ontokiwi, an ontology extension to the MediaWiki software. Ontokiwi displays hierarchical classes and ontological axioms. Ontology classes and axioms can be edited and added using the Ontokiwi form or the MediaWiki source editor. Ontokiwi also inherits MediaWiki features such as Wikitext editing and version control. Based on the Ontokiwi/MediaWiki software package, we have developed Ontobedia, which aims to support community-based development and annotation of biological and biomedical ontologies. As demonstrations, we have loaded the Ontology of Adverse Events (OAE) and the Cell Line Ontology (CLO) into Ontobedia. Our studies showed that Ontobedia was able to achieve the expected Ontokiwi features.

  10. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    Directory of Open Access Journals (Sweden)

    Aitman T

    2008-11-01

    Full Text Available Abstract Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System is a multi-user rich internet application (RIA providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data

  11. Runtime Monitoring Technique to handle Tautology based SQL Injection Attacks

    Directory of Open Access Journals (Sweden)

    Ramya Dharam

    2015-05-01

    Full Text Available Software systems, like web applications, are often used to provide reliable online services such as banking, shopping, and social networking to users. The increasing use of such systems has led to a high need for assuring confidentiality, integrity, and availability of user data. SQL Injection Attacks (SQLIAs) are one of the major security threats to web applications: they allow attackers to gain unauthorized access to the back-end database holding confidential user information. In this paper we present and evaluate a runtime monitoring technique to detect and prevent tautology-based SQLIAs in web applications. Our technique monitors the behavior of the application post-deployment to identify all tautology-based SQLIAs. A framework called the Runtime Monitoring Framework, which implements our technique, is used in the development of runtime monitors. The framework uses two pre-deployment testing techniques, basis-path and data-flow testing, to identify a minimal set of all legal/valid execution paths of the application. Runtime monitors are then developed and integrated to monitor the application post-deployment along the identified valid/legal execution paths. For evaluation, we targeted a subject application with a large number of both legitimate inputs and illegitimate tautology-based inputs, and measured the performance of the proposed technique. The results of our study show that the runtime monitor developed for the application was able to detect all the tautology-based attacks without generating any false positives.
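    For illustration of what the monitors are looking for (and explicitly not the paper's execution-path monitoring approach), the sketch below shows a classic tautology-based SQLIA and a naive pattern check that flags always-true predicates injected into a query string.

    ```python
    # Illustrative stand-in: flag the classic always-true patterns of a
    # tautology-based SQL injection in an assembled query string.
    import re

    TAUTOLOGY_PATTERNS = [
        r"'[^']*'\s*=\s*'[^']*'",      # 'a'='a' style string tautology
        r"\b(\d+)\s*=\s*\1\b",         # 1=1 style numeric tautology
        r"\bor\s+true\b",
    ]

    def looks_like_tautology_attack(query):
        q = query.lower()
        return any(re.search(p, q) for p in TAUTOLOGY_PATTERNS)

    benign = "SELECT * FROM users WHERE name = 'alice' AND pw = 'x'"
    attack = "SELECT * FROM users WHERE name = '' OR '1'='1' --' AND pw = ''"
    ```

    Pattern matching like this is easily evaded, which is precisely why the paper instead monitors deviations from the legal execution paths identified before deployment.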

  12. Extensions of the lost letter technique to divisive issues of creationism, darwinism, sex education, and gay and lesbian affiliations.

    Science.gov (United States)

    Bridges, F Stephen; Anzalone, Debra A; Ryan, Stuart W; Anzalone, Fanancy L

    2002-04-01

    Two field studies using 1,004 "lost letters" were designed to test the hypotheses that returned responses would be greater in small towns than from a city, and that addressees' affiliation with a group either (1) opposed to physical education in schools, (2) supporting gay and lesbian teachers, or (3) advocating Creationism or Darwinism would reduce the return rate. Of 504 letters "lost" in Study A, 163 (32.3%) were returned in the mail from residents of southeast Louisiana and indicated that, across 3 addressees and 2 sizes of community, addressees' affiliations were not associated with returned responses. Community size and addressees' affiliations were associated with significantly different rates of return in the city. Return rates from sites within a city were lower when letters were addressed to an organization which opposed (teaching) health education in the schools than to one supporting daily health education. Of 500 letters "lost" in Study B, 95 (19.0%) were returned from residents of northwest Florida and indicated that, across 5 addressees and 2 sizes of community, addressees' affiliations were significantly associated with returned responses overall (5 addressees) and in small towns (control, Creationism, Darwinism addressees), but not with community size. Community size and addressees' affiliations were associated with significantly different rates of return in small towns, with returns greater than or equal to those in the city (except for the addressee advocating teaching Darwinism in public schools). The present findings appear to show that applications of the lost letter technique to other divisive social issues are useful in assessing public opinion.

  13. A Lyapunov-Based Extension to Particle Swarm Dynamics for Continuous Function Optimization

    Science.gov (United States)

    Bhattacharya, Sayantani; Konar, Amit; Das, Swagatam; Han, Sang Yong

    2009-01-01

    The paper proposes three alternative extensions to the classical global-best particle swarm optimization dynamics, and compares their relative performance with the standard particle swarm algorithm. The first extension, which readily follows from the well-known Lyapunov's stability theorem, provides a mathematical basis of the particle dynamics with a guaranteed convergence at an optimum. The inclusion of local and global attractors to this dynamics leads to faster convergence speed and better accuracy than the classical one. The second extension augments the velocity adaptation equation by a negative randomly weighted positional term of individual particle, while the third extension considers the negative positional term in place of the inertial term. Computer simulations further reveal that the last two extensions outperform both the classical and the first extension in terms of convergence speed and accuracy. PMID:22303158

  14. PCA Based Rapid and Real Time Face Recognition Technique

    Directory of Open Access Journals (Sweden)

    T R Chandrashekar

    2013-12-01

    Full Text Available Face biometrics, being economical and efficient, has been a popular form of biometric system used in various applications. Face recognition has been a topic of research for the last few decades, and several techniques have been proposed to improve the performance of face recognition systems. Accuracy is tested against intensity, distance from camera, and pose variance. Multiple-face recognition is another subtopic currently under research. The speed at which a technique works is also a parameter considered when evaluating it. As an example, a support vector machine performs really well for face recognition, but its computational efficiency degrades significantly as the number of classes increases; the eigenface technique produces quality features for face recognition, but its accuracy has proved comparatively lower than that of many other techniques. With the increasing use of multi-core processors in personal computers and of applications demanding fast processing and multiple face detection and recognition (for example, an entry detection system in a shopping mall or an industry), demand for such automated systems is growing worldwide. In this paper we propose a novel face recognition system, developed with C#.Net, that can detect multiple faces and recognize them in parallel by utilizing the system resources and the processor cores. The system is built around Haar-cascade-based face detection and PCA-based face recognition with C#.Net. A parallel library designed for .Net is used to aid high-speed detection and recognition of faces in real time. Analysis of the performance of the proposed technique against some conventional techniques reveals that the proposed technique is not only accurate, but also fast in comparison to other techniques.
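    The PCA-plus-nearest-mean pipeline the abstract describes can be sketched in a few lines. This is a generic eigenface-style sketch on synthetic vectors, not the authors' C#/.Net implementation, and the data and dimensions are invented for illustration.

    ```python
    # Minimal PCA (eigenface-style) sketch with nearest-mean classification
    # on synthetic "face" vectors.
    import numpy as np

    def fit_pca(X, n_components):
        """Rows of X are flattened images; returns (mean, principal axes)."""
        mean = X.mean(axis=0)
        Xc = X - mean
        # SVD of the centered data gives the principal directions in Vt.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return mean, Vt[:n_components]

    def project(X, mean, axes):
        return (X - mean) @ axes.T

    rng = np.random.default_rng(0)
    # Two synthetic classes of 64-dimensional "images" around distinct templates.
    t1, t2 = rng.normal(size=64), rng.normal(size=64)
    X = np.vstack([t1 + 0.1 * rng.normal(size=(10, 64)),
                   t2 + 0.1 * rng.normal(size=(10, 64))])
    y = np.array([0] * 10 + [1] * 10)

    mean, axes = fit_pca(X, n_components=5)
    Z = project(X, mean, axes)
    # Nearest Mean Classifier: assign each sample to the closer class centroid.
    c0, c1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
    pred = np.where(np.linalg.norm(Z - c0, axis=1)
                    < np.linalg.norm(Z - c1, axis=1), 0, 1)
    accuracy = (pred == y).mean()
    ```

    Because classification of each probe against the stored subspace is independent, this is the stage the paper parallelizes across processor cores.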

  15. Extensive Evaluation of a Diffusion Denuder Technique for the Quantification of Atmospheric Stable and Radioactive Molecular Iodine

    DEFF Research Database (Denmark)

    Huang, Ru-Jin; Hou, Xiaolin; Hoffmann, Thorsten

    2010-01-01

    In this paper we present the evaluation and optimization of a new approach for the quantification of gaseous molecular iodine (I2) for laboratory- and field-based studies and its novel application for the measurement of radioactive molecular iodine. α-Cyclodextrin (α-CD) in combination with 129I− is shown to be an effective denuder coating for the sampling of gaseous I2 by the formation of an inclusion complex. The entrapped 127I2 together with the 129I− spike in the coating is then released and derivatized to 4-iodo-N,N-dimethylaniline (4-I-DMA) for gas chromatography−mass spectrometry (GC...

  16. A Knowledge—Based Specification Technique for Protocol Development

    Institute of Scientific and Technical Information of China (English)

    张尧学; 史美林; 等

    1993-01-01

    This paper proposes a knowledge-based specification technique (KST) for protocol development. This technique semi-automatically translates a protocol described in an informal description (natural languages or graphs) into one described in formal specifications (Estelle and SDL). The translation processes are supported by knowledge stored in the knowledge base. This paper discusses the concept and the specification control mechanism of KST, and the rules and algorithms for the production of FSMs, which are the basis of Estelle and SDL.

  17. Least-squares based iterative multipath super-resolution technique

    CERN Document Server

    Nam, Wooseok

    2011-01-01

    In this paper, we study the problem of multipath channel estimation for direct sequence spread spectrum signals. To resolve multipath components arriving within a short interval, we propose a new algorithm called the least-squares based iterative multipath super-resolution (LIMS). Compared to conventional super-resolution techniques, such as the multiple signal classification (MUSIC) and the estimation of signal parameters via rotation invariance techniques (ESPRIT), our algorithm has several appealing features. In particular, even in critical situations where the conventional super-resolution techniques are not very powerful due to limited data or the correlation between path coefficients, the LIMS algorithm can produce successful results. In addition, due to its iterative nature, the LIMS algorithm is suitable for recursive multipath tracking, whereas the conventional super-resolution techniques may not be. Through numerical simulations, we show that the LIMS algorithm can resolve the first arrival path amo...
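    The abstract does not spell out the LIMS recursion, so the sketch below shows only a generic greedy iterative least-squares treatment of the task it addresses: estimating the delays and amplitudes of closely spaced echoes of a known spreading code. The code, delays, and iteration count are invented for illustration.

    ```python
    # Hedged sketch: greedy iterative least-squares multipath estimation for
    # a known direct-sequence spreading code (matching-pursuit style).
    import numpy as np

    rng = np.random.default_rng(3)
    code = rng.choice([-1.0, 1.0], size=64)       # known DS-SS spreading code

    def delayed(code, delay, n):
        out = np.zeros(n)
        out[delay:delay + len(code)] = code
        return out

    n = 128
    true_paths = [(5, 1.0), (9, 0.6)]             # (delay, amplitude), 4 samples apart
    rx = sum(a * delayed(code, d, n) for d, a in true_paths)
    rx += 0.01 * rng.normal(size=n)               # light receiver noise

    # Iterate: find the best-correlating delay, least-squares fit the
    # amplitudes of all delays found so far, subtract, repeat.
    found = []
    residual = rx.copy()
    for _ in range(2):
        corr = [abs(np.dot(residual, delayed(code, d, n))) for d in range(n - 64)]
        found.append(int(np.argmax(corr)))
        A = np.stack([delayed(code, d, n) for d in found], axis=1)
        amps, *_ = np.linalg.lstsq(A, rx, rcond=None)
        residual = rx - A @ amps

    delays = sorted(found)
    ```

    The joint least-squares refit at each iteration is what lets the second, weaker path be recovered even though its correlation peak is partly masked by the first.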

  18. Dimensionality Reduction using SOM based Technique for Face Recognition

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar

    2008-05-01

    Full Text Available Unsupervised or self-organized learning algorithms have become very popular for the discovery of significant patterns or features in input data. Three prominent algorithms, namely Principal Component Analysis (PCA), Self Organizing Maps (SOM), and Independent Component Analysis (ICA), have widely and successfully been used for face recognition. In this paper a SOM-based technique for dimensionality reduction is proposed, and it has also been successfully used for face recognition. A comparative study of PCA, SOM and ICA along with the proposed technique for face recognition is also given. Simulation results indicate that SOM is better than the other techniques for the given face database and the classifier used. The results also show that the performance of the system decreases as the number of classes increases.
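    A minimal SOM can be sketched as below: inputs are mapped to a small 1-D grid of units, and each sample is reduced to the index of its best matching unit, which is the kind of dimensionality reduction a face classifier could then consume. Grid size, learning rates, and data are illustrative assumptions, not the paper's settings.

    ```python
    # Hedged sketch of a 1-D self-organizing map used for dimensionality
    # reduction on synthetic clustered data.
    import numpy as np

    def train_som(X, n_units=10, iters=500, lr0=0.5, sigma0=2.0, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.uniform(X.min(), X.max(), size=(n_units, X.shape[1]))
        grid = np.arange(n_units)
        for t in range(iters):
            frac = t / iters
            lr = lr0 * (1 - frac)                 # decaying learning rate
            sigma = sigma0 * (1 - frac) + 0.5     # shrinking neighborhood
            x = X[rng.integers(len(X))]
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best matching unit
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)        # pull BMU neighborhood toward x
        return W

    def encode(X, W):
        """Reduce each sample to the index of its best matching unit."""
        return np.array([np.argmin(((W - x) ** 2).sum(axis=1)) for x in X])

    rng = np.random.default_rng(1)
    # Two clusters in 5-D; after training they should map to different units.
    X = np.vstack([rng.normal(0, 0.1, size=(20, 5)),
                   rng.normal(3, 0.1, size=(20, 5))])
    W = train_som(X)
    codes = encode(X, W)
    ```

    Here each 5-D sample collapses to a single unit index, and the two clusters land on disjoint sets of units, preserving class structure in a drastically smaller representation.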

  19. 75 FR 15693 - Extension of Web-Based TRICARE Assistance Program Demonstration Project

    Science.gov (United States)

    2010-03-30

    ... (TRIAP), the other a permanent benefit (Telemental Health). Significant program changes days prior to... extension. SUMMARY: This notice is to advise interested parties of an extension to the Military Health... referenced in the original Federal Register (FR) Notice, 74 FR 3667, dated July 24, 2009. The...

  20. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    Science.gov (United States)

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  1. R2O, an extensible and semantically based database-to-ontology mapping language

    OpenAIRE

    Barrasa Rodríguez, Jesús; Corcho, Oscar; A. GÓMEZ-PÉREZ

    2004-01-01

    We present R2O, an extensible and declarative language to describe mappings between relational DB schemas and ontologies implemented in RDF(S) or OWL. R2O provides an extensible set of primitives with well-defined semantics. The language has been conceived to be expressive enough to cope with complex mapping cases arising from situations of low similarity between the ontology and the DB models.

  2. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  3. A Randomized Controlled Trial Assessing Growth of Infants Fed a 100% Whey Extensively Hydrolyzed Formula Compared With a Casein-Based Extensively Hydrolyzed Formula.

    Science.gov (United States)

    Fields, David; Czerkies, Laura; Sun, Shumei; Storm, Heidi; Saavedra, José; Sorensen, Ricardo

    2016-01-01

    This study compared the growth of healthy infants fed a hypoallergenic 100% whey-based extensively hydrolyzed formula (EHF) with Bifidobacterium lactis (test) with that of infants fed an extensively hydrolyzed casein formula (control). Formula-fed infants (14 ± 3 days old) were randomized to test or control groups until 112 days of age. Anthropometrics were assessed at 14, 28, 56, 84, and 112 days, and daily records were kept for 2 days prior to study visits. Serum albumin and plasma amino acids at 84 days were assessed in a subset. A total of 282 infants were randomized (124 test, 158 control). Significantly more infants dropped out of the control (56%) than the test (41%) group. Mean daily weight gain was significantly higher in the test group compared with the control group (27.95 ± 5.91 vs 25.93 ± 6.12 g/d; P = .027), with the test group reporting significantly fewer stools (2.2 vs 3.6 stools/d). More infants in the control group had more than 3 loose stools/d and a higher incidence of vomiting compared with the test group. There were no differences in gas, mood, sleep, or serum albumin. Plasma arginine and valine were significantly lower in the test group, whereas leucine and lysine were higher; all values were within normal limits. Significantly more adverse events attributed to the study formula were reported in the control group. The 100% whey-based hypoallergenic EHF containing Bifidobacterium lactis and medium chain triglycerides supported growth of healthy infants. Future studies on the application of this formula in clinically indicated populations are warranted. PMID:27336009

  4. Wavelet transformation based watermarking technique for human electrocardiogram (ECG).

    Science.gov (United States)

    Engin, Mehmet; Cidam, Oğuz; Engin, Erkan Zeki

    2005-12-01

    Nowadays, watermarking has become a technology of choice for a broad range of multimedia copyright protection applications. Watermarks have also been used to embed prespecified data in biomedical signals, so that watermarked biomedical signals transmitted through communication channels are resistant to some attacks. This paper investigates a discrete wavelet transform based watermarking technique for signal integrity verification of electrocardiograms (ECG) from four ECG classes, for monitoring applications in cardiovascular disease. The proposed technique is evaluated under different noisy conditions for different wavelet functions. The Daubechies (db2) wavelet-based technique performs better than the Biorthogonal (bior5.5) one. For beat-to-beat applications, performance results for all four ECG classes are moderate. PMID:16235811

  5. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.

    Science.gov (United States)

    Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu

    2016-01-01

    Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of five components. Especially, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study and recognition experiments under different size of training sets were implemented on a target gesture set consisting of 110 frequently-used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third gestures of the target gesture set) suggested by two reference subjects, (82.6 ± 13.2)% and (79.7 ± 13.4)% average recognition accuracy were obtained for 110 words respectively, and the average recognition accuracy climbed up to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system. PMID:27104534
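The code-matching step described above can be sketched in a few lines: each sign word is encoded as a tuple of five component labels, and an unknown gesture is assigned to the code-table entry that agrees on the most components. The component labels, sign words, and code table below are hypothetical stand-ins, not the paper's actual CSL vocabulary:

```python
# Hypothetical code table: each sign word is a 5-tuple of component labels
# (hand shape, axis, orientation, rotation, trajectory).
CODE_TABLE = {
    "hello":  ("open", "x", "palm-out", "none", "arc"),
    "thanks": ("flat", "y", "palm-up",  "none", "line"),
    "help":   ("fist", "x", "palm-in",  "cw",   "circle"),
}

def match_gesture(components, code_table):
    """Return the sign word whose code agrees with the most components."""
    def agreement(code):
        return sum(a == b for a, b in zip(components, code))
    return max(code_table, key=lambda word: agreement(code_table[word]))

# Even if the component classifiers get only 4 of 5 components right,
# code matching still recovers the intended word:
observed = ("open", "x", "palm-out", "none", "line")
print(match_gesture(observed, CODE_TABLE))  # -> hello
```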

  6. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework

    Directory of Open Access Journals (Sweden)

    Shengjing Wei

    2016-04-01

    Full Text Available Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of five components. Especially, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study and recognition experiments under different size of training sets were implemented on a target gesture set consisting of 110 frequently-used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third gestures of the target gesture set) suggested by two reference subjects, (82.6 ± 13.2)% and (79.7 ± 13.4)% average recognition accuracy were obtained for 110 words respectively, and the average recognition accuracy climbed up to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user’s training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system.

  7. A thermopneumatic micropump based on micro-engineering techniques

    NARCIS (Netherlands)

    Pol, van de F.C.M.; Lintel, van H.T.G.; Elwenspoek, M.; Fluitman, J.H.J.

    1990-01-01

    The design, working principle and realization of an electro-thermopneumatic liquid pump based on micro-engineering techniques are described. The pump, which is of the reciprocating displacement type, comprises a pump chamber, a thin silicon pump membrane and two silicon check valves to direct the flow.

  8. "Ayeli": Centering Technique Based on Cherokee Spiritual Traditions.

    Science.gov (United States)

    Garrett, Michael Tlanusta; Garrett, J. T.

    2002-01-01

    Presents a centering technique called "Ayeli," based on Cherokee spiritual traditions as a way of incorporating spirituality into counseling by helping clients identify where they are in their journey, where they want to be, and how they can get there. Relevant Native cultural traditions and meanings are explored. (Contains 25 references.) (GCP)

  9. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Council. The planned...

  10. Cost-optimal power system extension under flow-based market coupling

    Energy Technology Data Exchange (ETDEWEB)

    Hagspiel, Simeon; Jaegemann, Cosima; Lindenberger, Dietmar [Koeln Univ. (Germany). Energiewirtschaftliches Inst.; Brown, Tom; Cherevatskiy, Stanislav; Troester, Eckehard [Energynautics GmbH, Langen (Germany)

    2013-05-15

    Electricity market models, implemented as dynamic programming problems, have been applied widely to identify possible pathways towards a cost-optimal and low carbon electricity system. However, the joint optimization of generation and transmission remains challenging, mainly due to the fact that different characteristics and rules apply to commercial and physical exchanges of electricity in meshed networks. This paper presents a methodology that allows power generation and transmission infrastructures to be optimized jointly through an iterative approach based on power transfer distribution factors (PTDFs). As PTDFs are linear representations of the physical load flow equations, they can be implemented in a linear programming environment suitable for large scale problems. The algorithm iteratively updates PTDFs when grid infrastructures are modified due to cost-optimal extension and thus yields an optimal solution with a consistent representation of physical load flows. The method is first demonstrated on a simplified three-node model where it is found to be robust and convergent. It is then applied to the European power system in order to find its cost-optimal development under the prescription of strongly decreasing CO2 emissions until 2050.
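As a rough illustration of the PTDF representation this approach builds on (not the paper's full iterative generation-and-grid optimization), the sketch below computes the PTDF matrix of a hypothetical three-node network from a DC power flow, assuming unit line susceptances and bus 0 as slack:

```python
def invert(M):
    """Invert a small dense matrix by Gauss-Jordan elimination."""
    n = len(M)
    A = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(M)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))  # partial pivoting
        A[c], A[p] = A[p], A[c]
        piv = A[c][c]
        A[c] = [v / piv for v in A[c]]
        for r in range(n):
            if r != c:
                f = A[r][c]
                A[r] = [v - f * w for v, w in zip(A[r], A[c])]
    return [row[n:] for row in A]

def ptdf(n_buses, lines, slack=0):
    """PTDF[l][k]: flow on line l per unit injected at bus k, withdrawn at slack.

    lines is a list of (from_bus, to_bus, susceptance); DC load flow assumed.
    """
    others = [b for b in range(n_buses) if b != slack]
    idx = {b: i for i, b in enumerate(others)}
    n = len(others)
    B = [[0.0] * n for _ in range(n)]  # reduced bus susceptance matrix
    for i, j, b in lines:
        for u, v in ((i, j), (j, i)):
            if u != slack:
                B[idx[u]][idx[u]] += b
                if v != slack:
                    B[idx[u]][idx[v]] -= b
    X = invert(B)  # bus reactance matrix (slack angle fixed at 0)
    table = []
    for i, j, b in lines:
        row = []
        for k in range(n_buses):
            if k == slack:
                row.append(0.0)
            else:
                ti = X[idx[i]][idx[k]] if i != slack else 0.0
                tj = X[idx[j]][idx[k]] if j != slack else 0.0
                row.append(b * (ti - tj))  # signed flow from bus i to bus j
        table.append(row)
    return table

# Triangle network: injecting 1 unit at bus 1 sends 2/3 directly to the
# slack over line (0,1) and 1/3 around via bus 2.
T = ptdf(3, [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0)])
```

Inside the paper's iterative scheme, such a table would be recomputed whenever a grid extension changes the susceptances, then fed back into the linear program.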

  11. Extension of the COSYMA-ECONOMICS module - cost calculations based on different economic sectors

    International Nuclear Information System (INIS)

    The COSYMA program system for evaluating the off-site consequences of accidental releases of radioactive material to the atmosphere includes an ECONOMICS module for assessing economic consequences. The aim of this module is to convert various consequences (radiation-induced health effects and impacts resulting from countermeasures) caused by an accident into the common framework of economic costs; this allows different effects to be expressed in the same terms and thus to make these effects comparable. With respect to the countermeasure 'movement of people', the dominant cost categories are 'loss-of-income costs' and 'costs of lost capital services'. In the original version of the ECONOMICS module these costs are calculated on the basis of the total number of people moved. In order to take into account also regional or local economic peculiarities of a nuclear site, the ECONOMICS module has been extended: Calculation of the above mentioned cost categories is now based on the number of employees in different economic sectors in the affected area. This extension of the COSYMA ECONOMICS module is described in more detail. (orig.)

  12. The Real-Time Image Processing Technique Based on DSP

    Institute of Scientific and Technical Information of China (English)

    QI Chang; CHEN Yue-hua; HUANG Tian-shu

    2005-01-01

    This paper proposes a novel real-time image processing technique based on a digital signal processor (DSP). For the wavelet transform (WT) algorithm, the technique uses the second-generation lifting-scheme WT, which has low computational complexity for 2-D image data. Since the lifting-scheme WT processes 1-D data markedly better than 2-D data, this paper proposes a reformed processing method: transform the 2-D image data into a 1-D data sequence by a linearization method, then process the 1-D sequence with the lifting-scheme WT. The method changes the image convolution mode, which is based on the cross filtering of rows and columns. For the hardware realization, the technique optimizes the program structure of the DSP to exploit its processing power together with its on-chip memory. The experimental results show that the proposed real-time image processing technique meets the real-time requirement of video-image transmission in the video surveillance systems of the electric power industry, making it a feasible and efficient DSP solution.
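The lifting scheme mentioned above can be illustrated with its simplest instance, the Haar wavelet, applied to a 1-D sequence. This is a generic sketch of the split/predict/update steps, not the paper's exact filter or its linearization of the image:

```python
def haar_lift(x):
    """One level of the lifting-scheme Haar transform; len(x) must be even."""
    even, odd = x[0::2], x[1::2]                         # split
    detail = [o - e for o, e in zip(odd, even)]          # predict odd from even
    approx = [e + d / 2 for e, d in zip(even, detail)]   # update: preserve mean
    return approx, detail

def haar_unlift(approx, detail):
    """Invert the lifting steps exactly (perfect reconstruction)."""
    even = [s - d / 2 for s, d in zip(approx, detail)]
    odd = [e + d for e, d in zip(even, detail)]
    out = []
    for e, o in zip(even, odd):
        out += [e, o]
    return out

x = [2.0, 4.0, 6.0, 8.0]
a, d = haar_lift(x)            # a == [3.0, 7.0] (averages), d == [2.0, 2.0]
assert haar_unlift(a, d) == x  # lifting is trivially invertible
```

Because every lifting step is invertible by subtracting what was added, reconstruction is exact regardless of the predict/update filters chosen, which is one reason the scheme is attractive on fixed-point DSPs.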

  13. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    Science.gov (United States)

    Ibrahim, Mohamed M.; Abdel Kader, Neamat S.; Zorkany, M.

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In the nonblind watermarking systems, the need of the original host file in the watermark recovery operation makes an overhead over the system resources, doubles memory capacity, and doubles communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570

  14. Video multiple watermarking technique based on image interlacing using DWT.

    Science.gov (United States)

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In the nonblind watermarking systems, the need of the original host file in the watermark recovery operation makes an overhead over the system resources, doubles memory capacity, and doubles communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
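The Arnold transform used in such schemes for watermark encryption/decryption is easy to sketch. The map below is the standard Arnold cat map on an N×N image, shown with a hypothetical 4×4 watermark rather than data from the paper:

```python
def arnold(img):
    """One iteration of the Arnold cat map on a square image (list of rows)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            # (x, y) -> (x + y, x + 2y) mod n; the matrix [[1,1],[1,2]]
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out

def arnold_inverse(img):
    """Exact inverse, using the inverse matrix [[2,-1],[-1,1]] mod n."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(2 * x - y) % n][(y - x) % n] = img[x][y]
    return out

wm = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
scrambled = arnold(wm)
assert scrambled != wm                    # pixels are shuffled
assert arnold_inverse(scrambled) == wm    # decryption recovers the watermark
```

Because the map has determinant 1, repeated application is periodic; watermarking schemes typically use the iteration count as a key.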

  15. CDAPubMed: a browser extension to retrieve EHR-based biomedical literature

    Directory of Open Access Journals (Sweden)

    Perez-Rey David

    2012-04-01

    Full Text Available Abstract Background Over the last few decades, the ever-increasing output of scientific publications has led to new challenges to keep up to date with the literature. In the biomedical area, this growth has introduced new requirements for professionals, e.g., physicians, who have to locate the exact papers that they need for their clinical and research work amongst a huge number of publications. Against this backdrop, novel information retrieval methods are even more necessary. While web search engines are widespread in many areas, facilitating access to all kinds of information, additional tools are required to automatically link information retrieved from these engines to specific biomedical applications. In the case of clinical environments, this also means considering aspects such as patient data security and confidentiality or structured contents, e.g., electronic health records (EHRs). In this scenario, we have developed a new tool to facilitate query building to retrieve scientific literature related to EHRs. Results We have developed CDAPubMed, an open-source web browser extension to integrate EHR features in biomedical literature retrieval approaches. Clinical users can use CDAPubMed to: (i) load patient clinical documents, i.e., EHRs based on the Health Level 7-Clinical Document Architecture Standard (HL7-CDA), (ii) identify relevant terms for scientific literature search in these documents, i.e., Medical Subject Headings (MeSH), automatically driven by the CDAPubMed configuration, which advanced users can optimize to adapt to each specific situation, and (iii) generate and launch literature search queries to a major search engine, i.e., PubMed, to retrieve citations related to the EHR under examination. Conclusions CDAPubMed is a platform-independent tool designed to facilitate literature searching using keywords contained in specific EHRs. CDAPubMed is visually integrated, as an extension of a widespread web browser, within the standard

  16. GIS-based assessment of groundwater level on extensive karst areas

    Science.gov (United States)

    Kopecskó, Zsanett; Józsa, Edina

    2016-04-01

    Karst topographies represent unique geographical regions containing caves and extensive underground water systems developed especially on soluble rocks such as limestone, marble and gypsum. The significance of these areas is evident considering that 12% of the ice-free continental area consists of landscapes developed on carbonate rocks and 20-25% of the global population depends mostly on groundwater obtained from these systems. Karst water reservoirs already provide 25% of global freshwater resources. Comprehensive studies of these regions are the key to exploring opportunities for exploitation and to analyzing the consequences of contamination, anthropogenic effects and natural processes within these specific hydro-geological settings. For the proposed work we chose several of the largest karst regions over the ice-free part of the continents, representing diverse climatic and topographic characteristics. An important aspect of the study is that there are no in situ hydrologic measurements available over the entire research area that would provide discrete sampling of soil, ground and surface water. As a replacement for detailed surveys, multiple remote sensing datasets (Gravity Recovery and Climate Experiment (GRACE) derivative products, Moderate Resolution Imaging Spectroradiometer (MODIS) products and Tropical Rainfall Measuring Mission (TRMM) monthly rainfall datasets) are used along with model reanalysis data (Global Precipitation Climate Center (GPCC) data and the Global Land Data Assimilation System (GLDAS)) to study the variation on extensive karst areas in response to the changing climate and anthropogenic effects. The analyses are carried out within an open source software environment to enable sharing of the proposed algorithm.
The GRASS GIS geoinformatic software and the R statistical program proved to be adequate choice to collect and analyze the above mentioned datasets by taking advantage of their interoperability

  17. Knowledge based systems advanced concepts, techniques and applications

    CERN Document Server

    1997-01-01

    The field of knowledge-based systems (KBS) has expanded enormously during the last years, and many important techniques and tools are currently available. Applications of KBS range from medicine to engineering and aerospace.This book provides a selected set of state-of-the-art contributions that present advanced techniques, tools and applications. These contributions have been prepared by a group of eminent researchers and professionals in the field.The theoretical topics covered include: knowledge acquisition, machine learning, genetic algorithms, knowledge management and processing under unc

  18. Proposing a Wiki-Based Technique for Collaborative Essay Writing

    Directory of Open Access Journals (Sweden)

    Mabel Ortiz Navarrete

    2014-10-01

    Full Text Available This paper proposes a technique for students learning English as a foreign language to write an argumentative essay collaboratively in a wiki environment. Both the wiki environment and collaborative work play an important role in the academic writing task; nevertheless, an appropriate and systematic work assignment is required in order to make use of both. The proposed technique for writing a collaborative essay mainly attempts to provide the most effective way to enhance equal participation among group members, taking computer-mediated collaboration as its base. Within this context, the students’ role is clearly defined and individual and collaborative tasks are explained.

  19. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, and detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions into the operation of the grid-connected power converters. This paper describes a quasi-passive method for estimating the line impedance of the distribution electricity network. The method uses the model based identification technique to obtain the resistive and inductive parts of the line impedance.

  20. An Observed Voting System Based On Biometric Technique

    Directory of Open Access Journals (Sweden)

    B. Devikiruba

    2015-08-01

    Full Text Available ABSTRACT This article describes a computational framework, which can run on almost every computer connected to an IP-based network, for studying biometric techniques. The paper discusses how a system protecting confidential information puts strong security demands on identification. Biometry provides a user-friendly method for this identification and is becoming a competitor to current identification mechanisms. The experimentation section focuses on biometric verification based specifically on fingerprints. This article should be read as a warning to those thinking of using methods of identification without first examining the technical opportunities for compromising the mechanisms and the associated legal consequences. The development is based on the Java language, which easily supports software packages useful for testing new control techniques.

  1. A Survey on Statistical Based Single Channel Speech Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Sunnydayal. V

    2014-11-01

    Full Text Available Speech enhancement is a long-standing problem with various applications such as hearing aids, automatic recognition, and coding of speech signals. Single-channel speech enhancement techniques are used to enhance speech degraded by additive background noise. Background noise can have an adverse impact on our ability to converse without hindrance in very noisy environments, such as busy streets, a car, or the cockpit of an airplane, and can degrade the quality and intelligibility of speech. The objective of this survey is to provide an overview of speech enhancement algorithms that enhance a noisy speech signal corrupted by additive noise. The algorithms are mainly based on statistical approaches, and different estimators are compared. Challenges and opportunities of speech enhancement are also discussed. This paper helps in choosing the best statistical technique for speech enhancement.
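A minimal example of one classic statistical approach in this family, magnitude spectral subtraction, might look as follows. The magnitude arrays and the over-subtraction/floor parameters below are illustrative; in practice the magnitudes would come from an STFT of the noisy signal and the noise estimate from speech-free frames:

```python
def spectral_subtract(noisy_mag, noise_mag, alpha=1.0, beta=0.02):
    """Subtract a noise-magnitude estimate from each frequency bin.

    alpha: over-subtraction factor; beta: spectral-floor fraction that
    prevents negative magnitudes (a common source of musical noise).
    """
    return [max(m - alpha * n, beta * m)
            for m, n in zip(noisy_mag, noise_mag)]

noisy = [1.0, 0.5, 0.2, 0.05]   # magnitudes of one noisy frame (toy values)
noise = [0.1, 0.1, 0.1, 0.1]    # running estimate of the noise magnitude
enhanced = spectral_subtract(noisy, noise)
# Bins well above the noise keep most of their energy; the last bin,
# which falls below the noise estimate, is clamped to the spectral floor.
```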

  2. Characteristic Modules of Dual Extensions and Gröbner Bases

    Institute of Scientific and Technical Information of China (English)

    Yun Ge XU; Long Cai LI

    2004-01-01

    Let C be a finite dimensional directed algebra over an algebraically closed field k and A = A(C) the dual extension of C. The characteristic modules of A are constructed explicitly for a class of directed algebras, which generalizes the results of Xi. Furthermore, it is shown that the characteristic modules of dual extensions of a certain class of directed algebras admit the left Gröbner basis theory in the sense of E. L. Green.

  3. Multivariate discrimination technique based on the Bayesian theory

    Institute of Scientific and Technical Information of China (English)

    JIN Ping; PAN Chang-zhou; XIAO Wei-guo

    2007-01-01

    A multivariate discrimination technique was established based on the Bayesian theory. Using this technique, P/S ratios of different types (e.g., Pn/Sn, Pn/Lg, Pg/Sn or Pg/Lg) measured within different frequency bands and from different stations were combined together to discriminate seismic events in Central Asia. Major advantages of the Bayesian approach are that the probability to be an explosion for any unknown event can be directly calculated given the measurements of a group of discriminants, and at the same time correlations among these discriminants can be fully taken into account. It was proved theoretically that the Bayesian technique would be optimal and its discriminating performance would be better than that of any individual discriminant as well as better than that yielded by the linear combination approach ignoring correlations among discriminants. This conclusion was also validated in this paper by applying the Bayesian approach to the above-mentioned observed data.
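The Bayesian combination of correlated discriminants can be sketched for two P/S-ratio measurements: model each class as a multivariate Gaussian with a full covariance matrix and compute the posterior probability of "explosion". The class means, covariances, and prior below are hypothetical, not the paper's fitted values:

```python
from math import exp, pi, sqrt

def gauss2(x, mean, cov):
    """2-D Gaussian density with full (possibly correlated) covariance."""
    a, b = x[0] - mean[0], x[1] - mean[1]
    (c11, c12), (_, c22) = cov
    det = c11 * c22 - c12 * c12
    q = (c22 * a * a - 2 * c12 * a * b + c11 * b * b) / det  # Mahalanobis^2
    return exp(-0.5 * q) / (2 * pi * sqrt(det))

def p_explosion(x, ex, eq, prior_ex=0.5):
    """Posterior P(explosion | x) by Bayes' rule over the two class models."""
    num = prior_ex * gauss2(x, *ex)
    den = num + (1 - prior_ex) * gauss2(x, *eq)
    return num / den

# Hypothetical class models: (mean, covariance) of two log(P/S) ratios,
# e.g. Pn/Lg measured in two frequency bands. Explosions have higher P/S.
explosion = ((0.8, 0.6), [[0.04, 0.02], [0.02, 0.05]])
earthquake = ((0.1, 0.0), [[0.05, 0.03], [0.03, 0.06]])
print(round(p_explosion((0.7, 0.5), explosion, earthquake), 3))
```

The off-diagonal covariance terms are what let the posterior account for correlation between the discriminants, which a simple product of 1-D likelihoods would ignore.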

  4. RANKING THE REFACTORING TECHNIQUES BASED ON THE INTERNAL QUALITY ATTRIBUTES

    Directory of Open Access Journals (Sweden)

    Sultan Alshehri

    2014-01-01

    Full Text Available The analytic hierarchy process (AHP) has been applied in many fields, especially to complex engineering problems and applications. The AHP is capable of structuring decision problems and finding mathematically determined judgments built on knowledge and experience. This suggests that AHP should prove useful in agile software development, where complex decisions occur routinely. In this paper, the AHP is used to rank refactoring techniques based on internal code quality attributes. XP encourages applying refactoring where the code smells bad; however, refactoring may consume considerable time and effort. To maximize the benefits of refactoring in less time and with less effort, AHP has been applied to this end. It was found that ranking the refactoring techniques helped the XP team focus on the techniques that improve the code and the XP development process in general.
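A minimal AHP priority computation, using the geometric-mean approximation to the principal eigenvector, might look like this. The pairwise judgments and the three refactoring techniques named are hypothetical examples, not the paper's data:

```python
def prod(xs):
    r = 1.0
    for x in xs:
        r *= x
    return r

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    via the row geometric-mean method (approximates the eigenvector)."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Rows/cols: Extract Method, Rename, Move Method (hypothetical judgments
# on Saaty's 1-9 scale; m[i][j] = how strongly i is preferred over j
# with respect to one internal quality attribute).
m = [[1.0,     3.0, 5.0],
     [1 / 3.0, 1.0, 2.0],
     [1 / 5.0, 0.5, 1.0]]
w = ahp_weights(m)
assert abs(sum(w) - 1.0) < 1e-9
assert w[0] > w[1] > w[2]  # Extract Method ranks highest here
```

In a full AHP study the same computation is repeated per attribute and the per-attribute weights are combined, with a consistency check on each judgment matrix.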

  5. An Evaluation of the Relationship between Supervisory Techniques and Organizational Outcomes among the Supervisors in the Agricultural Extension Service in the Eastern Region Districts of Uganda. Summary of Research 81.

    Science.gov (United States)

    Padde, Paul; And Others

    A descriptive study examined the relationship between supervisory techniques and organizational outcomes among supervisors in the agricultural extension service in eight districts in eastern Uganda. Self-rating and rater forms of the Multifactor Leadership Questionnaire were sent to 220 extension agents, 8 field supervisors, and 8 deputy field…

  6. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    Science.gov (United States)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian motion is described by P^2 = a^3, where P is the orbital period in years and a is the semi-major axis in astronomical units (AU). Keplerian motion holds on small scales such as the size of the Solar System but not on large scales such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze the Keplerian motion of systems as a function of stellar age, to determine whether it has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods, such as probabilistic, linear, and proximity-based models. In probabilistic and statistical models, the parameters of a closed-form probability distribution are learned in order to detect the outliers. Linear models use regression-analysis-based techniques for detecting outliers. Proximity-based models use distance-based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density-based algorithms such as kernel density estimation. In this work, we use unsupervised learning algorithms with only the proximity-based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is that the ratio of planetary mass to stellar mass is less than 0.001. We present our statistical analysis of the outliers thus detected.
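One of the proximity-based models mentioned, scoring each point by its distance to its k-th nearest neighbour, can be sketched as follows; the toy 2-D points below merely stand in for (log period, log mass) pairs:

```python
def knn_outlier_scores(points, k=2):
    """k-NN distance outlier score: distance from each point to its
    k-th nearest neighbour (larger score = more isolated = more outlying)."""
    scores = []
    for i, p in enumerate(points):
        dists = sorted(
            sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
            for j, q in enumerate(points) if j != i
        )
        scores.append(dists[k - 1])
    return scores

# Four points in a tight cluster plus one isolated point.
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (8, 8)]
scores = knn_outlier_scores(pts, k=2)
outlier = max(range(len(pts)), key=scores.__getitem__)
assert outlier == 4  # (8, 8) gets by far the largest k-NN distance
```

This is an O(n^2) brute-force version; with many exoplanets one would use a spatial index, but the scoring rule is the same.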

  7. Fabrication of thermoplastics chips through lamination based techniques.

    Science.gov (United States)

    Miserere, Sandrine; Mottet, Guillaume; Taniga, Velan; Descroix, Stephanie; Viovy, Jean-Louis; Malaquin, Laurent

    2012-04-24

    In this work, we propose a novel strategy for the fabrication of flexible thermoplastic microdevices entirely based on lamination processes. The same low-cost laminator apparatus can be used from master fabrication to microchannel sealing. This process is appropriate for rapid prototyping at laboratory scale, but it can also be easily upscaled to industrial manufacturing. For demonstration, we used Cycloolefin Copolymer (COC), a thermoplastic that is extensively used for microfluidic applications. COC has good chemical resistance to common chemicals used in microfluidics, such as acids, bases and most polar solvents, and its optical quality and mechanical resistance make it suitable for a large range of applications in chemistry or biology. As an example, the electrokinetic separation of pollutants is proposed in the present study.

  8. An Efficient Image Compression Technique Based on Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Prof. Rajendra Kumar Patel

    2012-12-01

    Full Text Available The rapid growth of digital imaging applications, including desktop publishing, multimedia, teleconferencing, and high-definition displays, has increased the need for effective and standardized image compression techniques. Digital images play a very important role in conveying detailed information, but the key obstacle for many applications is the vast amount of data required to represent a digital image directly. Digitizing images at the quality needed for clear and accurate information demands more storage space and better storage and access mechanisms in hardware or software. In this paper we concentrate on this problem, reducing storage requirements while preserving image quality. State-of-the-art techniques can compress typical images from 1/10 to 1/50 of their uncompressed size without visibly affecting image quality. From our study we observe the need for an image compression technique that provides better reduction in terms of both storage and quality. Arithmetic coding is the best way to reduce the encoded data, so in this paper we propose an arithmetic coding with Walsh transformation based image compression technique, which is an efficient way of achieving such reduction.
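The interval-narrowing idea behind arithmetic coding can be sketched with exact fractions. This is a didactic encoder/decoder operating on a toy symbol alphabet, not the paper's Walsh-transform pipeline (a production coder would use scaled integer arithmetic instead of Fractions):

```python
from fractions import Fraction

def intervals(probs):
    """Map each symbol to its cumulative-probability interval [lo, hi)."""
    lo, out = Fraction(0), {}
    for s, p in probs.items():
        out[s] = (lo, lo + p)
        lo += p
    return out

def encode(msg, probs):
    """Narrow [0, 1) once per symbol; any number in the final interval
    encodes the whole message."""
    cum = intervals(probs)
    lo, hi = Fraction(0), Fraction(1)
    for s in msg:
        a, b = cum[s]
        lo, hi = lo + (hi - lo) * a, lo + (hi - lo) * b
    return lo  # lowest point of the final interval

def decode(code, n, probs):
    cum = intervals(probs)
    out = []
    for _ in range(n):
        for s, (a, b) in cum.items():
            if a <= code < b:
                out.append(s)
                code = (code - a) / (b - a)  # rescale for the next symbol
                break
    return "".join(out)

probs = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 4)}
code = encode("abca", probs)
assert decode(code, 4, probs) == "abca"  # lossless round trip
```

More probable symbols shrink the interval less, so they cost fewer bits, which is how arithmetic coding approaches the entropy of the source.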

  9. Finding Within Cluster Dense Regions Using Distance Based Technique

    Directory of Open Access Journals (Sweden)

    Wesam Ashour

    2012-03-01

    Full Text Available One of the main categories in data clustering is density-based clustering. Density-based clustering techniques like DBSCAN are attractive because they can find arbitrarily shaped clusters along with noisy outliers. The main weakness of traditional density-based algorithms like DBSCAN is clustering data sets with different density levels: DBSCAN applies calculations according to given parameters to all points in a data set, while the densities of the data set's clusters may be totally different. The proposed algorithm overcomes this weakness. It starts by partitioning the data within a cluster into units based on a user parameter and computes the density for each unit separately. The algorithm then compares the results and merges neighboring units with close approximate density values into a new cluster. The experimental results of the simulation show that the proposed algorithm gives good results in finding clusters for data sets with different cluster densities.
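For reference, the traditional DBSCAN baseline whose single global (eps, min_pts) setting causes the multi-density weakness discussed above can be sketched as:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: one (eps, min_pts) pair applied to every point."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    # Neighbourhoods include the point itself, as in the original algorithm.
    neighbours = [[j for j, q in enumerate(points) if dist(p, q) <= eps]
                  for p in points]
    labels = [None] * len(points)  # None = unvisited, -1 = noise
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        if len(neighbours[i]) < min_pts:
            labels[i] = -1  # noise (may later be relabelled as a border point)
            continue
        labels[i] = cluster
        seeds = list(neighbours[i])
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reachable from a core point: border
            if labels[j] is not None:
                continue
            labels[j] = cluster
            if len(neighbours[j]) >= min_pts:
                seeds.extend(neighbours[j])  # j is a core point: keep expanding
        cluster += 1
    return labels

# Two tight clusters and one isolated noise point.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (5, 5)]
assert dbscan(pts, eps=1.5, min_pts=3) == [0, 0, 0, 1, 1, 1, -1]
```

Note that eps and min_pts are global: a value that resolves the dense cluster can dissolve a sparse one into noise, which is precisely the behaviour the proposed per-unit densities address.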

  10. Vibration based fault detection techniques for mechanical structures

    International Nuclear Information System (INIS)

    Fault detection techniques for mechanical structures and their application have become more important in recent years in the field of structural health monitoring. The intention of this paper is to present available state-of-the-art methods that could be implemented in mechanical structures. Global methods that contribute to the detection, isolation and analysis of faults from changes in the vibration characteristics of the structure are presented. The techniques are based on the idea that the modal frequencies, mode shapes and modal damping of the structure can be determined as functions of its physical properties. If a fault appears in a mechanical structure, it can therefore be recognized as a change in the physical properties, which in turn causes changes in the modal properties of the structure. (Author)
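The underlying relationship is easy to illustrate for a single-degree-of-freedom system, where the natural frequency is f = (1/2π)·√(k/m); a stiffness loss (the hypothetical fault below) shifts the modal frequency by a measurable amount:

```python
import math

def natural_frequency(k, m):
    """Undamped natural frequency (Hz) of a single-degree-of-freedom system."""
    return math.sqrt(k / m) / (2 * math.pi)

m = 10.0                                           # mass in kg
f_healthy = natural_frequency(2.0e6, m)            # intact stiffness, N/m
f_damaged = natural_frequency(2.0e6 * 0.81, m)     # hypothetical 19% stiffness loss
shift = (f_healthy - f_damaged) / f_healthy        # relative modal frequency drop
```

A 19% stiffness loss produces a 10% frequency drop (since √0.81 = 0.9), which is the kind of modal change vibration-based monitoring looks for.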

  11. A thermopneumatic micropump based on micro-engineering techniques

    OpenAIRE

    Pol, van der, P.; Lintel, van, H.T.G.; Elwenspoek, M; Fluitman, J.H.J.

    1990-01-01

    The design, working principle and realization of an electro-thermopneumatic liquid pump based on micro-engineering techniques are described. The pump, which is of the reciprocating displacement type, comprises a pump chamber, a thin silicon pump membrane and two silicon check valves to direct the flow. The dynamic pressure of an amount of gas contained in a cavity, controlled by resistive heating, actuates the pump membrane. The cavity, chambers, channels and valves are realized in silicon wa...

  12. Quantum state tomography of orbital angular momentum photonic qubits via a projection-based technique

    CERN Document Server

    Nicolas, Adrien; Giacobino, Elisabeth; Maxein, Dominik; Laurat, Julien

    2014-01-01

    While measuring the orbital angular momentum state of bright light beams can be performed using imaging techniques, a full characterization at the single-photon level is challenging. For applications to quantum optics and quantum information science, such characterization is an essential capability. Here, we present a setup to perform the quantum state tomography of photonic qubits encoded in this degree of freedom. The method is based on a projective technique using spatial mode projection via fork holograms and single-mode fibers inserted into an interferometer. The alignment and calibration of the device is detailed as well as the measurement sequence to reconstruct the associated density matrix. Possible extensions to higher-dimensional spaces are discussed.
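For a single qubit (in any encoding, OAM included), linear-inversion tomography reconstructs the density matrix from three measured Pauli expectation values as ρ = (I + ⟨σx⟩σx + ⟨σy⟩σy + ⟨σz⟩σz)/2. A sketch with ideal measurement outcomes; this illustrates only the generic reconstruction step, not the paper's fork-hologram projection setup:

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct(sx, sy, sz):
    """Linear-inversion tomography: density matrix from Pauli expectation values."""
    return 0.5 * (I2 + sx * X + sy * Y + sz * Z)

# Ideal expectation values for the superposition state (|0> + |1>)/sqrt(2)
rho = reconstruct(1.0, 0.0, 0.0)
```

With noisy data the linear inversion can yield a non-physical matrix, which is why maximum-likelihood reconstruction is normally used in practice.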

  13. NEW VERSATILE CAMERA CALIBRATION TECHNIQUE BASED ON LINEAR RECTIFICATION

    Institute of Scientific and Technical Information of China (English)

    Pan Feng; Wang Xuanyin

    2004-01-01

    A new versatile camera calibration technique for machine vision using off-the-shelf cameras is described. To address the large distortion of off-the-shelf cameras, a new camera distortion rectification technology based on line-rectification is proposed. A full-camera-distortion model is introduced and a linear algorithm is provided to obtain the solution. After the camera rectification, the intrinsic and extrinsic parameters are obtained based on the relationship between the homography and the absolute conic. This technology needs neither a high-accuracy three-dimensional calibration block, nor a complicated translation or rotation platform. Both simulations and experiments show that this method is effective and robust.

  14. An image morphing technique based on optimal mass preserving mapping.

    Science.gov (United States)

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2007-06-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L² mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  15. Comparing Four Touch-Based Interaction Techniques for an Image-Based Audience Response System

    NARCIS (Netherlands)

    Jorritsma, Wiard; Prins, Jonatan T.; van Ooijen, Peter M. A.

    2015-01-01

    This study aimed to determine the most appropriate touch-based interaction technique for I2Vote, an image-based audience response system for radiology education in which users need to accurately mark a target on a medical image. Four plausible techniques were identified: land-on, take-off, zoom-poin

  16. Establishment of safety verification method for life extension based on periodic safety review

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Soong Pyung; Yeom, Yu Son; Yoon, In Sik; Lee, Jeo Young [Chosun Univ., Gwangju (Korea, Republic of)

    2004-02-15

    Safe management of the operating lifetimes of Nuclear Power Plants is a subject of prime interest. As the design life of the Nuclear Power Plant will end in 2008, an appropriate procedure for design life re-assessment or lifetime extension is necessary in Korea. Therefore, the objective of this work is to develop procedural requirements which can be applied to the regulation of lifetime management or life extension of Nuclear Power Plants in Korea. A review of the linkage between the PSR and the extension of the operating lifetime of Nuclear Power Plants was performed to enhance the utilization of PSR results, together with an analysis of the insufficiencies in the licensing rules in Korea.

  17. DEVA: An extensible ontology-based annotation model for visual document collections

    Science.gov (United States)

    Jelmini, Carlo; Marchand-Maillet, Stephane

    2003-01-01

    The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, an early prototype that validates the DEVA features. Magritte allows users to manually annotate image collections. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.

  18. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    Science.gov (United States)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face limitations when using the available online services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and the services search only by address matching based on descriptive data. There are also limitations in how search results are displayed. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes integrating fuzzy techniques with the geocoding process to resolve these limitations. To implement the proposed method, a web-based system was designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created; these fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides capabilities such as searching multi-part addresses, searching places based on their location, non-point representation of results, and displaying search results based on their priority.
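The nearness modelling can be sketched with a simple membership function and a fuzzy overlay, where min acts as the fuzzy AND. The distance breakpoints below are hypothetical, chosen only for illustration:

```python
def near(distance, full=200.0, zero=1000.0):
    """Trapezoidal 'near' membership: 1 up to `full` metres, fading to 0 at `zero`."""
    if distance <= full:
        return 1.0
    if distance >= zero:
        return 0.0
    return (zero - distance) / (zero - full)

# Fuzzy overlay: a candidate location must be near BOTH landmarks (fuzzy AND = min)
score = min(near(150), near(600))
```

Candidates can then be ranked by `score`, giving the priority-ordered, non-crisp results the record describes.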

  19. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    Science.gov (United States)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

    With an upsurge in financial accounting fraud in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research and industry. The failure of organizations' internal auditing systems to identify accounting fraud has led to the use of specialized detection procedures, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since the large volumes and complexity of financial data pose big challenges for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques for the detection of financial accounting fraud and proposes a framework for data mining based accounting fraud detection. This systematic and comprehensive literature review of the applicable data mining techniques may provide a foundation for future research in this field. The findings of the review show that data mining techniques like logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.
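Of the techniques the review names, the logistic model is the simplest to sketch: a transaction's features are combined linearly and squashed through a sigmoid to give a fraud probability. The features, weights and bias below are entirely hypothetical:

```python
import math

def fraud_score(features, weights, bias):
    """Logistic-model score in [0, 1]; higher means more suspicious."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: (amount z-score, round-number flag, off-hours flag)
score = fraud_score((2.1, 1.0, 1.0), weights=(0.8, 0.6, 0.5), bias=-2.0)
baseline = fraud_score((0.0, 0.0, 0.0), weights=(0.8, 0.6, 0.5), bias=-2.0)
```

In practice the weights are fitted to labelled historical transactions; a threshold on the score then flags cases for forensic review.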

  20. Extensive analysis of potentialities and limitations of a maximum cross-correlation technique for surface circulation by using realistic ocean model simulations

    Science.gov (United States)

    Doronzo, Bartolomeo; Taddei, Stefano; Brandini, Carlo; Fattorini, Maria

    2015-08-01

    As shown in the literature, ocean surface circulation can be estimated from sequential satellite imagery by using the maximum cross-correlation (MCC) technique. This approach is very promising, since it offers the potential to acquire synoptic-scale coverage of the surface currents on a quasi-continuous temporal basis. However, MCC also has many limitations, due, for example, to cloud cover or to the assumption that Sea Surface Temperature (SST) and other surface parameters from satellite imagery behave as conservative passive tracers. Also, since MCC can detect only advective flows, it might not work properly in shallow water, where local heating and cooling, upwelling and other small-scale processes have a strong influence. Another limitation of the MCC technique is its inability to detect currents moving along surface temperature fronts. The accuracy and reliability of MCC can be analysed by comparing the estimated velocities with those measured by in situ instrumentation, but the low number of experimental measurements does not allow a systematic statistical study of the potential and limitations of the method. Instead, an extensive analysis of these features can be done by applying MCC to synthetic imagery obtained from a realistic numerical ocean model that takes into account most physical phenomena. In this paper a multi-window (MW-) MCC technique is proposed, and its application to synthetic imagery obtained by a regional high-resolution implementation of the Regional Ocean Modeling System (ROMS) is discussed. An application of the MW-MCC algorithm to a real case and a comparison with experimental measurements are then shown.
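The core MCC step can be sketched as follows: a template window from the first image is compared against shifted windows in the second image, and the shift maximising the normalised cross-correlation is taken as the displacement (dividing by the time between images then gives a velocity). A toy sketch on synthetic data, not the paper's multi-window variant:

```python
import numpy as np

def mcc_shift(frame0, frame1, y, x, size, search):
    """Return the (dy, dx) displacement of the size x size window at (y, x) in
    frame0 that maximises normalised cross-correlation within frame1,
    scanning offsets in [-search, +search]."""
    win0 = frame0[y:y + size, x:x + size]
    a = win0 - win0.mean()
    best, best_shift = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            if y + dy < 0 or x + dx < 0:
                continue                      # window would leave the image
            sub = frame1[y + dy:y + dy + size, x + dx:x + dx + size]
            if sub.shape != win0.shape:
                continue
            b = sub - sub.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            r = float((a * b).sum() / denom) if denom > 0 else 0.0
            if r > best:
                best, best_shift = r, (dy, dx)
    return best_shift

rng = np.random.default_rng(0)
frame0 = rng.random((12, 12))                  # first synthetic "SST image"
frame1 = np.roll(frame0, (2, 1), axis=(0, 1))  # same field advected 2 px down, 1 px right
shift = mcc_shift(frame0, frame1, y=3, x=3, size=4, search=3)
# velocity = shift * pixel_size / time_between_images
```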

  1. Eat, Grow, Lead 4-H: An Innovative Approach to Deliver Campus- Based Field Experiences to Pre-Entry Extension Educators

    Science.gov (United States)

    Weeks, Penny Pennington; Weeks, William G.

    2012-01-01

    Eat, Grow, Lead 4-H Club was created as a pilot program for college students seeking to gain experience as non-formal youth educators, specifically serving pre-entry level Extension educators through a university-based 4-H club. Seventeen student volunteers contributed an estimated 630 hours of service to the club during spring 2011. The club…

  2. Designing a Competency-Based New County Extension Personnel Training Program: A Novel Approach

    Science.gov (United States)

    Brodeur, Cheri Winton; Higgins, Cynthia; Galindo-Gonzalez, Sebastian; Craig, Diane D.; Haile, Tyann

    2011-01-01

    Voluntary county personnel turnover occurs for a multitude of reasons, including the lack of job satisfaction, organizational commitment, and job embeddedness and lack of proper training. Loss of personnel can be costly both economically and in terms of human capital. Retention of Extension professionals can be improved through proper training or…

  3. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that incorporates Markov Random Field (MRF), watershed segmentation and merging techniques for performing image segmentation and edge detection tasks. MRF is used to obtain an initial estimate of the regions in the image under process; in the MRF model, the gray level x at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmented result, which is obtained using K-means clustering and the minimum distance; the region process is then modeled by MRF to obtain an image containing different intensity regions. From this result the gradient values are calculated and a watershed technique is employed. The MRF step yields an image with distinct intensity regions that carries all the edge and region information; the watershed algorithm then improves the segmentation by superimposing a closed, accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merge process based on averaged mean values is employed. The final segmentation and edge detection result is one closed boundary per actual region in the image.
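The K-means initialisation step can be sketched on raw gray levels; here a minimal two-class version separates dark and bright pixels to seed the region process (toy values, not the paper's data):

```python
def kmeans_1d(values, iters=20):
    """Two-class K-means on gray levels: a minimal initial-segmentation step."""
    c0, c1 = min(values), max(values)          # simple initialisation
    for _ in range(iters):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0 = sum(g0) / len(g0)                 # update centroids
        c1 = sum(g1) / len(g1)
    return c0, c1

pixels = [12, 15, 11, 14, 200, 210, 198, 205]
c_dark, c_bright = kmeans_1d(pixels)
# minimum-distance labelling of each pixel against the two centroids
labels = [0 if abs(p - c_dark) <= abs(p - c_bright) else 1 for p in pixels]
```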

  4. Herd-scale measurements of methane emissions from cattle grazing extensive sub-tropical grasslands using the open-path laser technique.

    Science.gov (United States)

    Tomkins, N W; Charmley, E

    2015-12-01

    Methane (CH4) emissions associated with beef production systems in northern Australia are yet to be quantified. Methodologies are available to measure emissions, but application in extensive grazing environments is challenging. A micrometeorological methodology for estimating herd-scale emissions using an indirect open-path spectroscopic technique and an atmospheric dispersion model is described. The methodology was deployed on five cattle properties across Queensland and Northern Territory, with measurements conducted on two occasions at one site. On each deployment, data were collected every 10 min for up to 7 h a day over 4 to 16 days. To increase the atmospheric concentration of CH4 to measurable levels, cattle were confined to a known area around water points from ~0800 to 1600 h, during which time measurements of wind statistics and line-averaged CH4 concentration were taken. Filtering to remove erroneous data accounted for 35% of total observations. For five of the six deployments, CH4 emissions were within the expected range of 0.4 to 0.6 g/kg BW. At one site, emissions were ~2 times expected values. There was small but consistent variation with time of day, although for some deployments measurements taken early in the day tended to be higher than at other times. There was a weak linear relationship (R² = 0.47) between animal BW and CH4 emission per kg BW. Where it was possible to compare emissions in the early and late dry season at one site, it was speculated that higher emissions in the late dry season may have been attributable to poorer diet quality. It is concluded that the micrometeorological methodology using open-path lasers can be successfully deployed in extensive grazing conditions to directly measure CH4 emissions from cattle at a herd scale. PMID:26290115

  5. Studying Satellite Image Quality Based on the Fusion Techniques

    CERN Document Server

    Al-Wassai, Firouz Abdullah; Al-Zaky, Ali A

    2011-01-01

    Various and different methods can be used to produce high-resolution multispectral images from high-resolution panchromatic image (PAN) and low-resolution multispectral images (MS), mostly on the pixel level. However, the jury is still out on the benefits of a fused image compared to its original images. There is also a lack of measures for assessing the objective quality of the spatial resolution for the fusion methods. Therefore, an objective quality of the spatial resolution assessment for fusion images is required. So, this study attempts to develop a new qualitative assessment to evaluate the spatial quality of the pan sharpened images by many spatial quality metrics. Also, this paper deals with a comparison of various image fusion techniques based on pixel and feature fusion techniques.

  6. An Improved Face Recognition Technique Based on Modular LPCA Approach

    Directory of Open Access Journals (Sweden)

    Mathu S.S. Kumar

    2011-01-01

    Full Text Available Problem statement: A face identification algorithm based on modular localized variation by the Eigen subspace technique, also called modular localized principal component analysis, is presented in this study. Approach: The face imagery was partitioned into smaller sub-divisions from a predefined neighborhood, which were ultimately fused to acquire many sets of features. Since some of the normal facial features of an individual do not change even when the pose and illumination differ, the proposed method manages these variations. Results: The proposed feature selection module significantly enhanced the identification precision on standard face databases when compared to conventional and modular PCA techniques. Conclusion: The proposed algorithm, when compared with the conventional PCA algorithm and modular PCA, achieves enhanced recognition accuracy for face imagery with illumination, expression and pose variations.
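The modular idea, running PCA on each sub-block rather than on the whole image, can be sketched with a simple quadrant partition (a simplification of the paper's neighborhood scheme; a real system would use face images, not random arrays):

```python
import numpy as np

def block_pca_features(images, n_components):
    """Modular PCA sketch: split each image into four quadrants, run PCA per
    quadrant over the whole image set, and concatenate the projections."""
    n, h, w = images.shape
    feats = []
    for r in (0, 1):
        for c in (0, 1):
            block = images[:, r * h // 2:(r + 1) * h // 2,
                              c * w // 2:(c + 1) * w // 2].reshape(n, -1)
            centred = block - block.mean(axis=0)
            # principal axes = right singular vectors of the centred data
            _, _, vt = np.linalg.svd(centred, full_matrices=False)
            feats.append(centred @ vt[:n_components].T)
    return np.hstack(feats)

rng = np.random.default_rng(1)
faces = rng.random((6, 8, 8))                        # six toy 8x8 "face" images
features = block_pca_features(faces, n_components=2) # 4 blocks x 2 = 8 features each
```

Because illumination or pose changes often affect only some quadrants, the unchanged blocks still produce stable features, which is the intuition behind the reported robustness.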

  7. Feature-based multiresolution techniques for product design

    Institute of Scientific and Technical Information of China (English)

    LEE Sang Hun; LEE Kunwoo

    2006-01-01

    3D computer-aided design (CAD) systems based on feature-based solid modelling technique have been widely spread and used for product design. However, when part models associated with features are used in various downstream applications,simplified models in various levels of detail (LODs) are frequently more desirable than the full details of the parts. In particular,the need for feature-based multiresolution representation of a solid model representing an object at multiple LODs in the feature unit is increasing for engineering tasks. One challenge is to generate valid models at various LODs after an arbitrary rearrangement of features using a certain LOD criterion, because composite Boolean operations consisting of union and subtraction are not commutative. The other challenges are to devise proper topological framework for multiresolution representation, to suggest more reasonable LOD criteria, and to extend applications. This paper surveys the recent research on these issues.

  8. Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory

    Directory of Open Access Journals (Sweden)

    Yi-Chung Lai

    2012-10-01

    Full Text Available The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system.
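The correlation (dependent) function of extension theory measures how well a value fits a "classical" interval relative to a larger "neighbourhood" interval: positive inside the classical interval, negative outside the neighbourhood. One common form is sketched below; the obstacle-distance intervals are hypothetical, not taken from the paper:

```python
def rho(x, a, b):
    """Extension distance from point x to the interval [a, b]."""
    return abs(x - (a + b) / 2) - (b - a) / 2

def dependent(x, classical, neighbourhood):
    """Dependent (correlation) function K(x) of extension theory."""
    d_classic = rho(x, *classical)
    d_neigh = rho(x, *neighbourhood)
    if d_neigh == d_classic:          # degenerate case at a shared boundary
        return -d_classic
    return d_classic / (d_neigh - d_classic)

# Hypothetical "safe following distance" model: classical 40-100 cm,
# neighbourhood 0-200 cm
K_safe = dependent(70, (40, 100), (0, 200))    # inside classical: K > 0
K_close = dependent(20, (40, 100), (0, 200))   # outside classical: K < 0
```

Evaluating K(x) for each candidate mobile model and picking the largest value is one way the "most appropriate mode" selection described above can be realised.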

  9. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    Science.gov (United States)

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-10-16

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system.

  10. Extensible Component Based Architecture for FLASH, A Massively Parallel, Multiphysics Simulation Code

    OpenAIRE

    Dubey, A.; Reid, L. B.; Weide, K.; Antypas, K.; Ganapathy, M. K.; Riley, K.; Sheeler, D.; Siegal, A

    2009-01-01

    FLASH is a publicly available high performance application code which has evolved into a modular, extensible software system from a collection of unconnected legacy codes. FLASH has been successful because its capabilities have been driven by the needs of scientific applications, without compromising maintainability, performance, and usability. In its newest incarnation, FLASH3 consists of inter-operable modules that can be combined to generate different applications. The FLASH architecture a...

  11. Water-based technique to produce porous PZT materials

    Science.gov (United States)

    Galassi, C.; Capiani, C.; Craciun, F.; Roncari, E.

    2005-09-01

    Water based colloidal processing of PZT materials was investigated in order to reduce costs and employ more environmentally friendly manufacturing. The technique addressed was the production of porous thick samples by so-called “starch consolidation”. PZT “soft” compositions were used. The “starch consolidation” process allows the green body to be obtained by raising the temperature of a suspension of PZT powder, soluble starch and water cast into a metal mould. The influence of the processing parameters and composition on the morphology, pore volumes, pore size distributions and piezoelectric properties is investigated. Zeta potential determination and titration with different deflocculants were essential tools for adjusting the slurry formulation.

  12. Transformer-based design techniques for oscillators and frequency dividers

    CERN Document Server

    Luong, Howard Cam

    2016-01-01

    This book provides in-depth coverage of transformer-based design techniques that enable CMOS oscillators and frequency dividers to achieve state-of-the-art performance.  Design, optimization, and measured performance of oscillators and frequency dividers for different applications are discussed in detail, focusing on not only ultra-low supply voltage but also ultra-wide frequency tuning range and locking range.  This book will be an invaluable reference for anyone working or interested in CMOS radio-frequency or mm-Wave integrated circuits and systems.

  13. Simultaneous algebraic reconstruction technique based on guided image filtering.

    Science.gov (United States)

    Ji, Dongjiang; Qu, Gangrong; Liu, Baodong

    2016-07-11

    The challenge of computed tomography is to reconstruct high-quality images from few-view projections. Using a prior guidance image, guided image filtering smoothes images while preserving edge features. The prior guidance image can be incorporated into the image reconstruction process to improve image quality. We propose a new simultaneous algebraic reconstruction technique based on guided image filtering. Specifically, the prior guidance image is updated in the image reconstruction process, merging information iteratively. To validate the algorithm practicality and efficiency, experiments were performed with numerical phantom projection data and real projection data. The results demonstrate that the proposed method is effective and efficient for nondestructive testing and rock mechanics. PMID:27410859
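The classical SART update on which the method builds can be sketched on a tiny consistent system; each iteration back-projects the row-normalised residual, weighted by the inverse column sums of the system matrix (the guided-filtering prior of the paper is omitted here):

```python
import numpy as np

def sart(A, b, iters=500, lam=1.0):
    """Basic SART: x += lam * diag(1/col_sums) A^T diag(1/row_sums) (b - A x)."""
    row_sums = A.sum(axis=1)
    col_sums = A.sum(axis=0)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        residual = (b - A @ x) / row_sums   # row-normalised data mismatch
        x = x + lam * (A.T @ residual) / col_sums
    return x

# Tiny consistent 2x2 "projection" system with a known solution
A = np.array([[1.0, 1.0], [1.0, 2.0]])
x_true = np.array([2.0, 3.0])
x_rec = sart(A, A @ x_true)
```

In the paper's method, a guided-image-filtering step would be interleaved with these updates so that the prior guidance image steers the reconstruction under few-view data.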

  14. Architecture of an Open-Sourced, Extensible Data Warehouse Builder: InterBase 6 Data Warehouse Builder (IB-DWB)

    OpenAIRE

    Ling, Maurice HT; So, Chi Wai

    2003-01-01

    We report the development of an open-sourced data warehouse builder, InterBase Data Warehouse Builder (IB-DWB), based on Borland InterBase 6 Open Edition Database Server. InterBase 6 is used for its low maintenance and small footprint. IB-DWB is designed modularly and consists of 5 main components: the Data Plug Platform, Discoverer Platform, Multi-Dimensional Cube Builder, and Query Supporter, bound together by a Kernel. It is also an extensible system, made possible by the Data Plug Platform ...

  15. Research on Deep Joints and Lode Extension Based on Digital Borehole Camera Technology

    Directory of Open Access Journals (Sweden)

    Han Zengqiang

    2015-09-01

    Full Text Available The structural characteristics of rock and orebody in deep boreholes are obtained by borehole camera technology. By investigating the joints and fissures in the Shapinggou molybdenum mine, the dominant orientations of joint fissures in the surrounding rock and orebody were statistically analyzed. Applying the theory of metallogeny and geostatistics, the relationship between joint fissures and the lode’s extension direction is explored. The results indicate that joints in the orebody of the ZK61 borehole have only one dominant orientation, SE126° ∠68°, whereas the dominant orientations of joints in the surrounding rock were SE118° ∠73°, SW225° ∠70° and SE122° ∠65°, NE79° ∠63°. A preliminary conclusion is that the lode’s extension direction is well defined and is influenced by the joints of the surrounding rock. Results from other boreholes generally agree well with those of ZK61, suggesting that the analysis reliably reflects the lode’s extension properties and provides an important reference for deep ore prospecting.

  16. Radiation synthesized protein-based nanoparticles: A technique overview

    International Nuclear Information System (INIS)

    In the search for alternative routes to protein engineering, a novel technique, radiation-induced synthesis of protein nanoparticles, has recently been reported to achieve size-controlled particles with preserved bioactivity. This work aimed to evaluate different process conditions in order to optimize and provide an overview of the technique using γ-irradiation. Papain was used as a model protease and the samples were irradiated in a gamma cell irradiator in phosphate buffer (pH=7.0) containing ethanol (0–35%). The dose effect was evaluated by exposure to distinct γ-irradiation doses (2.5, 5, 7.5 and 10 kGy), and scale-up experiments involving distinct protein concentrations (12.5–50 mg mL−1) were also performed. Characterization involved size monitoring using dynamic light scattering. Bityrosine detection was performed using fluorescence measurements in order to provide experimental evidence of the mechanism involved. The best dose effects with regard to size were achieved at 10 kGy, and no relevant changes were observed as a function of papain concentration, highlighting a very broad operational concentration range. Bityrosine changes were identified in the samples as a function of the process, confirming that such linkages play an important role in nanoparticle formation. - Highlights: • Synthesis of protein-based nanoparticles by γ-irradiation. • Optimization of the technique. • Overview of the mechanism involved in nanoparticle formation. • Engineered papain nanoparticles for biomedical applications

  17. A Comparative Analysis of Exemplar Based and Wavelet Based Inpainting Technique

    Directory of Open Access Journals (Sweden)

    Vaibhav V Nalawade

    2012-06-01

    Full Text Available Image inpainting is the process of filling in missing regions of an image so as to preserve its overall continuity; it is the manipulation and modification of an image in a form that is not easily detected. Digital image inpainting is a relatively new area of research, but numerous different approaches to the inpainting problem have been proposed since the concept was first introduced. This paper compares two separate techniques, viz., exemplar based inpainting and wavelet based inpainting, each portraying a different set of characteristics. The algorithms analyzed under the exemplar technique are large object removal by exemplar based inpainting (Criminisi’s) and modified exemplar inpainting (Cheng’s). The algorithm analyzed under the wavelet technique is Chen’s visual image inpainting method. A number of examples on real and synthetic images are demonstrated to compare the results of the different algorithms using both qualitative and quantitative parameters.
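Neither compared algorithm is reproduced here, but the basic inpainting idea of propagating surrounding information into the hole can be illustrated with the simplest possible diffusion-style fill (repeated neighbour averaging), useful only as a baseline for intuition:

```python
def diffuse_fill(img, mask, iters=200):
    """Simplest diffusion-style inpainting: repeatedly replace each missing
    pixel with the average of its 4 neighbours; known pixels stay fixed."""
    h, w = len(img), len(img[0])
    img = [row[:] for row in img]
    for _ in range(iters):
        nxt = [row[:] for row in img]
        for y in range(h):
            for x in range(w):
                if mask[y][x]:        # pixel to be filled in
                    nb = [img[y + dy][x + dx]
                          for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if 0 <= y + dy < h and 0 <= x + dx < w]
                    nxt[y][x] = sum(nb) / len(nb)
        img = nxt
    return img

# 3x3 image with the centre pixel missing
img = [[10, 10, 10], [10, 0, 10], [10, 10, 10]]
mask = [[False] * 3, [False, True, False], [False] * 3]
filled = diffuse_fill(img, mask)
```

Such smoothing blurs texture across large holes, which is exactly the weakness that exemplar-based patch copying and wavelet-domain methods are designed to overcome.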

  18. A New Rerouting Technique for the Extensor Pollicis Longus in Palliative Treatment for Wrist and Finger Extension Paralysis Resulting From Radial Nerve and C5C6C7 Root Injury.

    Science.gov (United States)

    Laravine, Jennifer; Cambon-Binder, Adeline; Belkheyar, Zoubir

    2016-03-01

    Wrist and finger extension paralysis is a consequence of an injury to the radial nerve or to the C5C6C7 roots. Despite these 2 different levels of lesion, palliative treatment for this type of paralysis relies on the same tendon transfers. A large majority of patients are able to compensate for a deficiency of wrist and finger extension. However, a deficiency in the opening of the first web space, which could be addressed by transfers to the abductor pollicis longus, the extensor pollicis brevis, and the extensor pollicis longus (EPL), frequently exists. The aim of this work was to evaluate the feasibility of a new EPL rerouting technique outside of Lister's tubercle, and to verify whether this technique allows a better opening of the thumb-index pinch in this type of paralysis. In the first part, we performed an anatomic study comparing the EPL rerouting technique with the technique frequently used for wrist and finger extension paralysis. In the second part, we present 2 clinical cases in which this new technique was performed. Preliminary results from this study favor the EPL rerouting technique. It is a simple and reproducible technique that allows good opening of the first web space in the treatment of wrist and finger extension paralysis. PMID:26709570

  19. A HYBRID APPROACH BASED SEGMENTATION TECHNIQUE FOR BRAIN TUMOR IN MRI IMAGES

    Directory of Open Access Journals (Sweden)

    D. Anithadevi

    2016-02-01

    Full Text Available Automatic image segmentation is crucial for tumor detection in medical image processing. Manual and semi-automatic segmentation techniques require more time and expert knowledge. Although automatic segmentation overcomes these drawbacks, more appropriate techniques for medical image segmentation still need to be developed. We therefore propose a hybrid image segmentation approach that combines region growing and threshold-based segmentation. A pre-processing stage precedes it, so that an accurate brain tumor extraction is obtained from Magnetic Resonance Imaging (MRI). If the tumor contains holes, region growing alone cannot segment it, whereas the proposed hybrid technique succeeds and improves the result. The results were assessed with several performance measures: DICE, Jaccard similarity, accuracy, sensitivity and specificity. These similarity measures were evaluated against the ground truth of each processed image, and the results are compared and analyzed.
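The combination of thresholding and region growing described above can be sketched as follows; the threshold and tolerance values are assumptions for a toy image, and the pre-processing and performance-measure stages of a real MRI pipeline are omitted:

```python
# Hybrid segmentation sketch: a global threshold proposes seed pixels, then
# region growing expands each seed to 4-connected neighbours whose intensity
# differs by at most `tol`, so connected bright structures are captured whole.

from collections import deque

def hybrid_segment(img, thresh, tol):
    h, w = len(img), len(img[0])
    seg = [[False] * w for _ in range(h)]
    for sy in range(h):
        for sx in range(w):
            if img[sy][sx] >= thresh and not seg[sy][sx]:
                # grow a region from this above-threshold seed
                q = deque([(sy, sx)])
                seg[sy][sx] = True
                while q:
                    y, x = q.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w and not seg[ny][nx]
                                and abs(img[ny][nx] - img[y][x]) <= tol):
                            seg[ny][nx] = True
                            q.append((ny, nx))
    return seg

# Toy "image": a bright 2x2 blob on a dark background.
img = [[0, 0, 0, 0],
       [0, 9, 8, 0],
       [0, 8, 9, 0],
       [0, 0, 0, 0]]
seg = hybrid_segment(img, thresh=5, tol=2)
print(sum(v for row in seg for v in row))  # 4 pixels segmented
```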

  20. ONLINE GRINDING WHEEL WEAR COMPENSATION BY IMAGE BASED MEASURING TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    WAN Daping; HU Dejin; WU Qi; ZHANG Yonghong

    2006-01-01

    Automatic compensation of grinding wheel wear in dry grinding is accomplished by an image-based online measurement method. A PC-based charge-coupled device image recognition system is designed, which detects topography changes of the grinding wheel surface. Profile data corresponding to the wear and the topography are measured using a digital image processing method. The grinding wheel wear is evaluated by analyzing the position deviation of the grinding wheel edge, and online wear compensation is applied according to the measurement results. The precise detection and automatic compensation system is integrated into an open-structure CNC curve grinding machine, and a practical application is carried out to perform precision curve grinding. The experimental results confirm the benefits of the proposed techniques: the online detection accuracy is better than 5 μm, and the grinding machine provides higher precision thanks to the in-process grinding wheel error compensation.
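The measuring idea, locating the wheel edge in an image profile and reporting its deviation as the wear to compensate, can be sketched in a few lines. The intensity values and threshold are invented, and the CCD imaging details are not modelled:

```python
# Sketch: find the wheel edge in a 1D intensity profile with sub-pixel
# precision (linear interpolation at a threshold crossing), then report the
# deviation from a reference edge position as the wear compensation value.

def edge_position(profile, thresh):
    """Return the sub-pixel index where the profile first falls below thresh."""
    for i in range(1, len(profile)):
        a, b = profile[i - 1], profile[i]
        if a >= thresh > b:
            return (i - 1) + (a - thresh) / (a - b)  # linear interpolation
    raise ValueError("no edge found")

reference = [200, 200, 200, 50, 50, 50]   # fresh wheel: edge between px 2 and 3
worn      = [200, 200, 200, 200, 50, 50]  # worn wheel: edge shifted outward
ref_pos  = edge_position(reference, 125)
worn_pos = edge_position(worn, 125)
compensation = worn_pos - ref_pos         # value fed to the CNC axis
print(compensation)  # 1.0 (in pixels; a camera calibration converts to μm)
```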

  1. Hash Based Least Significant Bit Technique For Video Steganography

    Directory of Open Access Journals (Sweden)

    Prof. Dr. P. R. Deshmukh ,

    2014-01-01

    Full Text Available The hash-based least significant bit (LSB) technique for video steganography hides a secret message or information within a video. Steganography is covered writing: it conceals information within other data and also conceals the fact that a secret message is being sent. It is the art of secret communication, or the science of invisible communication. In this paper a hash-based least significant bit technique for video steganography is proposed whose main goal is to embed secret information in a particular video file and then extract it using a stego key or password. LSB insertion is used to embed data in the cover video by changing only the lower bit, so the insertion is not visible. Data hiding is the process of embedding information in a video without changing its perceptual quality. The proposed method is evaluated with two measures, Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE), computed between the original and steganographic video files over all video frames, where the distortion is measured using PSNR. A hash function is used to select the particular positions at which the bits of the secret message are inserted into the LSBs.
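A minimal sketch of this scheme follows. The choice of SHA-256 over a keyed counter is an assumption (the record does not specify the hash function), and one flat pixel list stands in for a video frame:

```python
# Sketch of hash-based LSB embedding: a hash of the stego key (assumption:
# SHA-256 over "key:counter") selects distinct pixel positions; message bits
# replace the LSB there. MSE/PSNR between cover and stego measure distortion.

import hashlib
import math

def positions(key, n_bits, n_pixels):
    """Derive n_bits distinct pixel indices deterministically from the key."""
    chosen, seen, counter = [], set(), 0
    while len(chosen) < n_bits:
        h = hashlib.sha256(f"{key}:{counter}".encode()).digest()
        idx = int.from_bytes(h[:4], "big") % n_pixels
        if idx not in seen:
            seen.add(idx)
            chosen.append(idx)
        counter += 1
    return chosen

def embed(pixels, bits, key):
    out = pixels[:]
    for idx, bit in zip(positions(key, len(bits), len(pixels)), bits):
        out[idx] = (out[idx] & ~1) | bit   # overwrite only the lowest bit
    return out

def extract(pixels, n_bits, key):
    return [pixels[idx] & 1 for idx in positions(key, n_bits, len(pixels))]

def psnr(a, b):
    mse = sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return float("inf") if mse == 0 else 10 * math.log10(255 ** 2 / mse)

cover = list(range(64))                    # stand-in for one video frame
secret = [1, 0, 1, 1, 0, 1, 0, 0]
stego = embed(cover, secret, key="password")
print(extract(stego, len(secret), key="password"))  # [1, 0, 1, 1, 0, 1, 0, 0]
print(psnr(cover, stego) > 40)                      # True: LSB changes are tiny
```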

  2. Semantic-based technique for thai documents plagiarism detection

    Directory of Open Access Journals (Sweden)

    Sorawat Prapanitisatian

    2014-03-01

    Full Text Available Plagiarism is the act of taking another person's writing or idea without referring to the source of information. It is one of the major problems in educational institutes. A number of plagiarism detection tools are available on the Internet, but few of them work well. Typically, they use a simple method for plagiarism detection, e.g. string matching, whose main weakness is that it cannot detect plagiarism when the author replaces words with synonyms. As such, this paper presents a new technique for semantic-based plagiarism detection using Semantic Role Labeling (SRL) and term weighting. SRL is deployed in order to calculate the semantic-based similarity. The main difference from existing frameworks is that terms in a sentence are weighted dynamically depending on their roles in the sentence, e.g. subject, verb or object. This makes the plagiarism detection mechanism more effective than existing systems even when the positions of terms in a sentence are reordered. The experimental results show that the proposed method detects plagiarised documents more effectively than the existing methods Anti-kobpae, Turnit-in and traditional Semantic Role Labeling.

  3. Generation of Quasi-Gaussian Pulses Based on Correlation Techniques

    Directory of Open Access Journals (Sweden)

    POHOATA, S.

    2012-02-01

    Full Text Available Gaussian pulses have mostly been used within communications, where several applications can be emphasized: mobile telephony (GSM, where GMSK signals are used) as well as UWB communications, where short-period pulses based on the Gaussian waveform are generated. Since the Gaussian function is a theoretical concept that cannot be realised exactly from the physical point of view, it must be approximated by functions that admit physical implementation. New techniques for generating Gaussian pulse responses of good precision are approached, proposed and researched in this paper. The second- and third-order derivatives of the Gaussian pulse response are accurately generated. The third-order derivative is composed of four individual rectangular pulses of fixed amplitudes, which are easy to generate by standard techniques. In order to generate pulses able to satisfy the spectral mask requirements, an adequate filter must be applied. The paper concludes with a comparative analysis based on the relative error and the energy spectra of the proposed pulses.
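The claim that the third-order derivative consists of four lobes of alternating sign, the shape the paper approximates with four fixed-amplitude rectangular pulses, can be checked numerically. This is an illustrative sketch, not the paper's pulse-generation circuit:

```python
# Sketch: sample a unit-sigma Gaussian, form its derivatives by central
# finite differences, and verify that the third derivative shows four
# alternating-sign lobes (+ - + -), the structure approximated by four
# rectangular pulses of fixed amplitudes in the paper.

import math

def gaussian(t, sigma=1.0):
    return math.exp(-t * t / (2 * sigma * sigma))

dt = 0.01
ts = [i * dt for i in range(-400, 401)]
g = [gaussian(t) for t in ts]
d1 = [(g[i + 1] - g[i - 1]) / (2 * dt) for i in range(1, len(g) - 1)]
d2 = [(d1[i + 1] - d1[i - 1]) / (2 * dt) for i in range(1, len(d1) - 1)]
d3 = [(d2[i + 1] - d2[i - 1]) / (2 * dt) for i in range(1, len(d2) - 1)]

# Record the sign of each lobe exceeding 20% of the peak magnitude
# (the outer lobes are ~27% of the inner ones, so 20% captures all four).
peak = max(abs(v) for v in d3)
signs = []
for v in d3:
    if abs(v) > 0.2 * peak:
        s = 1 if v > 0 else -1
        if not signs or signs[-1] != s:
            signs.append(s)
print(signs)  # [1, -1, 1, -1]
```

Analytically, for sigma = 1 the third derivative is (3t - t^3) exp(-t^2/2), with zeros at t = 0 and t = ±sqrt(3), which is exactly the four-lobe structure detected above.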

  4. Galaxy Cluster Mass Reconstruction Project: I. Methods and first results on galaxy-based techniques

    CERN Document Server

    Old, L; Pearce, F R; Croton, D; Muldrew, S I; Muñoz-Cuartas, J C; Gifford, D; Gray, M E; von der Linden, A; Mamon, G A; Merrifield, M R; Müller, V; Pearson, R J; Ponman, T J; Saro, A; Sepp, T; Sifón, C; Tempel, E; Tundo, E; Wang, Y O; Wojtak, R

    2014-01-01

    This paper is the first in a series in which we perform an extensive comparison of various galaxy-based cluster mass estimation techniques that utilise the positions, velocities and colours of galaxies. Our primary aim is to test the performance of these cluster mass estimation techniques on a diverse set of models that will increase in complexity. We begin by providing participating methods with data from a simple model that delivers idealised clusters, enabling us to quantify the underlying scatter intrinsic to these mass estimation techniques. The mock catalogue is based on a Halo Occupation Distribution (HOD) model that assumes spherical Navarro, Frenk and White (NFW) haloes truncated at R_200, with no substructure nor colour segregation, and with isotropic, isothermal Maxwellian velocities. We find that, above 10^14 M_solar, recovered cluster masses are correlated with the true underlying cluster mass with an intrinsic scatter of typically a factor of two. Below 10^14 M_solar, the scatter rises as the nu...

  5. An extension of the immersed boundary method based on the distributed Lagrange multiplier approach

    Science.gov (United States)

    Feldman, Yuri; Gulberg, Yosef

    2016-10-01

    An extended formulation of the immersed boundary method, which facilitates simulation of incompressible isothermal and natural convection flows around immersed bodies and which may be applied for linear stability analysis of the flows, is presented. The Lagrangian forces and heat sources are distributed on the fluid-structure interface. The method treats pressure, the Lagrangian forces, and heat sources as distributed Lagrange multipliers, thereby implicitly providing the kinematic constraints of no-slip and the corresponding thermal boundary conditions for immersed surfaces. Extensive verification of the developed method for both isothermal and natural convection 2D flows is provided. Strategies for adapting the developed approach to realistic 3D configurations are discussed.

  6. Generalisation and extension of a web-based data collection system for clinical studies using Java and CORBA.

    Science.gov (United States)

    Eich, H P; Ohmann, C

    1999-01-01

    Inadequate informatics support of multi-centre clinical trials leads to poor quality. In order to support a multi-centre clinical trial, a data collection system based on Java, operating via WWW and the Internet, has been developed. In this study a generalisation and extension of this prototype has been performed: the prototype has been applied to another clinical trial, and a knowledge server based on C++ has been integrated via CORBA. The investigation and implementation of security aspects of web-based data collection are now under evaluation.

  7. Enhancing the effectiveness of IST through risk-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  8. An interactive tutorial-based training technique for vertebral morphometry.

    Science.gov (United States)

    Gardner, J C; von Ingersleben, G; Heyano, S L; Chesnut, C H

    2001-01-01

    The purpose of this work was to develop a computer-based procedure for training technologists in vertebral morphometry. The utility of the resulting interactive, tutorial based training method was evaluated in this study. The training program was composed of four steps: (1) review of an online tutorial, (2) review of analyzed spine images, (3) practice in fiducial point placement and (4) testing. During testing, vertebral heights were measured from digital, lateral spine images containing osteoporotic fractures. Inter-observer measurement precision was compared between research technicians, and between technologists and radiologist. The technologists participating in this study had no prior experience in vertebral morphometry. Following completion of the online training program, good inter-observer measurement precision was seen between technologists, showing mean coefficients of variation of 2.33% for anterior, 2.87% for central and 2.65% for posterior vertebral heights. Comparisons between the technicians and radiologist ranged from 2.19% to 3.18%. Slightly better precision values were seen with height measurements compared with height ratios, and with unfractured compared with fractured vertebral bodies. The findings of this study indicate that self-directed, tutorial-based training for spine image analyses is effective, resulting in good inter-observer measurement precision. The interactive tutorial-based approach provides standardized training methods and assures consistency of instructional technique over time.
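The precision statistic reported above, the coefficient of variation of repeated height measurements, is straightforward to compute. The measurement values below are invented for illustration only:

```python
# Sketch of the reported precision measure: the coefficient of variation
# (CV, %) of the same vertebral height measured by different observers,
# using the sample standard deviation over the observers' values.

import math

def cv_percent(measurements):
    n = len(measurements)
    mean = sum(measurements) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in measurements) / (n - 1))
    return 100.0 * sd / mean

# anterior height (mm) of one vertebra, measured by two technologists (invented)
heights = [21.8, 22.4]
print(round(cv_percent(heights), 2))  # 1.92
```

A CV near 2% for such a pair is in the same range as the inter-observer values (2.19% to 3.18%) reported in the study.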

  9. Evaluations of mosquito age grading techniques based on morphological changes.

    Science.gov (United States)

    Hugo, L E; Quick-Miles, S; Kay, B H; Ryan, P A

    2008-05-01

    Evaluations were made of the accuracy and practicality of mosquito age grading methods based on changes to mosquito morphology; including the Detinova ovarian tracheation, midgut meconium, Polovodova ovariole dilatation, ovarian injection, and daily growth line methods. Laboratory maintained Aedes vigilax (Skuse) and Culex annulirostris (Skuse) females of known chronological and physiological ages were used for these assessments. Application of the Detinova technique to laboratory reared Ae. vigilax females in a blinded trial enabled the successful identification of nulliparous and parous females in 83.7-89.8% of specimens. The success rate for identifying nulliparous females increased to 87.8-98.0% when observations of ovarian tracheation were combined with observations of the presence of midgut meconium. However, application of the Polovodova method only enabled 57.5% of nulliparous, 1-parous, 2-parous, and 3-parous Ae. vigilax females to be correctly classified, and ovarian injections were found to be unfeasible. Poor correlation was observed between the number of growth lines per phragma and the calendar age of laboratory reared Ae. vigilax females. In summary, morphological age grading methods that offer simple two-category predictions (ovarian tracheation and midgut meconium methods) were found to provide high-accuracy classifications, whereas methods that offer the separation of multiple age categories (ovariolar dilatation and growth line methods) were found to be extremely difficult and of low accuracy. The usefulness of the morphology-based methods is discussed in view of the availability of new mosquito age grading techniques based on cuticular hydrocarbon and gene transcription changes. PMID:18533427

  10. An Empirical Comparative Study of Checklist based and Ad Hoc Code Reading Techniques in a Distributed Groupware Environment

    CERN Document Server

    Akinola, Olalekan S

    2009-01-01

    Software inspection is a necessary and important tool for software quality assurance. Since it was introduced by Fagan at IBM in 1976, arguments have existed as to which method should be adopted to carry out the exercise, whether it should be paper based or tool based, and what reading technique should be used on the inspection document. Extensive work has been done to determine the effectiveness of reviewers in a paper-based environment when using ad hoc and checklist reading techniques. In this work, we take software inspection research further by examining whether there is any significant difference in the defect detection effectiveness of reviewers when they use either ad hoc or checklist reading techniques in a distributed groupware environment. Twenty final-year undergraduate students of computer science, divided into ad hoc and checklist reviewer groups of ten members each, were employed to inspect a medium-sized Java code synchronously on groupware deployed on the Internet. The data obtained were...

  11. On The Multi-Hop Extension of Energy-Efficient WSN Time Synchronization Based on Time-Translating Gateways

    OpenAIRE

    Liao, Qimeng; Kim, Kyeong Soo

    2016-01-01

    We report preliminary results of a simulation study on the multi-hop extension of the recently-proposed energy-efficient wireless sensor network time synchronization scheme based on time-translating gateways. Unlike the single-hop case, in multi-hop time synchronization a sensor node sends measurement data to a head node through gateways which translate the timestamp values of the received measurement data. Through simulations for two-hop time synchronization, we analyze the impact of the add...

  12. Chemistry research and chemical techniques based on research reactors

    International Nuclear Information System (INIS)

    Chemistry has occupied an important position historically in the sciences associated with nuclear reactors and it continues to play a prominent role in reactor-based research investigations. This Panel of prominent scientists in the field was convened by the International Atomic Energy Agency (IAEA) to assess the present state of such chemistry research for the information of its Member States and others interested in the subject. There are two ways in which chemistry is associated with nuclear reactors: (a) general applications to many scientific fields in which chemical techniques are involved as essential service functions; and (b) specific applications of reactor facilities to the solution of chemical problems themselves. Twenty years of basic research with nuclear reactors have demonstrated a very widespread, and still increasing, demand for radioisotopes and isotopically-labelled molecules in all fields of the physical and biological sciences. Similarly, the determination of the elemental composition of a material through the analytical technique of activation analysis can be applied throughout experimental science. Refs, figs and tabs

  13. Investigations on landmine detection by neutron-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Csikai, J. E-mail: csikai@delfin.klte.hu; Doczi, R.; Kiraly, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify the antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m{sup 2}/1.5 min; the reflection cross section of thermal neutrons rendered the determination of equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and through it the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires the investigations on the angular distribution of the 6.13 MeV gamma-ray emitted in the {sup 16}O(n,n'{gamma}) reaction. These results, in addition to the identification of landmines, render the improvement of the non-intrusive neutron methods possible.

  14. Detecting Molecular Properties by Various Laser-Based Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hsin, Tse-Ming [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of the dimer geometry. Direct observation of vibrational peaks and evolution of coumarin 153 in the electronic excited state was demonstrated by using the fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibration peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with the biomimetic containers-liposomes, allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interferences from glass surfaces. The result showed a wide distribution of the enzyme reactivity. Protein structural variation is one of the major reasons that are responsible for this highly heterogeneous behavior.

  15. NVC Based Model for Selecting Effective Requirement Elicitation Technique

    Directory of Open Access Journals (Sweden)

    Md. Rizwan Beg

    2012-10-01

    Full Text Available The Requirements Engineering process starts with the gathering of requirements, i.e., requirements elicitation. Requirements elicitation (RE) is the basic building block for a software project and has a very high impact on the subsequent design and build phases as well. Failure to accurately capture system requirements is a major factor in the failure of most software projects. Due to the criticality and impact of this phase, it is very important to perform requirements elicitation in nothing less than a perfect manner. One of the most difficult jobs for the elicitor is to select an appropriate technique for eliciting the requirements. Interviewing and interacting with stakeholders during elicitation is a communication-intensive activity involving verbal and non-verbal communication (NVC). The elicitor should give emphasis to non-verbal communication along with verbal communication so that requirements are recorded more efficiently and effectively. In this paper we propose a model in which stakeholders are classified by observing non-verbal communication, and this classification is used as a basis for elicitation technique selection. We also propose an efficient plan for requirements elicitation which intends to overcome the constraints faced by the elicitor.

  16. Investigations on landmine detection by neutron-based techniques.

    Science.gov (United States)

    Csikai, J; Dóczi, R; Király, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify the antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1m(2)/1.5 min; the reflection cross section of thermal neutrons rendered the determination of equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and through it the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires the investigations on the angular distribution of the 6.13MeV gamma-ray emitted in the (16)O(n,n'gamma) reaction. These results, in addition to the identification of landmines, render the improvement of the non-intrusive neutron methods possible.

  17. A New Particle Swarm Optimization Based Stock Market Prediction Technique

    Directory of Open Access Journals (Sweden)

    Essam El. Seidy

    2016-04-01

    Full Text Available Over the last years, the average person's interest in the stock market has grown dramatically. This demand has doubled with the advancement of technology that has opened up the international stock market, so that nowadays anybody can own stocks and use many types of software to pursue the desired profit with minimum risk. Consequently, the analysis and prediction of future values and trends of the financial markets have received more attention, and owing to their wide application in business transactions, stock market prediction has become a critical topic of research. In this paper, our earlier presented particle swarm optimization with center of mass technique (PSOCoM) is applied to the task of training an adaptive linear combiner to form a new stock market prediction model. This prediction model is used with some common indicators to maximize the return and minimize the risk for the stock market. The experimental results show that the proposed technique is superior to the other PSO-based models in terms of prediction accuracy.
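The idea of training an adaptive linear combiner with particle swarm optimization can be sketched as below. Note this uses standard global-best PSO, not the paper's PSOCoM variant, and the series, indicators and parameters are invented:

```python
# Sketch: standard global-best PSO trains the 3 weights of a linear combiner
# (two lagged values plus a bias) to predict the next value of a toy series.
# Fitness is mean squared prediction error over the training samples.

import random

random.seed(7)
series = [0.1 * i + 0.05 * ((-1) ** i) for i in range(30)]   # toy "prices"
samples = [((series[i - 2], series[i - 1], 1.0), series[i])  # inputs incl. bias
           for i in range(2, len(series))]

def mse(w):
    err = [(sum(wi * xi for wi, xi in zip(w, x)) - y) ** 2 for x, y in samples]
    return sum(err) / len(err)

n, dim = 20, 3
pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]
vel = [[0.0] * dim for _ in range(n)]
pbest = [p[:] for p in pos]
pbest_f = [mse(p) for p in pos]
g = min(range(n), key=lambda i: pbest_f[i])
gbest, gbest_f = pbest[g][:], pbest_f[g]
for _ in range(200):
    for i in range(n):
        for d in range(dim):
            vel[i][d] = (0.7 * vel[i][d]                                  # inertia
                         + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                         + 1.5 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        f = mse(pos[i])
        if f < pbest_f[i]:
            pbest[i], pbest_f[i] = pos[i][:], f
            if f < gbest_f:
                gbest, gbest_f = pos[i][:], f
print("trained combiner MSE:", gbest_f)
```

The toy series satisfies series[i] = series[i-2] + 0.2, so the exact combiner weights (1, 0, 0.2) exist and the swarm's error should shrink toward zero.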

  18. A restrained-torque-based motion instructor: forearm flexion/extension-driving exoskeleton

    Science.gov (United States)

    Nishimura, Takuya; Nomura, Yoshihiko; Sakamoto, Ryota

    2013-01-01

    When learning complicated movements by ourselves, we encounter problems such as self-rightness. Self-rightness results in a lack of detail and objectivity; it may cause us to miss the essences of a motion or even distort them. Thus, we sometimes fall into the habit of performing inappropriate motions. To solve these problems, or to alleviate them as much as possible, we have developed mechanical man-machine interfaces to support the learning of motions such as cultural gestures and sports forms. One promising interface is a wearable exoskeleton mechanical system. As a first attempt, we have made a prototype of a 2-link, 1-DOF rotational elbow-joint interface for teaching extension-flexion operations of the forearm, and have found its potential for teaching the initiation and continuation of elbow flexion.

  19. An extensive survey of dayside diffuse aurora based on optical observations at Yellow River Station

    CERN Document Server

    Han, De-Sheng; Liu, Jian-Jun; Qiu, Qi; Keika, K; Hu, Ze-Jun; Liu, Jun-Ming; Hu, Hong-Qiao; Yang, Hui-Gen

    2016-01-01

    By using seven years of optical auroral observations obtained at Yellow River Station (magnetic latitude 76.24°N) at Ny-Alesund, Svalbard, we performed the first extensive survey of dayside diffuse auroras (DDAs) and obtained the following observational results. (1) The DDAs can be classified into two broad categories, i.e., unstructured and structured DDAs. The unstructured DDAs are mainly distributed in the morning and afternoon, but the structured DDAs predominantly occur around magnetic local noon (MLN). (2) The unstructured DDAs observed in the morning and afternoon show clearly different properties; the afternoon ones are much more stable and seldom show pulsating behaviour. (3) The DDAs are more easily observed during geomagnetically quiet times. (4) The structured DDAs mainly show patchy, stripy, and irregular forms and are often pulsating and drifting. The drifting directions are mostly westward (with speed ~5 km/s), but there are cases showing eastward or poleward drifting. (5) The ...

  20. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    Energy Technology Data Exchange (ETDEWEB)

    Taverner, Thomas; Karpievitch, Yuliya; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-09-15

    Motivation: The size and complex nature of LC-MS proteomics data sets motivates development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing functionality by including their own algorithms under the Add-On tab. Availability: DanteR and its associated user guide are available for download at http://omics.pnl.gov/software/. For Windows, a single click automatically installs DanteR along with the R programming environment. For Linux and Mac OS X, users must first install R and then follow instructions on the DanteR web site for package installation.

  1. Astronomical Image Compression Techniques Based on ACC and KLT Coder

    Directory of Open Access Journals (Sweden)

    J. Schindler

    2011-01-01

    Full Text Available This paper deals with a compression of image data in applications in astronomy. Astronomical images have typical specific properties — high grayscale bit depth, size, noise occurrence and special processing algorithms. They belong to the class of scientific images. Their processing and compression is quite different from the classical approach of multimedia image processing. The database of images from BOOTES (Burst Observer and Optical Transient Exploring System has been chosen as a source of the testing signal. BOOTES is a Czech-Spanish robotic telescope for observing AGN (active galactic nuclei and the optical transient of GRB (gamma ray bursts searching. This paper discusses an approach based on an analysis of statistical properties of image data. A comparison of two irrelevancy reduction methods is presented from a scientific (astrometric and photometric point of view. The first method is based on a statistical approach, using the Karhunen-Loeve transform (KLT with uniform quantization in the spectral domain. The second technique is derived from wavelet decomposition with adaptive selection of used prediction coefficients. Finally, the comparison of three redundancy reduction methods is discussed. Multimedia format JPEG2000 and HCOMPRESS, designed especially for astronomical images, are compared with the new Astronomical Context Coder (ACC coder based on adaptive median regression.

  2. Demand Management Based on Model Predictive Control Techniques

    Directory of Open Access Journals (Sweden)

    Yasser A. Davizón

    2014-01-01

    Full Text Available Demand management (DM) is the process that helps companies sell the right product to the right customer, at the right time, and for the right price. The challenge for any company is therefore to determine how much to sell, at what price, and to which market segment while maximizing its profits. DM also helps managers efficiently allocate undifferentiated units of capacity to the available demand with the goal of maximizing revenue. This paper introduces a control-system approach to demand management with dynamic pricing (DP) using the model predictive control (MPC) technique. In addition, we present a dynamical-system analogy based on active suspension, and a stability analysis is provided via the Lyapunov direct method.
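The receding-horizon mechanism at the heart of MPC-based dynamic pricing can be sketched with an assumed linear demand model and a small price grid; this is not the paper's controller, only an illustration of "optimize over the horizon, apply the first move":

```python
# Sketch of receding-horizon pricing: with an assumed demand model
# d = A - B * price and finite capacity, the controller exhaustively searches
# price sequences over the remaining horizon, applies only the first price,
# updates the capacity state, and repeats -- the core MPC loop.

from itertools import product

A, B = 100.0, 10.0            # assumed linear demand model: d = A - B * price
PRICES = [4.0, 5.0, 6.0, 7.0]  # assumed admissible price grid

def demand(p):
    return max(0.0, A - B * p)

def plan(capacity, horizon):
    """Return the revenue-maximizing price sequence over the horizon."""
    best, best_rev = None, -1.0
    for seq in product(PRICES, repeat=horizon):
        cap, rev = capacity, 0.0
        for p in seq:
            sold = min(demand(p), cap)
            rev += p * sold
            cap -= sold
        if rev > best_rev:
            best, best_rev = seq, rev
    return best

capacity, applied = 90.0, []
for step in range(3):                      # 3 selling periods
    seq = plan(capacity, horizon=3 - step)  # optimize over remaining horizon
    p = seq[0]                              # apply only the first move
    applied.append(p)
    capacity -= min(demand(p), capacity)
print(applied, capacity)  # [7.0, 7.0, 7.0] 0.0
```

With capacity scarce (90 units against demands of 30 to 60 per period), the planner correctly holds the highest price throughout, selling all 90 units at 7 rather than discounting.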

  3. Crop Yield Forecasted Model Based on Time Series Techniques

    Institute of Scientific and Technical Information of China (English)

    Li Hong-ying; Hou Yan-lin; Zhou Yong-juan; Zhao Hui-ming

    2012-01-01

    Traditional studies on potential yield mainly referred to attainable yield: the maximum yield which could be reached by a crop in a given environment. A new concept of crop yield under average climate conditions, which is affected by the advancement of science and technology, was defined in this paper. Based on this new concept, time series techniques relying on past yield data were employed to set up a forecasting model. The model was tested using the average grain yields of Liaoning Province, China, from 1949 to 2005. The testing combined dynamic n-choosing with micro-tendency rectification, and the average forecasting error was 1.24%. A turning point may occur in the trend line of yield change, in which case an inflexion model was used to handle the yield turning point.
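The basic mechanics of forecasting next year's yield from a window of past yields can be sketched with a least-squares trend line; the paper's dynamic n-choosing and inflexion handling are not reproduced, and the yield figures below are invented:

```python
# Sketch: fit a linear trend to the last n yields by least squares, forecast
# the next year, and report the relative error against a held-out true value
# (the paper's error metric is an average relative forecasting error).

def linear_forecast(ys):
    """Least-squares line through (0..n-1, ys); return its value at x = n."""
    n = len(ys)
    xs = range(n)
    xm, ym = (n - 1) / 2, sum(ys) / n
    b = sum((x - xm) * (y - ym) for x, y in zip(xs, ys)) / \
        sum((x - xm) ** 2 for x in xs)
    a = ym - b * xm
    return a + b * n

yields = [2100, 2180, 2310, 2390, 2480, 2555]   # kg/ha, invented data
history, actual = yields[:-1], yields[-1]
pred = linear_forecast(history)
rel_err = 100 * abs(pred - actual) / actual
print(round(pred, 1), round(rel_err, 2))  # 2583.0 1.1
```

On this toy series the one-step relative error is about 1.1%, the same order as the 1.24% average error the paper reports for Liaoning grain yields.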

  4. Clustering economies based on multiple criteria decision making techniques

    Directory of Open Access Journals (Sweden)

    Mansour Momeni

    2011-10-01

    Full Text Available One of the primary concerns of many countries is to determine the important factors affecting economic growth. In this paper, we study factors such as unemployment rate, inflation ratio, population growth, average annual income, etc., to cluster different countries. The proposed model uses the analytical hierarchy process (AHP) to prioritize the criteria and then uses a K-means technique to cluster 59 countries into four groups based on the ranked criteria. The first group includes countries with high standards such as Germany and Japan. In the second cluster, there are some developing countries with relatively good economic growth, such as Saudi Arabia and Iran. The third cluster belongs to countries with faster rates of growth than the countries in the second group, such as China, India and Mexico. Finally, the fourth cluster includes countries with relatively very low rates of growth, such as Jordan, Mali and Niger.
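    The clustering step can be sketched with a minimal Lloyd's-algorithm K-means on hypothetical two-criterion feature vectors; the paper additionally weights the criteria with AHP before clustering, which is omitted here:

```python
def kmeans(points, centers, iters=20):
    """Lloyd's algorithm: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append(p)
        centers = [
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centers)
        ]
    return centers, clusters

# Hypothetical [inflation, unemployment] pairs for four "economies".
pts = [(1.0, 1.0), (1.2, 0.8), (8.0, 8.0), (8.2, 7.9)]
centers, clusters = kmeans(pts, centers=[(0.0, 0.0), (9.0, 9.0)])
```

    In the paper's setting each coordinate would first be scaled by its AHP-derived weight so that higher-priority criteria dominate the distance.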

  5. SAR Image Enhancement Based on Beam Sharpening Technique

    Institute of Scientific and Technical Information of China (English)

    LI Yong; ZHANG Kun-hui; ZHU Dai-yin; ZHU Zhao-da

    2004-01-01

    A major problem encountered in enhancing SAR images is the total loss of phase information and the unknown parameters of the imaging system. The beam sharpening technique, combined with synthetic aperture radiation pattern estimation, provides an approach to processing this kind of data to achieve higher apparent resolution. Based on the criterion of minimizing the expected quadratic estimation error, an optimum FIR filter with a symmetrical structure is designed, whose coefficients depend on the azimuth response of local isolated prominent points, because this response can be approximately regarded as the synthetic aperture radiation pattern of the imaging system. The point target simulation shows that the angular resolution is improved by a ratio of almost two to one. The processing results of a live SAR image demonstrate the validity of the method.
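    The sharpening idea can be illustrated with a symmetric FIR filter applied to a 1-D azimuth profile; the taps and the blurred point response below are invented for illustration, whereas the paper derives optimal taps by minimizing the expected quadratic estimation error:

```python
def fir_filter(signal, taps):
    """Apply a symmetric FIR filter (odd-length taps) to a 1-D signal,
    zero-padded at the edges."""
    assert len(taps) % 2 == 1 and taps == taps[::-1], "symmetric taps"
    k = len(taps) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, t in enumerate(taps):
            idx = i + j - k
            if 0 <= idx < len(signal):
                acc += t * signal[idx]
        out.append(acc)
    return out

# A blurred point response, standing in for the azimuth response of an
# isolated prominent point, and illustrative sharpening taps.
blurred = [0.0, 0.2, 0.6, 1.0, 0.6, 0.2, 0.0]
sharp = fir_filter(blurred, taps=[-0.25, 1.5, -0.25])
```

    The negative side taps subtract a portion of the neighbouring samples, narrowing the mainlobe relative to its surroundings, which is the apparent-resolution gain the abstract describes.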

  6. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    Directory of Open Access Journals (Sweden)

    Om Parkash

    2015-10-01

    Full Text Available Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue. Moreover, the clinical features of dengue are indistinguishable from those of other infectious diseases such as malaria, chikungunya, rickettsiosis and leptospirosis. Therefore, a prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phases and highlights the advantages and limitations of these routine laboratory tests. Subsequently, biosensor-based assays developed using various transducers for the detection of dengue are also reviewed.

  7. Complete denture impression techniques: Evidence-based or philosophical

    Directory of Open Access Journals (Sweden)

    Singla Shefali

    2007-01-01

    Full Text Available A code of practice is dangerous and ever-changing in today's world. Relating this to complete denture impression techniques, we have been provided with a set of philosophies: "no pressure, minimal pressure, definite pressure and selective pressure". The objectives and principles of impression-making have been clearly defined. Can any one philosophy satisfy an operator trying to work on these principles and achieve these objectives? These philosophies take into consideration only the tissue part and not the complete basal seat, which comprises the periphery, the tissues and the bone structure. Under such circumstances, should we consider a code of practice dangerous, or should we develop an evidence-based approach with a scientific background, following certain principles and providing the flexibility to adapt to clinical procedures and to normal biological variations in patients, rather than the rigidity imposed by strict laws?

  8. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the instantaneous granite rockburst process. Based on the PIV (Particle Image Velocimetry) technique, quantitative analysis of a rockburst is possible: images of tracer particles, displacement and strain fields can be obtained, and the debris trajectory described. According to observations from on-site tests, the dynamic rockburst is actually a gas-solid high-speed flow process, caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high-speed video and PIV images, the granite rockburst failure process can be divided into six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy of these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.

  9. Whitelists Based Multiple Filtering Techniques in SCADA Sensor Networks

    Directory of Open Access Journals (Sweden)

    DongHo Kang

    2014-01-01

    Full Text Available The Internet of Things (IoT) consists of several tiny devices connected together to form a collaborative computing environment. Recently, IoT technologies have begun to merge with supervisory control and data acquisition (SCADA) sensor networks to more efficiently gather and analyze real-time data from sensors in industrial environments. But SCADA sensor networks are becoming more and more vulnerable to cyber-attacks due to increased connectivity. To safely adopt IoT technologies in SCADA environments, it is important to improve the security of SCADA sensor networks. In this paper we propose a multiple filtering technique based on whitelists to detect illegitimate packets. Our proposed system detects the traffic of network and application protocol attacks with a set of whitelists collected from normal traffic.
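    A whitelist filter of this kind can be sketched very simply: flows observed during a trusted learning phase are recorded, and anything that does not match is flagged. The field names and flow key below are illustrative assumptions, not the paper's rule format:

```python
class WhitelistFilter:
    """Minimal whitelist-based packet filter: learn normal flows,
    then reject anything not on the list."""

    def __init__(self):
        self.allowed = set()

    def learn(self, packet):
        """Record a flow observed during the trusted learning phase."""
        self.allowed.add(self._key(packet))

    def check(self, packet):
        """Return True if the packet matches a whitelisted flow."""
        return self._key(packet) in self.allowed

    @staticmethod
    def _key(packet):
        return (packet["src"], packet["dst"], packet["proto"], packet["port"])

f = WhitelistFilter()
f.learn({"src": "10.0.0.5", "dst": "10.0.0.9", "proto": "modbus", "port": 502})
ok = f.check({"src": "10.0.0.5", "dst": "10.0.0.9", "proto": "modbus", "port": 502})
bad = f.check({"src": "192.0.2.1", "dst": "10.0.0.9", "proto": "http", "port": 80})
```

    A "multiple filtering" design would stack several such filters, e.g. one keyed on network-layer flows and another on application-protocol commands.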

  10. A Novel Technique Based on Node Registration in MANETs

    Directory of Open Access Journals (Sweden)

    Rashid Jalal Qureshi

    2012-09-01

    Full Text Available In an ad hoc network, communication links between the nodes are wireless; each node acts as a router for the other nodes, and packets are forwarded from one node to another. This type of network helps in solving challenges and problems that may arise in everyday communication. Mobile ad hoc networks (MANETs) are a new field of research, particularly useful in situations where network infrastructure is costly. Protecting MANETs from security threats is a challenging task because of their dynamic topology. Every node in a MANET is independent and free to move in any direction, and therefore changes its connections to other nodes frequently. Due to this decentralized nature, different types of attacks can occur. The aim of this research paper is to investigate different MANET security attacks and to propose a node registration based technique using cryptographic functions.
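    One way to realize registration with cryptographic functions is a MAC-based token issued by a registration authority; the shared-key scheme below is an illustrative assumption, not the paper's exact protocol:

```python
import hmac
import hashlib

# Key held by the registration authority (illustrative).
REGISTRY_KEY = b"shared-secret-of-the-registration-authority"

def register(node_id):
    """Issue a registration token binding the node id to the registry key."""
    return hmac.new(REGISTRY_KEY, node_id.encode(), hashlib.sha256).hexdigest()

def verify(node_id, token):
    """A node is accepted as a forwarder only if its token verifies."""
    expected = register(node_id)
    return hmac.compare_digest(expected, token)

token = register("node-17")
accepted = verify("node-17", token)        # registered node passes
spoofed = verify("node-99", token)         # stolen token fails for another id
```

    Because the token is bound to the node id, a malicious node cannot reuse another node's registration, which is the property a registration-based defence relies on.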

  11. An RSS based location estimation technique for cognitive relay networks

    KAUST Repository

    Qaraqe, Khalid A.

    2010-11-01

    In this paper, a received signal strength (RSS) based location estimation method is proposed for a cooperative wireless relay network where the relay is a cognitive radio. We propose a method for the considered cognitive relay network to determine the location of the source using the direct and the relayed signal at the destination. We derive the Cramer-Rao lower bound (CRLB) expressions separately for x and y coordinates of the location estimate. We analyze the effects of cognitive behaviour of the relay on the performance of the proposed method. We also discuss and quantify the reliability of the location estimate using the proposed technique if the source is not stationary. The overall performance of the proposed method is presented through simulations. ©2010 IEEE.
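    The RSS-based estimation idea can be illustrated with a log-distance path-loss model and a grid search over candidate source positions; the receiver layout and propagation parameters are invented for illustration, and the paper's cognitive-relay specifics and CRLB analysis are not reproduced:

```python
import math

P0, N_EXP = -40.0, 2.0   # RSS at 1 m and path-loss exponent (assumed known)

def rss(src, rx):
    """Log-distance path-loss model: predicted RSS at receiver rx."""
    d = math.hypot(src[0] - rx[0], src[1] - rx[1])
    return P0 - 10.0 * N_EXP * math.log10(max(d, 1e-6))

def locate(receivers, measured, step=0.5, span=10):
    """Pick the grid position whose predicted RSS vector best matches
    the measurements (least squares)."""
    best, best_err = None, float("inf")
    for i in range(int(span / step) + 1):
        for j in range(int(span / step) + 1):
            cand = (i * step, j * step)
            err = sum((rss(cand, r) - m) ** 2
                      for r, m in zip(receivers, measured))
            if err < best_err:
                best, best_err = cand, err
    return best

receivers = [(0.0, 0.0), (10.0, 0.0), (5.0, 9.0)]
true_src = (3.0, 4.0)
measured = [rss(true_src, r) for r in receivers]   # noiseless for the demo
estimate = locate(receivers, measured)
```

    With noisy measurements the same least-squares criterion applies, and the estimation variance is what the paper's CRLB expressions bound.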

  12. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    Science.gov (United States)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways for drift measurement depends on instrumentation based measurements, e.g. using an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift on the received signal, where the main cause of Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world or even Europe is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to an ionosonde, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two dimensional TEC map is constructed by using the IONOLAB-MAP tool which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real situation electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically with a period of as low as 30 seconds. By processing two consequent snapshots together and calculating the propagation paths, we estimate the drift measurements over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.

  13. An investigation of a video-based patient repositioning technique

    International Nuclear Information System (INIS)

    Purpose: We have investigated a video-based patient repositioning technique designed to use skin features for radiotherapy repositioning. We investigated the feasibility of the clinical application of this system by quantitative evaluation of the performance characteristics of the methodology. Methods and Materials: Multiple regions of interest (ROI) were specified in the field of view of video cameras. We used a normalized correlation pattern-matching algorithm to compute the translations of each ROI pattern in a target image. These translations were compared against trial translations using a quadratic cost function in an optimization process in which the patient rotation and translational parameters were calculated. Results: A hierarchical search technique achieved high speed (computing the correlation for a 128x128 ROI in a 512x512 target image within 0.005 s) and subpixel spatial accuracy (as high as 0.2 pixel). By treating the observed translations as movements of points on the surfaces of a hypothetical cube, we were able to estimate the actual translations and rotations of the test phantoms used in our experiments to better than 1 mm and 0.2 deg., with standard deviations of 0.3 mm and 0.5 deg., respectively. For the human volunteer cases, we estimated the translations and rotations to an accuracy of 2 mm and 1.2 deg. Conclusion: A personal computer-based video system is suitable for routine patient setup in fractionated conformal radiotherapy. It is expected to achieve high-precision repositioning of the skin surface with high efficiency
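    The normalized-correlation matching at the heart of the method can be sketched as a full search over a small target image; the arrays below are toy data, and the paper's hierarchical, sub-pixel search is replaced by an exhaustive integer-pixel one:

```python
import math

def ncc(a, b):
    """Normalized correlation of two equal-sized patches (flattened lists)."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def match(target, roi):
    """Slide the ROI over a larger 2-D target image and return the
    (row, col) with the best normalized-correlation score."""
    th, tw = len(target), len(target[0])
    rh, rw = len(roi), len(roi[0])
    flat_roi = [v for row in roi for v in row]
    best, best_pos = -2.0, (0, 0)
    for r in range(th - rh + 1):
        for c in range(tw - rw + 1):
            patch = [target[r + i][c + j] for i in range(rh) for j in range(rw)]
            s = ncc(flat_roi, patch)
            if s > best:
                best, best_pos = s, (r, c)
    return best_pos

# A distinctive 2x2 skin "feature" placed at (2, 1) in the target.
target = [[0, 0, 0, 0],
          [0, 0, 0, 0],
          [0, 9, 7, 0],
          [0, 6, 8, 0]]
roi = [[9, 7],
       [6, 8]]
shift = match(target, roi)
```

    Normalization makes the score insensitive to illumination offset and gain, which is why this criterion suits video of skin surfaces.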

  14. Cryptanalysis of a technique to transform discrete logarithm based cryptosystems into identity-based cryptosystems

    OpenAIRE

    TANG, QIANG; MITCHELL, CHRIS J.

    2005-01-01

    In this paper we analyse a technique designed to transform any discrete logarithm based cryptosystem into an identity-based cryptosystem. The transformation method is claimed to be efficient and secure and to eliminate the need to invent new identity-based cryptosystems. However, we show that the identity-based cryptosystem created by the proposed transformation method suffers from a number of security and efficiency problems.

  15. CANDU in-reactor quantitative visual-based inspection techniques

    Science.gov (United States)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. 
The FC calandria tube (the outer shell of the FC) is

  16. Streaming Media over a Color Overlay Based on Forward Error Correction Technique

    Institute of Scientific and Technical Information of China (English)

    张晓瑜; 沈国斌; 李世鹏; 钟玉琢

    2004-01-01

    The number of clients that receive high-quality streaming video from a source is greatly limited by the application requirements,such as the high bandwidth and reliability.In this work,a method was developed to construct a color overlay,which enables clients to receive data across multiple paths,based on the forward error correction technique.The color overlay enlarges system capacity by reducing the bottlenecks and extending the bandwidth,improves reliability against node failure,and is more resilient to fluctuations of network metrics.A light-weight protocol for building the overlay is also presented.Extensive simulations were conducted and the results clearly support the claimed advantages.

  17. Energy-Efficient Network Transmission between Satellite Swarms and Earth Stations Based on Lyapunov Optimization Techniques

    Directory of Open Access Journals (Sweden)

    Weiwei Fang

    2014-01-01

    Full Text Available The recent advent of satellite swarm technologies has enabled space exploration with a massive number of picoclass, low-power, and low-weight spacecraft. However, developing swarm-based satellite systems, from conceptualization to validation, is a complex multidisciplinary activity. One of the primary challenges is how to achieve energy-efficient data transmission between the satellite swarm and terrestrial terminal stations. Employing Lyapunov optimization techniques, we present an online control algorithm to optimally dispatch traffic load among different satellite-ground links for minimizing overall energy consumption over time. Our algorithm is able to independently and simultaneously make control decisions on traffic dispatching over intersatellite-links and up-down-links so as to offer provable energy and delay guarantees, without requiring any statistical information of traffic arrivals and link condition. Rigorous analysis and extensive simulations have demonstrated the performance and robustness of the proposed new algorithm.
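    The flavour of a Lyapunov drift-plus-penalty controller can be sketched in a few lines: each slot, pick the link minimizing V*energy - Q*rate, trading queue backlog against energy without any traffic statistics. The link parameters, arrival pattern and V values below are illustrative assumptions, not the paper's system:

```python
def schedule_slot(Q, links, V):
    """links: list of (service rate, energy cost). Return the index of
    the link minimizing the drift-plus-penalty score V*energy - Q*rate."""
    scores = [V * e - Q * r for r, e in links]
    return scores.index(min(scores))

def run(arrivals, links, V):
    """Simulate the queue Q under the drift-plus-penalty policy and
    accumulate the energy spent."""
    Q, energy_used = 0.0, 0.0
    for a in arrivals:
        i = schedule_slot(Q, links, V)
        rate, energy = links[i]
        Q = max(Q - rate, 0.0) + a
        energy_used += energy
    return Q, energy_used

links = [(1.0, 1.0), (4.0, 10.0)]   # a slow cheap link and a fast costly one
arrivals = [2.0] * 50
Q_small_V, e_small_V = run(arrivals, links, V=0.1)   # favours low delay
Q_big_V, e_big_V = run(arrivals, links, V=100.0)     # favours low energy
```

    Raising V pushes the policy toward the cheap link at the price of a larger backlog, the classic O(1/V) energy versus O(V) delay trade-off of Lyapunov optimization.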

  18. Protein-Protein Interactions Prediction Based on Iterative Clique Extension with Gene Ontology Filtering

    OpenAIRE

    Lei Yang; Xianglong Tang

    2014-01-01

    Cliques (maximal complete subnets) in a protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPIs complement the data deficiencies of biological experiments. However, clique-based prediction methods depend only on the topology of the network. The false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based m...
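    The graph primitive behind clique-based PPI prediction, enumerating the maximal cliques, can be sketched with the basic Bron-Kerbosch algorithm on a toy interaction graph (the proteins and edges below are invented):

```python
def bron_kerbosch(R, P, X, adj, out):
    """Basic Bron-Kerbosch (no pivoting): R is the growing clique, P
    the candidates, X the already-processed vertices."""
    if not P and not X:
        out.append(frozenset(R))
        return
    for v in list(P):
        bron_kerbosch(R | {v}, P & adj[v], X & adj[v], adj, out)
        P = P - {v}
        X = X | {v}

# Toy PPI graph: A-B-C form a triangle, D interacts only with C.
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]
nodes = {n for e in edges for n in e}
adj = {n: set() for n in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

cliques = []
bron_kerbosch(set(), set(nodes), set(), adj, cliques)
maximal = sorted(sorted(c) for c in cliques)
```

    An iterative clique-extension method would then grow or merge these cliques, and a Gene Ontology filter would discard candidate complexes whose members share no annotation.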

  19. Structural design systems using knowledge-based techniques

    International Nuclear Information System (INIS)

    Engineering information management and the corresponding information systems are of a strategic importance for industrial enterprises. This thesis treats the interdisciplinary field of designing computing systems for structural design and analysis using knowledge-based techniques. Specific conceptual models have been designed for representing the structure and the process of objects and activities in a structural design and analysis domain. In this thesis, it is shown how domain knowledge can be structured along several classification principles in order to reduce complexity and increase flexibility. By increasing the conceptual level of the problem description and representation of the domain knowledge in a declarative form, it is possible to enhance the development, maintenance and use of software for mechanical engineering. This will result in a corresponding increase of the efficiency of the mechanical engineering design process. These ideas together with the rule-based control point out the leverage of declarative knowledge representation within this domain. Used appropriately, a declarative knowledge representation preserves information better, is more problem-oriented and change-tolerant than procedural representations. 74 refs

  20. Damage detection technique by measuring laser-based mechanical impedance

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyeonseok; Sohn, Hoon [Department of Civil and Environmental Engineering, Korea Advanced Institute of Science and Technology (Daehak-ro 291, Yuseong-gu, Daejeon 305-701) (Korea, Republic of)

    2014-02-18

    This study proposes a method for the measurement of mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small-sized structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, which show deteriorated performance induced by the effects of (a) Curie temperature limitations, (b) electromagnetic interference (EMI), (c) bonding layers, etc. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by shooting a pulsed laser beam at the target structure, and is acquired by measuring the out-of-plane velocity using a laser vibrometer. The formation of the LMI response is observed through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen in a high temperature environment.

  1. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    Science.gov (United States)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on the utilization of an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a properly energized sample surface. The study was addressed to investigate the possibility of applying HSI techniques to the classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near infrared field (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed HSI-based procedures can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.
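    The PCA step used to compress the hypercube can be sketched by extracting the first principal component with power iteration on the covariance matrix; the two-"band" spectra below are toy data, far smaller than the 121-band cubes in the study:

```python
def covariance(X):
    """Sample covariance matrix of rows of X (pure Python)."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    C = [[0.0] * d for _ in range(d)]
    for row in X:
        for i in range(d):
            for j in range(d):
                C[i][j] += (row[i] - means[i]) * (row[j] - means[j]) / (n - 1)
    return C

def first_pc(C, iters=200):
    """Dominant eigenvector of C by power iteration."""
    v = [1.0] * len(C)
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(len(C))) for i in range(len(C))]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy spectra: band 2 is nearly twice band 1, so the first PC should
# point roughly along (1, 2).
X = [[1.0, 2.1], [2.0, 3.9], [3.0, 6.2], [4.0, 8.1]]
pc1 = first_pc(covariance(X))
```

    Projecting each pixel spectrum onto the first few such components is what reduces a 121-band cube to a handful of informative scores before PLS-DA classification.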

  2. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    Science.gov (United States)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course away from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation). The revised course is a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of course pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  3. Parameter tuning of PVD process based on artificial intelligence technique

    Science.gov (United States)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Due to its previous adoption in similar optimization problems, the genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from the GA's optimization result is expected to produce the desired zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations. The results of the computational experiment were then compared with the actual results from the laboratory experiment. Based on the comparison, GA proved reliable for optimizing the parameter combination before the tuning was applied to the RF magnetron sputtering machine. In order to verify the result of GA, the algorithm was also compared with other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that GA was reliable in solving this RF magnetron sputtering parameter tuning problem, showing better accuracy in the optimization based on the fitness evaluation.
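    A compact genetic-algorithm sketch for three-parameter tuning is shown below; the fitness function is a stand-in (it rewards closeness to a hypothetical optimum), since the paper's film-quality model is not given in the abstract, and the bounds and target are invented:

```python
import random

random.seed(42)

BOUNDS = [(50, 300), (10, 120), (25, 400)]   # RF power, time, temperature
TARGET = (200.0, 60.0, 300.0)                # hypothetical optimum

def fitness(ind):
    """Higher is better; penalize squared distance from the optimum."""
    return -sum((g - t) ** 2 for g, t in zip(ind, TARGET))

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    """Uniform crossover: each gene taken from either parent."""
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.2):
    """Resample each gene within its bounds with probability `rate`."""
    return [random.uniform(lo, hi) if random.random() < rate else g
            for g, (lo, hi) in zip(ind, BOUNDS)]

def ga(pop_size=30, generations=60):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
```

    In practice the fitness would be an empirical or modelled measure of ZnO film quality rather than a distance to a known optimum.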

  4. Crack identification based on synthetic artificial intelligent technique

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Mun Bo; Suh, Myung Won [Sungkyunkwan Univ., Suwon (Korea, Republic of)

    2001-07-01

    It has been established that a crack has an important effect on the dynamic behavior of a structure. This effect depends mainly on the location and depth of the crack. To identify the location and depth of a crack in a structure, a method is presented in this paper which uses a synthetic artificial intelligence technique: an Adaptive-Network-based Fuzzy Inference System (ANFIS), trained via a hybrid learning algorithm (back-propagation gradient descent and the least-squares method), is used to learn the input (the location and depth of a crack)-output (the structural eigenfrequencies) relation of the structural system. With this ANFIS and a Continuous Evolutionary Algorithm (CEA), it is possible to formulate the inverse problem. CEAs based on genetic algorithms work efficiently for continuous-search-space optimization problems such as parameter identification. With the ANFIS, CEAs are used to identify the crack location and depth by minimizing the difference from the measured frequencies. We have tried this new idea on a simple beam structure and the results are promising.

  5. Electron tomography based on a total variation minimization reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Goris, B., E-mail: bart.goris@ua.ac.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van den Broek, W. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Batenburg, K.J. [Centrum Wiskunde and Informatica, Science Park 123, NL-1098XG Amsterdam (Netherlands); Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Heidari Mezerji, H.; Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2012-02-15

    The 3D reconstruction of a tilt series for electron tomography is mostly carried out using the weighted backprojection (WBP) algorithm or one of the iterative algorithms such as the simultaneous iterative reconstruction technique (SIRT). However, it is known that these reconstruction algorithms cannot compensate for the missing wedge. Here, we apply a new reconstruction algorithm for electron tomography which is based on compressive sensing. This is a field of image processing specialized in finding a sparse solution, or a solution with a sparse gradient, to a set of ill-posed linear equations. Therefore, it can be applied to electron tomography, where the reconstructed objects often have a sparse gradient at the nanoscale. Using a combination of different simulated and experimental datasets, it is shown that missing wedge artefacts are reduced in the final reconstruction. Moreover, the reconstructed datasets have a higher fidelity and are easier to segment in comparison to reconstructions obtained by more conventional iterative algorithms. Highlights: a reconstruction algorithm for electron tomography based on total variation minimization is investigated; missing wedge artefacts are reduced by the algorithm; the reconstruction is easier to segment; more reliable quantitative information can be obtained.
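    The sparse-gradient prior behind this reconstruction can be illustrated in one dimension with total-variation denoising by subgradient descent; this is a toy sketch (an invented noisy step signal, and plain denoising rather than the full tomographic inverse problem):

```python
def tv(x):
    """Total variation of a 1-D signal: sum of absolute differences."""
    return sum(abs(b - a) for a, b in zip(x, x[1:]))

def sign(v):
    return (v > 0) - (v < 0)

def tv_denoise(y, lam=0.5, step=0.1, iters=500):
    """Subgradient descent on 0.5*||x - y||^2 + lam*TV(x)."""
    x = list(y)
    for _ in range(iters):
        g = [xi - yi for xi, yi in zip(x, y)]          # data-fit gradient
        for i in range(len(x) - 1):
            s = sign(x[i + 1] - x[i])                  # TV subgradient
            g[i] -= lam * s
            g[i + 1] += lam * s
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

noisy = [0.1, -0.1, 0.2, 0.0, 1.1, 0.9, 1.0, 1.2]   # noisy step edge
clean = tv_denoise(noisy)
```

    The TV penalty flattens small oscillations while keeping the large step edge, which in tomography translates into suppressing missing-wedge streaks without blurring object boundaries.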

  6. Shellac and Aloe vera gel based surface coating for shelf life extension of tomatoes.

    Science.gov (United States)

    Chauhan, O P; Nanjappa, C; Ashok, N; Ravi, N; Roopa, N; Raju, P S

    2015-02-01

    Shellac (S) and Aloe vera gel (AG) were used to develop edible surface coatings for the shelf-life extension of tomato fruits. The coating was prepared by dissolving de-waxed and bleached shellac in an alkaline aqueous medium, both on its own and in combination with AG. Incorporation of AG in the shellac coating improved the permeability characteristics of the coating film towards oxygen, carbon dioxide and water vapour. When applied to tomatoes, the coatings delayed senescence, which was characterized by restricted changes in respiration and ethylene synthesis rates during storage. The texture of the fruits, measured in terms of firmness, showed restricted changes as compared to the untreated control. Similar observations were also recorded for instrumental colour (L*, a* and b* values). The developed coatings extended the shelf-life of tomatoes by 10, 8 and 12 days for the shellac (S), AG and composite (S + AG) coated fruits, respectively, when kept at ambient storage conditions (28 ± 2 °C). PMID:25694740

  7. Use of extension-deformation-based crystallisation of silk fibres to differentiate their functions in nature.

    Science.gov (United States)

    Numata, Keiji; Masunaga, Hiroyasu; Hikima, Takaaki; Sasaki, Sono; Sekiyama, Kazuhide; Takata, Masaki

    2015-08-21

    β-Sheet crystals play an important role in determining the stiffness, strength, and optical properties of silk and in the exhibition of silk-type-specific functions. It is important to elucidate the structural changes that occur during the stretching of silk fibres to understand the functions of different types of fibres. Herein, we elucidate the initial crystallisation behaviour of silk molecules during the stretching of three types of silk fibres using synchrotron radiation X-ray analysis. When spider dragline silk was stretched, it underwent crystallisation and the alignment of the β-sheet crystals became disordered initially but was later recovered. On the other hand, silkworm cocoon silk did not exhibit further crystallisation, whereas capture spiral silk was predominantly amorphous. Structural analyses showed that the crystallisation of silks following extension deformation has a critical effect on their mechanical and optical properties. These findings should aid the production of artificial silk fibres and facilitate the development of silk-inspired functional materials. PMID:26166211

  8. Quantum extension of European option pricing based on the Ornstein-Uhlenbeck process

    CERN Document Server

    Piotrowski, Edward W.; Schroeder, Malgorzata; Zambrzycka, Anna

    2005-01-01

    In this work we propose an option pricing model based on the Ornstein-Uhlenbeck process. It offers a new look at the Black-Scholes formula from the perspective of quantum game theory. We show the differences between the classical view, in which price changes follow a Wiener process, and the pricing supported by the quantum model.
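The contrast the abstract draws can be illustrated numerically. Below is a minimal sketch, not taken from the paper and with arbitrary parameter values, comparing a log-price driven directly by a Wiener process with one following an Ornstein-Uhlenbeck process discretised by Euler-Maruyama:

```python
import numpy as np

def simulate_paths(s0=100.0, mu=0.05, sigma=0.2, theta=1.0, n_steps=252,
                   dt=1.0 / 252, seed=0):
    """Compare a Wiener-driven log-price with an Ornstein-Uhlenbeck log-price."""
    rng = np.random.default_rng(seed)
    dw = rng.normal(0.0, np.sqrt(dt), n_steps)  # shared Wiener increments

    # Classical model: arithmetic Brownian motion on the log-price.
    log_bm = np.log(s0) + np.cumsum(mu * dt + sigma * dw)

    # OU alternative: the log-price mean-reverts toward log(s0) at rate theta
    # (Euler-Maruyama discretisation of dx = theta*(log s0 - x) dt + sigma dW).
    log_ou = np.empty(n_steps)
    x = np.log(s0)
    for i in range(n_steps):
        x += theta * (np.log(s0) - x) * dt + sigma * dw[i]
        log_ou[i] = x
    return np.exp(log_bm), np.exp(log_ou)

bm_path, ou_path = simulate_paths()
```

The mean-reversion term is what distinguishes the two dynamics; in the Wiener case the drift is constant regardless of the current price level.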

  9. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify such models by conventional validation methods, so finding appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  10. Ground-based intercomparison of two isoprene measurement techniques

    Directory of Open Access Journals (Sweden)

    E. Leibrock

    2003-01-01

    Full Text Available An informal intercomparison of two isoprene (C5H8 measurement techniques was carried out during Fall of 1998 at a field site located approximately 3 km west of Boulder, Colorado, USA. A new chemical ionization mass spectrometric technique (CIMS was compared to a well-established gas chromatographic technique (GC. The CIMS technique utilized benzene cation chemistry to ionize isoprene. The isoprene levels measured by the CIMS were often larger than those obtained with the GC. The results indicate that the CIMS technique suffered from an anthropogenic interference associated with air masses from the Denver, CO metropolitan area as well as an additional interference occurring in clean conditions. However, the CIMS technique is also demonstrated to be sensitive and fast. Especially after introduction of a tandem mass spectrometric technique, it is therefore a candidate for isoprene measurements in remote environments near isoprene sources.

  11. Application of rule-based data mining techniques to real time ATLAS Grid job monitoring data

    International Nuclear Information System (INIS)

    The Job Execution Monitor (JEM) is a job-centric grid job monitoring software developed at the University of Wuppertal and integrated into the pilot-based PanDA job brokerage system, which serves physics analysis and Monte Carlo event production for the ATLAS experiment on the Worldwide LHC Computing Grid (WLCG). With JEM, job progress and grid worker node health can be supervised in real time by users, site admins and shift personnel. Imminent error conditions can be detected early and countermeasures can be initiated by the job's owner immediately. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job and grid worker node misbehavior. Shifters can use the same aggregated data to react quickly to site error conditions and broken production tasks. In this work, the application of novel data-centric rule-based methods and data-mining techniques to the real-time monitoring data is discussed. The use of such automatic inference techniques on monitoring data to provide job and site health summary information to users and admins is presented. Finally, the provision of a secure real-time control and steering channel to the job, as an extension of the presented monitoring software, is considered and a possible model of such a control method is presented.

  12. Physical, Chemical and Biochemical Modifications of Protein-Based Films and Coatings: An Extensive Review

    Science.gov (United States)

    Zink, Joël; Wyrobnik, Tom; Prinz, Tobias; Schmid, Markus

    2016-01-01

    Protein-based films and coatings are an interesting alternative to traditional petroleum-based materials. However, their mechanical and barrier properties need to be enhanced in order to match those of the latter. Physical, chemical, and biochemical methods can be used for this purpose. The aim of this article is to provide an overview of the effects of various treatments on whey, soy, and wheat gluten protein-based films and coatings. These three protein sources have been chosen since they are among the most abundantly used and are well described in the literature. Similar behavior might be expected for other protein sources. Most of the modifications are still not fully understood at a fundamental level, but all the methods discussed change the properties of the proteins and resulting products. Mastering these modifications is an important step towards the industrial implementation of protein-based films. PMID:27563881

  14. Automata-Based Programming Technology Extension for Generation of JML Annotated Java Card Code

    OpenAIRE

    Andrey, A.

    2008-01-01

    This paper gives an overview of an ongoing research project concerning the generation of dependable Java Card code. According to the automata-based programming technology, code is generated from a high-level description of application behavior based on finite state machines. An extra benefit of such a description is the possibility of generating a formal application specification in the Java Modeling Language. Conformance of the code against its specification could be checked b...

  15. Image-based Virtual Exhibit and Its Extension to 3D

    Institute of Scientific and Technical Information of China (English)

    Ming-Min Zhang; Zhi-Geng Pan; Li-Feng Ren; Peng Wang

    2007-01-01

    In this paper we introduce an image-based virtual exhibition system, especially for clothing products. It provides a powerful material substitution function, which is very useful for clothing customization. A novel color substitution algorithm and two texture morphing methods are designed to ensure realistic substitution results. To extend the system to 3D, we perform model reconstruction based on photos, and thus present an improved method for modeling the human body. It deforms a generic model with shape details extracted from pictures to generate a new model. Our method begins with model image generation, followed by silhouette extraction and segmentation. It then builds a mapping between pixels inside every pair of silhouette segments in the model image and in the picture. Our mapping algorithm is based on a slice space representation that conforms to the natural features of the human body.

  16. Whole Genome Sequencing Based Characterization of Extensively Drug-Resistant Mycobacterium tuberculosis Isolates from Pakistan

    KAUST Repository

    Ali, Asho

    2015-02-26

    Improved molecular diagnostic methods for detecting drug resistance in Mycobacterium tuberculosis (MTB) strains are required. Resistance to first- and second-line anti-tuberculous drugs has been associated with single nucleotide polymorphisms (SNPs) in particular genes. However, these SNPs can vary between MTB lineages, so local data are required to describe different strain populations. We used whole genome sequencing (WGS) to characterize 37 extensively drug-resistant (XDR) MTB isolates from Pakistan and investigated 40 genes associated with drug resistance. Rifampicin resistance was attributable to SNPs in the rpoB hot-spot region. Isoniazid resistance was most commonly associated with the katG codon 315 (92%) mutation, followed by inhA S94A (8%); however, one strain did not have SNPs in katG, inhA or oxyR-ahpC. All strains were pyrazinamide resistant but only 43% had pncA SNPs. Ethambutol resistant strains predominantly had embB codon 306 (62%) mutations, but additional SNPs at embB codons 406, 378 and 328 were also present. Fluoroquinolone resistance was associated with gyrA codons 91-94 in 81% of strains; four strains had only gyrB mutations, while others did not have SNPs in either gyrA or gyrB. Streptomycin resistant strains had mutations in ribosomal RNA genes: rpsL codon 43 (42%), the rrs 500 region (16%), and gidB (34%), while six strains did not have mutations in any of these genes. Amikacin/kanamycin/capreomycin resistance was associated with SNPs in rrs at nt1401 (78%) and nt1484 (3%), except in seven (19%) strains. We estimate that if only the common hot-spot region targets of current commercial assays were used, the concordance between phenotypic and genotypic testing for these XDR strains would vary between rifampicin (100%), isoniazid (92%), fluoroquinolones (81%), aminoglycosides (78%) and ethambutol (62%), while pncA sequencing would provide genotypic resistance in less than half the isolates. This work highlights the importance of expanded
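The genotype-to-phenotype concordance analysis described above amounts to looking up observed mutations in a catalogue of resistance-conferring SNPs. A minimal sketch follows; the toy catalogue and its entries are illustrative only (real pipelines consult large curated databases):

```python
# Hypothetical mini-catalogue mapping a (gene, mutation) pair to the drug
# whose resistance it is taken to confer. Entries are illustrative.
CATALOGUE = {
    ("rpoB", "S450L"): "rifampicin",
    ("katG", "S315T"): "isoniazid",
    ("inhA", "S94A"): "isoniazid",
    ("gyrA", "D94G"): "fluoroquinolones",
    ("rrs", "A1401G"): "amikacin/kanamycin/capreomycin",
}

def predict_resistance(mutations):
    """Return the set of drugs predicted resistant from observed (gene, mutation) pairs."""
    return {CATALOGUE[m] for m in mutations if m in CATALOGUE}

hits = predict_resistance([("katG", "S315T"), ("gyrA", "D94G"), ("embB", "M306V")])
# embB M306V is absent from this toy catalogue, so only two drugs are flagged;
# such misses are exactly the concordance gaps the abstract quantifies.
```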

  17. Extensions to Regret-based Decision Curve Analysis: An application to hospice referral for terminal patients

    Directory of Open Access Journals (Sweden)

    Tsalatsanis Athanasios

    2011-12-01

    Full Text Available Abstract Background Despite the well documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or with delay. Decision systems to improve the hospice referral process are sorely needed. Methods We present a novel theoretical framework based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret-based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. To enable patients and physicians to make these complex decisions in real time, we developed an easily accessible web-based decision support system available at the point of care. Results The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of the threshold probability at which he/she is indifferent between continuation of treatment and hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. Conclusions We present a theoretical
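The threshold probability at the heart of the second step can be sketched from the classical indifference condition of decision curve analysis, pt/(1 - pt) = harm/benefit. The function names and utility values below are illustrative, not the paper's implementation:

```python
def threshold_probability(benefit, harm):
    """Outcome probability at which the decision maker is indifferent.

    Derived from the decision-curve indifference condition
    pt / (1 - pt) = harm / benefit, where `benefit` and `harm` stand in
    for the patient's elicited utilities.
    """
    odds = harm / benefit
    return odds / (1.0 + odds)

def recommend(p_outcome, benefit, harm):
    """Refer to hospice when the predicted probability exceeds the threshold."""
    if p_outcome >= threshold_probability(benefit, harm):
        return "hospice referral"
    return "continue treatment"

# Equal perceived benefit and harm gives the familiar 0.5 threshold.
```

In the framework above, the predicted probability would come from the SUPPORT model rather than being supplied directly.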

  18. Research on technique of wavefront retrieval based on Foucault test

    Science.gov (United States)

    Yuan, Lvjun; Wu, Zhonghua

    2010-05-01

    During fine grinding of the best-fit sphere and the initial stage of polishing, the surface error of large-aperture aspheric mirrors is too large to test with a common interferometer. The Foucault test is widely used in fabricating large-aperture mirrors. However, the optical path is seriously disturbed by air turbulence, and changes in the light and dark zones cannot be identified, which often impairs judgement and leads to mistakes in diagnosing the surface error of the whole mirror. To solve this problem, this research presents wavefront retrieval based on the Foucault test through digital image processing and quantitative calculation. First, a reliable Foucault image is obtained by collecting a variety of images with a CCD and averaging them to eliminate air turbulence. Second, gray values are converted into surface error values through principle derivation, mathematical modeling and software programming. Third, the linear deviation introduced by defocus is removed by the least-squares method to obtain the real surface error. Finally, from the real surface error, the wavefront map, gray contour map and corresponding pseudo-color contour map are plotted. The experimental results indicate that the three-dimensional wavefront map and two-dimensional contour map accurately and intuitively show the surface error over the whole mirror under test, and are helpful for grasping the surface error as a whole. The technique can be used to guide the fabrication of large-aperture, long-focal-length mirrors during grinding and the initial stage of polishing the aspheric surface, improving fabrication efficiency and precision greatly.
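The frame-averaging and least-squares steps of the procedure can be sketched as follows. The gray-to-error conversion factor is a placeholder for the relation the paper derives from optical principles:

```python
import numpy as np

def surface_error(frames, scale=1.0):
    """Average a stack of Foucault frames and remove the best-fit linear term.

    frames : array of shape (n, h, w), gray images captured by the CCD
    scale  : hypothetical gray-value-to-error conversion factor standing in
             for the paper's derived relation
    """
    mean_img = np.mean(frames, axis=0)          # averaging suppresses air turbulence
    err = scale * mean_img                      # gray value -> surface error
    h, w = err.shape
    y, x = np.mgrid[0:h, 0:w]
    # Least-squares fit of a plane a*x + b*y + c (the defocus/tilt term) ...
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(h * w)])
    coeff, *_ = np.linalg.lstsq(A, err.ravel(), rcond=None)
    # ... and subtraction of that plane leaves the real surface error.
    return err - (A @ coeff).reshape(h, w)

frames = np.random.default_rng(1).normal(0.0, 0.01, (32, 64, 64))
residual = surface_error(frames)
```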

  19. A formal model for integrity protection based on DTE technique

    Institute of Scientific and Technical Information of China (English)

    JI Qingguang; QING Sihan; HE Yeping

    2006-01-01

    In order to provide integrity protection for a secure operating system satisfying the structured protection class requirements, a DTE-technique-based formal model of integrity protection is proposed after the implications and structures of the integrity policy have been analyzed in detail. The model consists of basic rules for configuring DTE and a state transition model, which respectively instruct how domains and types are set, and how security invariants obtained from the initial configuration are maintained during system transitions. Ten invariants are introduced in this model; in particular, some new invariants dealing with information flow are proposed, and their relations with corresponding invariants described in the literature are also discussed. Thirteen transition rules with well-formed atomicity are presented in a well-operational manner. The basic security theorems corresponding to these invariants and transition rules are proved. The rationale for proposing the invariants is further annotated by analyzing the differences between this model and those described in the literature. Finally, future work is outlined; in particular, it is pointed out that it is possible to use this model to analyze SE-Linux security.

  20. Image content authentication technique based on Laplacian Pyramid

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper proposes an image content authentication technique based on the Laplacian pyramid to verify the authenticity of image content. First, the image is decomposed into a Laplacian pyramid before the transformation. Next, the smooth and detail properties of the original image are analyzed according to the Laplacian pyramid, and the properties are classified and encoded to obtain the corresponding characteristic values. Then, the signature derived from the encrypted characteristic values is embedded in the original image as a watermark. After reception, the characteristic values of the received image are compared with the watermark extracted from the image. The algorithm automatically identifies whether the content has been tampered with by means of morphological filtering, and presents the location of the tampering at the same time. Experimental results show that the proposed authentication algorithm can effectively detect the event and location when the original image content is tampered with. Moreover, it can tolerate some distortions produced by compression, filtering and noise degradation.
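The decomposition at the core of the scheme can be sketched as follows. This toy version uses box-filter down/upsampling rather than the Gaussian kernels of a production Laplacian pyramid, but it shows the detail-bands-plus-smooth-residual structure from which characteristic values would be computed:

```python
import numpy as np

def downsample(img):
    """Simple 2x2 box-filter downsampling (stand-in for Gaussian blur + decimate)."""
    h, w = img.shape
    return img[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(img, shape):
    """Nearest-neighbour upsampling back to a target shape."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels=3):
    """Decompose an image into detail bands plus a final smooth residual."""
    bands, current = [], img.astype(float)
    for _ in range(levels):
        smaller = downsample(current)
        bands.append(current - upsample(smaller, current.shape))  # detail band
        current = smaller
    bands.append(current)  # smooth residual
    return bands

def reconstruct(bands):
    """Invert the decomposition by adding each detail band back in."""
    current = bands[-1]
    for detail in reversed(bands[:-1]):
        current = detail + upsample(current, detail.shape)
    return current

img = np.arange(64.0).reshape(8, 8)
pyr = laplacian_pyramid(img)
```

Because each level stores exactly what the upsampled coarser level misses, the decomposition is lossless, which is what makes it usable for authentication.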

  1. A response surface methodology based damage identification technique

    International Nuclear Information System (INIS)

    Response surface methodology (RSM) is a combination of statistical and mathematical techniques to represent the relationship between the inputs and outputs of a physical system by explicit functions. This methodology has been widely employed in many applications such as design optimization, response prediction and model validation. But so far the literature related to its application in structural damage identification (SDI) is scarce. Therefore this study attempts to present a systematic SDI procedure comprising four sequential steps of feature selection, parameter screening, primary response surface (RS) modeling and updating, and reference-state RS modeling with SDI realization using the factorial design (FD) and the central composite design (CCD). The last two steps imply the implementation of inverse problems by model updating in which the RS models substitute the FE models. The proposed method was verified against a numerical beam, a tested reinforced concrete (RC) frame and an experimental full-scale bridge with the modal frequency being the output responses. It was found that the proposed RSM-based method performs well in predicting the damage of both numerical and experimental structures having single and multiple damage scenarios. The screening capacity of the FD can provide quantitative estimation of the significance levels of updating parameters. Meanwhile, the second-order polynomial model established by the CCD provides adequate accuracy in expressing the dynamic behavior of a physical system
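The second-order polynomial RS model the CCD supplies design points for can be sketched, for two factors, as an ordinary least-squares fit. A toy response function stands in for the FE model that the RS model substitutes:

```python
import numpy as np

def fit_quadratic_rs(X, y):
    """Fit a full second-order response surface in two factors by least squares:
    y ~ b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coeff, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeff

def predict(coeff, x1, x2):
    """Evaluate the fitted response surface at a new point."""
    return coeff @ np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

# Central-composite-style design points for two coded factors:
# factorial corners, axial points, and a center point.
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414], [0, 0]])

def true_response(x1, x2):           # toy stand-in for the FE model
    return 3.0 + 2.0 * x1 - x2 + 0.5 * x1 * x2 + x1**2

y = np.array([true_response(a, b) for a, b in pts])
coeff = fit_quadratic_rs(pts, y)
```

In the SDI procedure, model updating would then minimise the discrepancy between such RS predictions and measured modal frequencies instead of re-running the FE model.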

  2. WORMHOLE ATTACK MITIGATION IN MANET: A CLUSTER BASED AVOIDANCE TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Subhashis Banerjee

    2014-01-01

    Full Text Available A Mobile Ad-Hoc Network (MANET is a self-configuring, infrastructure-less network of mobile devices connected by wireless links. Loopholes like the wireless medium, lack of a fixed infrastructure, dynamic topology, rapid deployment practices, and the hostile environments in which they may be deployed make MANETs vulnerable to a wide range of security attacks, and the Wormhole attack is one of them. During this attack a malicious node captures packets from one location in the network and tunnels them to another colluding malicious node at a distant point, which replays them locally. This paper presents a cluster-based Wormhole attack avoidance technique. The concept of hierarchical clustering with a novel hierarchical 32-bit node addressing scheme is used for avoiding the attacking path during the route discovery phase of the DSR protocol, which is considered as the underlying routing protocol. A method for pinpointing the location of the wormhole nodes in the case of an exposed attack is also given.
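The hierarchical addressing idea can be illustrated with a toy scheme (the paper's actual 32-bit layout is not reproduced here): pack one cluster index per hierarchy level into the address, so that shared prefixes reveal common cluster ancestry during route discovery:

```python
def make_address(level_ids, bits_per_level=8):
    """Pack up to four 8-bit cluster indices into one 32-bit node address.

    Illustrative layout only: highest-order byte is the top-level cluster,
    lowest-order byte identifies the node within its leaf cluster.
    """
    assert len(level_ids) <= 32 // bits_per_level
    addr = 0
    for cid in level_ids:
        assert 0 <= cid < (1 << bits_per_level)
        addr = (addr << bits_per_level) | cid
    return addr

def same_cluster(addr_a, addr_b, level, bits_per_level=8, levels=4):
    """True when two addresses share the same prefix down to `level`."""
    shift = (levels - level) * bits_per_level
    return (addr_a >> shift) == (addr_b >> shift)

a = make_address([1, 2, 3, 4])   # node 4 in sub-cluster 3 of cluster 2 of cluster 1
b = make_address([1, 2, 9, 7])   # shares the first two hierarchy levels with `a`
```

A route whose next hop claims an address outside the expected prefix neighbourhood can then be treated as suspect, which is the flavour of check such schemes enable.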

  3. Orientation of student entrepreneurial practices based on administrative techniques

    Directory of Open Access Journals (Sweden)

    Héctor Horacio Murcia Cabra

    2005-07-01

    Full Text Available As part of the second phase of the research project «Application of a creativity model to update the teaching of administration in Colombian agricultural entrepreneurial systems», it was decided to reinforce the planning and execution skills of students of the Agricultural Business Administration Faculty of La Salle University, with special attention to those finishing their studies. The plan of action was initiated in the second semester of 2003. It was initially defined as a model of entrepreneurial strengthening based on a coherent methodology that included the most recent administration and management techniques. Later, the applicability of this model was tested in some organizations of the agricultural sector that had asked for support in their planning processes. Through an action-research process the methodology was redefined in order to arrive at a final model that could be used by faculty students and graduates. The results obtained were applied to the teaching of the Entrepreneurial Laboratory to ninth-semester students with the hope of improving administrative support to agricultural enterprises. More than 100 students and 200 agricultural producers applied this procedure between June 2003 and July 2005. The methodology used and the results obtained are presented in this article.

  4. A new membrane-based crystallization technique: tests on lysozyme

    Science.gov (United States)

    Curcio, Efrem; Profio, Gianluca Di; Drioli, Enrico

    2003-01-01

    The great importance of protein science in both industrial and scientific fields, in conjunction with the intrinsic difficulty of growing macromolecular crystals, stimulates the development of new observations and ideas that can be useful in initiating more systematic studies using novel approaches. In this regard, an innovative technique, based on the employment of microporous hydrophobic membranes to promote the formation of lysozyme crystals from supersaturated solutions, is introduced in this work. Operational principles and possible advantages, both in terms of controlled extraction of solvent by acting on the concentration of the stripping solution and of reduced induction times, are outlined. Theoretical developments and experimental results concerning the mass transfer, in the vapour phase, through the membrane are presented, as well as the results from X-ray diffraction to 1.7 Å resolution of the obtained lysozyme crystals, using NaCl as the crystallizing agent and sodium acetate as the buffer. Crystals were found to be tetragonal with unit cell dimensions a = b = 79.1 Å and c = 37.9 Å; the overall Rmerge on intensities in the resolution range from 25 to 1.7 Å was, in the best case, 4.4%.

  5. Security extension for the Canetti-Krawczyk model in identity-based systems

    Institute of Scientific and Technical Information of China (English)

    LI Xinghua; MA Jianfeng; SangJae Moon

    2005-01-01

    The Canetti-Krawczyk (CK) model is a formalism for the analysis of key-exchange protocols, which can guarantee many security properties for protocols proved secure in this model. But we find that this model lacks the ability to guarantee key generation center (KGC) forward secrecy, an important security property for identity-based key-agreement protocols. The essential reason for this weakness is that the model does not fully consider the attacker's capabilities. In this paper, the CK model is accordingly extended with a new additional attacker capability, KGC corruption in identity-based systems, which enables it to support KGC forward secrecy.

  6. Structuring Task-based Interaction through Collaborative Learning Techniques (2)

    Institute of Scientific and Technical Information of China (English)

    William Littlewood

    2004-01-01

    Techniques for collaborative learning: In this section the focus will move from broad strategies to specific techniques (often also called "structures") through which the strategies can be realized. It gives a selection of techniques which have proved (in my own experience as well as that of others) particularly useful in providing contexts for practice, exploration and/or interaction in the second language classroom.

  7. Extensive simulation studies on the reconstructed image resolution of a position sensitive detector based on pixelated CdTe crystals

    CERN Document Server

    Zachariadou, K; Kaissas, I; Seferlis, S; Lambropoulos, C; Loukas, D; Potiriadis, C

    2011-01-01

    We present results on the reconstructed image resolution of a position-sensitive radiation instrument (COCAE), based on extensive simulation studies. The reconstructed image resolution has been investigated over a wide range of incident photon energies emitted by point-like sources located at different source-to-detector distances, on and off the detector's symmetry axis. The ability of the detector to distinguish multiple radioactive sources observed simultaneously is investigated by simulating point-like sources of different energies located on and off the detector's symmetry axis and at different positions.

  8. 78 FR 7654 - Extension of Exemptions for Security-Based Swaps

    Science.gov (United States)

    2013-02-04

    ... Swaps, Release No. 33-9231 (Jul. 1, 2011), 76 FR 40605 (Jul. 11, 2011) (``Interim Final Rules Adopting... Security-Based Swaps Issued By Certain Clearing Agencies, Release No. 33-9308 (Mar. 30, 2012), 77 FR 20536... Comptroller of the Currency, 100.0% of credit default swap positions held by U.S. commercial banks and...

  9. Extension of information entropy-based measures in incomplete information systems

    Institute of Scientific and Technical Information of China (English)

    LI Ren-pu; HUANG Dao; GAO Mao-ting

    2005-01-01

    Studying the concepts and operations of rough set theory from its information view helps in understanding the essence of the theory. In this paper we address knowledge expression and knowledge reduction in incomplete information systems from the information view of rough set theory. First, by extending information entropy-based measures in complete information systems, two new measures, incomplete entropy and incomplete conditional entropy, are presented for incomplete information systems. Then, based on these measures, the problem of knowledge reduction in incomplete information systems is analyzed, and reduct definitions for the incomplete information system and the incomplete decision table are proposed respectively. Finally, the reduct definitions based on incomplete entropy and those based on the similarity relation are compared: two equivalent relationships between them are proved by theorems, and an inequivalent relationship between them is illustrated by an example. This work extends the research of rough set theory from the information view to incomplete information systems and establishes the theoretical basis for seeking efficient algorithms of knowledge acquisition in incomplete information systems.
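The flavour of an entropy measure over an incomplete information system can be sketched with tolerance classes, where objects count as indistinguishable when their known attribute values agree. The '*' missing-value convention, the toy table, and this particular entropy formula are illustrative; the paper's exact definitions may differ:

```python
from math import log2

def tolerance_class(table, i, attrs):
    """Objects indistinguishable from object i: values agree wherever
    neither is the missing value '*'."""
    return [j for j in range(len(table))
            if all(table[i][a] == table[j][a] or '*' in (table[i][a], table[j][a])
                   for a in attrs)]

def incomplete_entropy(table, attrs):
    """Entropy-style measure over tolerance classes: the average of
    -log2(|S_i| / n), where S_i is object i's tolerance class."""
    n = len(table)
    return -sum(log2(len(tolerance_class(table, i, attrs)) / n)
                for i in range(n)) / n

# Toy incomplete information system: rows are objects, columns attributes.
table = [["1", "0"], ["1", "*"], ["0", "0"]]
h = incomplete_entropy(table, [0, 1])
```

Dropping an attribute from `attrs` can only enlarge tolerance classes and hence lower this measure, which is the monotonicity that reduct definitions exploit.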

  10. Truth-value semantics and functional extensions for classical logic of partial terms based on equality

    CERN Document Server

    Parlamento, Franco

    2011-01-01

    We develop a bottom-up approach to truth-value semantics for classical logic of partial terms based on equality and apply it to prove the conservativity of the addition of partial description and partial selection functions, independently of any strictness assumption.

  11. An extensible agent architecture for a competitive market-based allocation of consumer attention space

    NARCIS (Netherlands)

    Hoen, P.J. 't; Bohte, S.M.; Gerding, E.H.; La Poutré, J.A.

    2002-01-01

    A competitive distributed recommendation mechanism is introduced based on adaptive software agents for efficiently allocating the ``customer attention space'', or banners. In the example of an electronic shopping mall, the task of correctly profiling and analyzing the customers is delegated to the

  12. Water-based oligochitosan and nanowhisker chitosan as potential food preservatives for shelf-life extension of minced pork.

    Science.gov (United States)

    Chantarasataporn, Patomporn; Tepkasikul, Preenapha; Kingcha, Yutthana; Yoksan, Rangrong; Pichyangkura, Rath; Visessanguan, Wonnop; Chirachanchai, Suwabun

    2014-09-15

    Water-based chitosans in the forms of oligochitosan (OligoCS) and nanowhisker chitosan (CSWK) are proposed as novel food preservatives based on a minced pork model study. The high surface area and positive charge of OligoCS and CSWK over the neutral pH range (pH 5-8) lead to inhibition of Gram-positive (Staphylococcus aureus, Listeria monocytogenes, and Bacillus cereus) and Gram-negative microbes (Salmonella enteritidis and Escherichia coli O157:H7). In the minced pork model, OligoCS effectively performs as a food preservative for shelf-life extension, as shown by the retardation of microbial growth, biogenic amine formation and lipid oxidation during storage. OligoCS also largely prevents degradation of the myosin heavy chain protein, as observed by electrophoresis. The present work points out that water-based chitosan with its unique morphology not only shows significant antimicrobial activity but also maintains meat quality with an extension of shelf-life, and thus has the potential to be used as a food preservative.

  13. Park-based and zero sequence-based relaying techniques with application to transformers protection

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, G.; Arboleya, P.; Gomez-Aleixandre, J. [University of Oviedo (Spain). Dept. of Electrical Engineering

    2004-09-01

    Two relaying techniques for protecting power transformers are presented and discussed. Differential relaying is very often used for this purpose, so a comparison between the two proposed techniques and conventional differential relaying is also presented. The first technique, based on measurement of the zero sequence current within a delta winding, performs best in multiwinding transformers, since only measurement of the coil currents is needed; great simplicity is thus achieved. The second is based on the differential procedure, but its analysis of asymmetries of the plot in Park's plane avoids problems related to spectral analysis in conventional differential relaying. The technique is justified from the analysis of symmetrical components. Misoperation of conventional differential relaying has been observed in some cases as a function of switching instant and fault location. This issue is discussed in the paper, and a statistical analysis of a large number of laboratory tests, in which both factors were controlled, is presented. In conclusion, both proposed relaying techniques succeed in protecting the transformer. Additionally, the Park-based relay exhibits three characteristics of great importance: fastest performance, robustness and simplicity of formulation. (author)
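The Park's-plane analysis rests on the Clarke/Park transformation of the three phase currents. A minimal amplitude-invariant sketch follows (not the paper's relay logic): a balanced, healthy set maps to a fixed point (id ≈ 1, iq ≈ 0) in the rotating frame, while asymmetries trace characteristic figures around it:

```python
import numpy as np

def park(ia, ib, ic, theta):
    """Clarke + Park transformation of three-phase currents to the rotating
    d-q frame (amplitude-invariant form)."""
    # Clarke: three-phase -> stationary alpha/beta frame
    i_alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    i_beta = (2.0 / 3.0) * (np.sqrt(3) / 2.0) * (ib - ic)
    # Park: rotate by the synchronous angle theta
    i_d = i_alpha * np.cos(theta) + i_beta * np.sin(theta)
    i_q = -i_alpha * np.sin(theta) + i_beta * np.cos(theta)
    return i_d, i_q

# Balanced 50 Hz three-phase set over two cycles.
t = np.linspace(0.0, 0.04, 400)
w = 2 * np.pi * 50
ia = np.cos(w * t)
ib = np.cos(w * t - 2 * np.pi / 3)
ic = np.cos(w * t + 2 * np.pi / 3)
i_d, i_q = park(ia, ib, ic, w * t)
```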

  14. Adoption of farm-based irrigation water-saving techniques in the Guanzhong Plain, China

    NARCIS (Netherlands)

    Tang, Jianjun; Folmer, Henk; Xue, Jianhong

    2016-01-01

    This article analyses the adoption of farm-based irrigation water-saving techniques, based on a cross-sectional data set of 357 farmers in the Guanzhong Plain, China. Approximately 83% of the farmers use at least one farm-based water-saving technique. However, the traditional, inefficient techniques bor...

  15. Extension of a Kolmogorov Atmospheric Turbulence Model for Time-Based Simulation Implementation

    Science.gov (United States)

    McMinn, John D.

    1997-01-01

    The development of any super/hypersonic aircraft requires the interaction of a wide variety of technical disciplines to maximize vehicle performance. For flight and engine control system design and development on this class of vehicle, realistic mathematical simulation models of atmospheric turbulence, including winds and the varying thermodynamic properties of the atmosphere, are needed. A model which has been tentatively selected by a government/industry group of flight and engine/inlet controls representatives working on the High Speed Civil Transport is one based on the Kolmogorov spectrum function. This report compares the Dryden and Kolmogorov turbulence forms, and describes enhancements that add functionality to the selected Kolmogorov model. These added features are: an altitude variation of the eddy dissipation rate based on Dryden data, the mapping of the eddy dissipation rate database onto a regular latitude and longitude grid, a method to account for flight at large vehicle attitude angles, and a procedure for transitioning smoothly across turbulence segments.
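    The spectral difference between the two turbulence forms can be checked numerically. The sketch below assumes the standard longitudinal Dryden PSD, sigma^2 (2L/pi) / (1 + (L*Omega)^2), and a generic Kolmogorov inertial-range form proportional to eps^(2/3) * Omega^(-5/3) with an illustrative (uncalibrated) constant, then compares their high-frequency log-log slopes (-2 versus -5/3).

```python
import math

def dryden_psd(omega, sigma=1.0, L=762.0):
    """Standard longitudinal Dryden gust PSD (spatial-frequency form)."""
    return sigma**2 * (2.0 * L / math.pi) / (1.0 + (L * omega)**2)

def kolmogorov_psd(omega, eps=1e-4, c=0.15):
    """Kolmogorov inertial-range form ~ eps^(2/3) * omega^(-5/3);
    c is an illustrative constant, not a calibrated value."""
    return c * eps**(2.0 / 3.0) * omega**(-5.0 / 3.0)

def loglog_slope(f, w1, w2):
    """Numerical slope of f on a log-log plot between w1 and w2."""
    return (math.log(f(w2)) - math.log(f(w1))) / (math.log(w2) - math.log(w1))
```

    At high frequencies the Dryden form rolls off one third of a decade per decade faster than Kolmogorov, which is the main modelling difference the report discusses.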

  16. Extension of direct displacement-based design methodology for bridges to account for higher mode effects

    OpenAIRE

    Kappos, A. J.; Gkatzogias, K.I.; Gidaris, I.G.

    2013-01-01

    An improvement is suggested to the direct displacement-based design (DDBD) procedure for bridges to account for higher mode effects, the key idea being not only the proper prediction of a target-displacement profile through the effective mode shape (EMS) method (wherein all significant modes are considered), but also the proper definition of the corresponding peak structural response. The proposed methodology is then applied to an actual concrete bridge wherein the different pier heights and ...

  17. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings

    OpenAIRE

    Hemant Ghayvat; Subhas Mukhopadhyay; Xiang Gui; Nagender Suryadevara

    2015-01-01

    Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home...

  18. A GIS extension model to calculate urban heat island intensity based on urban geometry

    OpenAIRE

    Nakata, C. M.; Souza, Léa Cristina Lucas; Rodrigues, Daniel Souto

    2015-01-01

    This paper presents a simulation model, which was incorporated into a Geographic Information System (GIS), in order to calculate the maximum intensity of urban heat islands based on urban geometry data. The methodology of this study stands on a theoretical-numerical basis (Oke's model), followed by the study and selection of existing GIS tools, the design of the calculation model, the incorporation of the resulting algorithm into the GIS platform and the application of the tool, developed ...
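    Oke's model reduces the maximum heat-island intensity to a one-line function of the street-canyon aspect ratio H/W. The sketch below assumes the commonly cited empirical form, delta-T_max = 7.54 + 3.97 ln(H/W) in degrees Celsius, which such a GIS extension would evaluate per street segment.

```python
import math

def max_uhi_intensity(h, w):
    """Oke's empirical relation for maximum urban heat island
    intensity (deg C) as a function of canyon height H and width W.
    Deeper, narrower canyons trap more heat."""
    return 7.54 + 3.97 * math.log(h / w)

# Example: a square canyon (H/W = 1) versus a deep canyon (H/W = 2)
square = max_uhi_intensity(10.0, 10.0)
deep = max_uhi_intensity(20.0, 10.0)
```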

  19. Numerical Techniques for Simulation of Tsunami Based on Finite Elements

    OpenAIRE

    Watanabe, Masaji; Liu, Ying; Wang, Ming Jun

    2006-01-01

    Numerical techniques to simulate tsunamis are described. Partial differential equations are reduced to a system of ordinary differential equations to which appropriate numerical solvers can be applied. The techniques are illustrated with an example in which tsunami due to an earthquake is simulated.
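    The reduction of a PDE to a system of ODEs (the method of lines) can be sketched on a toy 1D advection problem. This is a generic illustration with first-order upwind differences and an RK4 solver, not the authors' tsunami model or finite element discretization.

```python
def semidiscretize(c, dx, n):
    """Reduce u_t + c*u_x = 0 to du/dt = f(u) with first-order
    upwind differences on a periodic grid (method of lines)."""
    def f(u):
        return [-c * (u[i] - u[i - 1]) / dx for i in range(n)]
    return f

def rk4_step(f, u, dt):
    """One classical Runge-Kutta 4 step for du/dt = f(u)."""
    k1 = f(u)
    k2 = f([ui + 0.5 * dt * ki for ui, ki in zip(u, k1)])
    k3 = f([ui + 0.5 * dt * ki for ui, ki in zip(u, k2)])
    k4 = f([ui + dt * ki for ui, ki in zip(u, k3)])
    return [ui + dt / 6.0 * (a + 2 * b + 2 * c_ + d)
            for ui, a, b, c_, d in zip(u, k1, k2, k3, k4)]

# A square pulse advected at speed c = 1 over t = 40
n, dx, c = 100, 1.0, 1.0
f = semidiscretize(c, dx, n)
u = [1.0 if 10 <= i < 20 else 0.0 for i in range(n)]
for _ in range(400):
    u = rk4_step(f, u, 0.1)
```

    Mass is conserved by the semidiscretization, and the pulse centroid moves at the wave speed, which is what one checks before replacing the toy right-hand side with a finite element tsunami operator.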

  20. A Technique for Volumetric CSG Based on Morphology

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Christensen, Niels Jørgen

    2001-01-01

    In this paper, a new technique for volumetric CSG is presented. The technique requires the input volumes to correspond to solids which fulfill a voxelization suitability criterion. Assume the CSG operation is union. The volumetric union of two such volumes is defined in terms of the voxelization...
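    On voxel grids that store a signed distance to the solid boundary (negative inside), CSG operations reduce to per-voxel min/max. This is a minimal sketch under that distance-field assumption, which is one common voxel representation and not necessarily the paper's exact voxelization criterion.

```python
import math

def sphere_sdf(cx, cy, cz, r, n):
    """Voxelize a sphere as a signed distance grid (negative inside)."""
    return {(x, y, z): math.dist((x, y, z), (cx, cy, cz)) - r
            for x in range(n) for y in range(n) for z in range(n)}

def csg(a, b, op):
    """CSG on signed distance voxel grids: union is a per-voxel min,
    intersection a max, difference a max against the negated operand."""
    f = {"union": min,
         "intersection": max,
         "difference": lambda va, vb: max(va, -vb)}[op]
    return {k: f(a[k], b[k]) for k in a}

# Two overlapping spheres on a 16^3 grid
a = sphere_sdf(5, 8, 8, 4.0, 16)
b = sphere_sdf(11, 8, 8, 4.0, 16)
union = csg(a, b, "union")
inter = csg(a, b, "intersection")
diff = csg(a, b, "difference")
```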

  1. Evidence-based surgical techniques for caesarean section

    DEFF Research Database (Denmark)

    Aabakke, Anna J M; Secher, Niels Jørgen; Krebs, Lone

    2014-01-01

    Caesarean section (CS) is a common surgical procedure, and in Denmark 21% of deliveries are by CS. There is an increasing amount of scientific evidence to support the different surgical techniques used at CS. This article reviews the literature regarding CS techniques. There is still a lack of evi...

  2. A Lossless Data Hiding Technique based on AES-DWT

    Directory of Open Access Journals (Sweden)

    Gustavo Fernández Torres

    2012-09-01

    Full Text Available In this paper we propose a new data hiding technique. The new technique uses steganography and cryptography on images with a size of 256x256 pixels and an 8-bit grayscale format. There are design restrictions such as a fixed-size cover image, and reconstruction without error of the hidden image. The steganography technique uses a Haar-DWT (Discrete Wavelet Transform) with hard thresholding and an LSB (Least Significant Bit) technique on the cover image. The algorithms used for compressing and ciphering the secret image are lossless JPG and AES, respectively. The proposed technique generates a stego image that provides two layers of security and is robust against attacks. Results are reported for different threshold levels in terms of PSNR.
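    The LSB embedding step can be sketched on a flat list of 8-bit pixels. This generic illustration shows only the bit-plane substitution; the full pipeline above additionally applies the Haar-DWT, lossless compression and AES ciphering stages.

```python
def embed_lsb(cover, secret_bits):
    """Write secret bits into the least significant bit of each
    8-bit cover pixel (flat list). Each pixel changes by at most 1,
    which is why LSB embedding is visually imperceptible."""
    stego = list(cover)
    for i, bit in enumerate(secret_bits):
        stego[i] = (stego[i] & 0xFE) | bit
    return stego

def extract_lsb(stego, n_bits):
    """Recover the first n_bits hidden bits from the stego pixels."""
    return [p & 1 for p in stego[:n_bits]]

cover = list(range(256)) * 4          # toy 32x32 "image", flattened
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, bits)
```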

  3. Anterolateral Ligament Reconstruction Technique: An Anatomic-Based Approach.

    Science.gov (United States)

    Chahla, Jorge; Menge, Travis J; Mitchell, Justin J; Dean, Chase S; LaPrade, Robert F

    2016-06-01

    Restoration of anteroposterior laxity after an anterior cruciate ligament reconstruction has been predictable with traditional open and endoscopic techniques. However, anterolateral rotational stability has been difficult to achieve in a subset of patients, even with appropriate anatomic techniques. Therefore, differing techniques have attempted to address this rotational laxity by augmenting or reconstructing lateral-sided structures about the knee. In recent years, there has been a renewed interest in the anterolateral ligament as a potential contributor to residual anterolateral rotatory instability in anterior cruciate ligament-deficient patients. Numerous anatomic and biomechanical studies have been performed to further define the functional importance of the anterolateral ligament, highlighting the need for surgical techniques to address these injuries in the unstable knee. This article details our technique for an anatomic anterolateral ligament reconstruction using a semitendinosus tendon allograft. PMID:27656361

  4. Extensive validation of the code FUROM based on the IFPE database

    International Nuclear Information System (INIS)

    The fuel modelling code FUROM (FUel ROd Model), suitable for calculating the normal operating condition behaviour of PWR and WWER fuels, has been developed at AEKI over several years. The validation of the code has so far been based on the individual calculation of many relevant experiments. This, however, was a time-consuming process that could give rise to errors both at the input and at the comparison stage. A new methodology is implemented to build up a uniform database from the IFPE data and run automated validation tasks depending on the model or phenomenon of interest. The general problems encountered and some results are presented here. (authors)

  5. Development of a Flexible and Extensible Computer-based Simulation Platform for Healthcare Students.

    Science.gov (United States)

    Bindoff, Ivan; Cummings, Elizabeth; Ling, Tristan; Chalmers, Leanne; Bereznicki, Luke

    2015-01-01

    Accessing appropriate clinical placement positions for all health profession students can be expensive and challenging. Increasingly, simulation, in a range of modes, is being used to enhance student learning and prepare students for clinical placement. Commonly these simulations focus on the use of simulated patient mannequins, which are typically presented as single-event scenarios, are difficult to organise, and usually include only a single healthcare profession. Computer-based simulation is relatively under-researched and under-utilised but is beginning to demonstrate potential benefits. This paper describes the development and trialling of an entirely virtual 3D simulated environment for inter-professional student education. PMID:25676952

  6. Carbon Storage in an Extensive Karst-distributed Region of Southwestern China based on Multiple Methods

    Science.gov (United States)

    Guo, C.; Wu, Y.; Yang, H.; Ni, J.

    2015-12-01

    Accurate estimation of carbon storage is crucial to better understand the processes of global and regional carbon cycles and to more precisely project ecological and economic scenarios for the future. Southwestern China has a broad and continuous distribution of karst landscapes with harsh and fragile habitats, which can lead to rocky desertification, an ecological disaster that has significantly hindered vegetation succession and economic development in karst regions of southwestern China. In this study we evaluated the carbon storage in eight political divisions of southwestern China based on four methods: forest inventory, carbon density based on field investigations, the CASA model driven by remote sensing data, and the BIOME4/LPJ global vegetation models driven by climate data. The results show that: (1) The total vegetation carbon storage (including agricultural ecosystems) is 6763.97 Tg C based on the carbon density, and the soil organic carbon (SOC) storage (above 20 cm depth) is 12475.72 Tg C. Sichuan Province (including Chongqing) possesses the highest carbon storage in both vegetation and soil (1736.47 Tg C and 4056.56 Tg C, respectively) among the eight political divisions because of its higher carbon density and larger distribution area. The vegetation carbon storage in Hunan Province is the smallest (565.30 Tg C), and the smallest SOC storage (1127.40 Tg C) is in Guangdong Province; (2) Based on forest inventory data, the total aboveground carbon storage in woody vegetation is 2103.29 Tg C. The carbon storage in Yunnan Province (819.01 Tg C) is significantly higher than in other areas, with tropical rainforests and seasonal forests in Yunnan contributing the largest share of the woody vegetation carbon storage (accounting for 62.40% of the total). (3) The net primary production (NPP) simulated by the CASA model is 68.57 Tg C/yr, while the forest NPP in the non-karst region (accounting for 72.50% of the total) is higher than that in the karst region. (4) BIOME4 and LPJ

  7. A study on development of a rule based expert system for steam generator life extension

    International Nuclear Information System (INIS)

    The need to predict the integrity of steam generator (SG) tubes and the environmental conditions that affect their integrity is growing, in order to secure nuclear power plant (NPP) safety and enhance plant availability. To achieve these objectives it is important to diagnose the integrity of the SG tubes. An expert system called FEMODES (failure mode diagnosis expert system) has been developed for diagnosis of such tube degradation phenomena as denting, intergranular attack (IGA) and stress corrosion cracking (SCC) on the secondary side of the SG. With FEMODES it is possible to estimate the likelihood of SG tube degradation and to diagnose the environmental conditions that influence such degradation. The certainty factor theory (CFT) method and a rule-based backward-chaining inference strategy are used in FEMODES. The information required for diagnosis was acquired from SG tube degradation experiences of two local reference plants, some overseas plants, and technical reports/research papers on such tube degradation. Overall results estimated with FEMODES are in reasonable agreement with actual SG tube degradation. Some discrepancy observed in several estimated values of SG tube degradation appears to be due to insufficient heuristic knowledge in the knowledge base of FEMODES
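    Certainty factor theory systems of this kind typically build on the MYCIN-style combination rules, which can be sketched as follows (a generic CFT illustration, not FEMODES's actual rule base).

```python
def combine_cf(cf1, cf2):
    """MYCIN-style combination of two certainty factors in [-1, 1]
    supporting (or contradicting) the same hypothesis."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)          # both confirming
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)          # both disconfirming
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))  # conflicting

def rule_cf(premise_cf, rule_strength):
    """Conclusion CF of one rule: uncertain evidence scales the
    rule's own strength; negative evidence does not fire the rule."""
    return max(premise_cf, 0.0) * rule_strength
```

    Two rules each concluding "SCC likely" with CFs 0.6 and 0.5, for example, combine to 0.8 rather than adding past certainty.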

  8. The Numerical Simulation of the Crack Elastoplastic Extension Based on the Extended Finite Element Method

    Directory of Open Access Journals (Sweden)

    Xia Xiaozhou

    2013-01-01

    Full Text Available In the frame of the extended finite element method, an exponential discontinuity function is introduced to reflect the discontinuous character of the crack, and a crack tip enrichment function composed of a triangular basis function and a linear polar radius function is adopted to describe the displacement field around the elastoplastic crack tip. The linear polar radius form is chosen to reduce the singularity induced by the plastic yield zone at the crack tip, and the triangular basis form describes how the displacement varies with the polar angle at the crack tip. Based on the displacement model containing the above enrichment functions, the incremental iterative form of the elastoplastic extended finite element method is deduced from the virtual work principle. For non-uniformly hardening materials such as concrete, in order to avoid the asymmetry of the stiffness matrix induced by non-associated plastic flow, a plastic flow rule containing a cross item based on the least energy dissipation principle is adopted. Finally, numerical examples show the validity of the elastoplastic X-FEM constructed in this paper.

  9. New Extensions of Pairing-based Signatures into Universal (Multi) Designated Verifier Signatures

    CERN Document Server

    Vergnaud, Damien

    2008-01-01

    The concept of universal designated verifier signatures was introduced by Steinfeld, Bull, Wang and Pieprzyk at Asiacrypt 2003. These signatures can be used as standard publicly verifiable digital signatures but have an additional functionality which allows any holder of a signature to designate the signature to any desired verifier. This designated verifier can check that the message was indeed signed, but is unable to convince anyone else of this fact. We propose new efficient constructions for pairing-based short signatures. Our first scheme is based on Boneh-Boyen signatures and its security can be analyzed in the standard security model. We prove its resistance to forgery assuming the hardness of the so-called strong Diffie-Hellman problem, under the knowledge-of-exponent assumption. The second scheme is compatible with the Boneh-Lynn-Shacham signatures and is proven unforgeable, in the random oracle model, under the assumption that the computational bilinear Diffie-Hellman problem is intractable. Both s...

  10. An Architecture for Intrusion Detection Based on an Extension of the Method of Remaining Elements

    Directory of Open Access Journals (Sweden)

    P. Velarde-Alvarado

    2010-08-01

    Full Text Available This paper introduces an Anomaly-based Intrusion Detection architecture based on behavioral traffic profiles created by using our enhanced version of the Method of Remaining Elements (MRE). This enhanced version includes: a redefinition of the exposure threshold through the entropy and cardinality of residual sequences, a dual characterization for two types of traffic slots, the introduction of the Anomaly Level Exposure (ALE) that gives a better quantification of anomalies for a given traffic slot and r-feature, an alternative support that extends its detection capabilities, and a new procedure to obtain the exposure threshold through an analysis of outliers on the training dataset. Regarding the original MRE, we incorporate the refinements outlined, resulting in a reliable method which gives improved sensitivity to the detection of a broader range of attacks. The experiments were conducted on the MIT-DARPA dataset and also on an academic LAN by implementing real attacks. The results show that the proposed architecture is effective in early detection of intrusions, as well as some kinds of attacks designed to bypass detection measures.
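    The entropy of a residual sequence, central to MRE-style profiling, is plain Shannon entropy over the values seen in a traffic slot. A minimal sketch with hypothetical traffic slots (not the paper's dataset): a port-scan slot touches many distinct destination ports and therefore shows much higher entropy than a normal slot.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (in bits) of an r-feature's values within a
    traffic slot, e.g. the destination ports observed."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# Hypothetical slots: mostly-web traffic vs. a port scan
normal_slot = [80] * 90 + [443] * 10
scan_slot = list(range(1000, 1100))
```

    An exposure threshold learned from training slots (e.g. an outlier bound on these entropies) then flags anomalous slots.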

  11. Consumer Decision-Making Styles Extension to Trust-Based Product Comparison Site Usage Model

    Directory of Open Access Journals (Sweden)

    Radoslaw Macik

    2016-09-01

    Full Text Available The paper describes an implementation of the extended consumer decision-making styles concept in explaining consumer choices made in a product comparison site environment, in the context of a trust-based information technology acceptance model. Previous research proved that the trust-based acceptance model is useful in explaining purchase intention and anticipated satisfaction in the product comparison site environment, as an example of online decision shopping aids. Trust in such aids is important in explaining their usage by consumers. The connections between consumer decision-making styles, product and seller opinion usage, cognitive and affective trust toward the online product comparison site, as well as choice outcomes (purchase intention and brand choice) are explored through structural equation models using the PLS-SEM approach, on a sample of 461 young consumers. The research confirmed the validity of the research model in explaining product comparison site usage, and some consumer decision-making styles influenced consumers' choices and purchase intention. Product and seller review usage partially mediated the aforementioned relationships.

  12. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Hamsapriya

    2011-12-01

    Full Text Available E-mail is one of the most popular and frequently used ways of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today's Internet, bringing financial damage to companies and annoying individual users. Spam emails invade users' mail boxes without their consent, and they consume network capacity as well as time spent checking and deleting spam mails. The vast majority of Internet users are outspoken in their disdain for spam, although enough of them respond to commercial offers that spam remains a viable source of income to spammers. While most users want to do the right thing to avoid and get rid of spam, they need clear and simple guidelines on how to behave. In spite of all the measures taken to eliminate spam, it has not yet been eradicated. Also, when the countermeasures are over-sensitive, even legitimate emails will be eliminated. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has been centered on the more sophisticated classifier-related issues. Recently, machine learning for spam classification has become an important research issue. This work explores and identifies the use of different learning algorithms for classifying spam messages from e-mail. A comparative analysis among the algorithms is also presented.
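    One of the classic supervised learners compared in such studies is multinomial naive Bayes. A from-scratch sketch with Laplace smoothing on a toy corpus (purely illustrative; real filters use large labeled corpora and richer features):

```python
import math
from collections import Counter

class NaiveBayesFilter:
    """Multinomial naive Bayes text classifier with Laplace smoothing."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = {c: labels.count(c) / len(labels) for c in self.classes}
        self.word_counts = {c: Counter() for c in self.classes}
        for doc, c in zip(docs, labels):
            self.word_counts[c].update(doc.lower().split())
        self.vocab = {w for c in self.classes for w in self.word_counts[c]}

    def predict(self, doc):
        best, best_lp = None, -math.inf
        for c in self.classes:
            total = sum(self.word_counts[c].values())
            lp = math.log(self.prior[c])
            for w in doc.lower().split():
                # Laplace (+1) smoothing avoids zero probabilities
                lp += math.log((self.word_counts[c][w] + 1)
                               / (total + len(self.vocab)))
            if lp > best_lp:
                best, best_lp = c, lp
        return best

nb = NaiveBayesFilter()
nb.fit(["win free money now", "free money offer now",
        "project meeting schedule", "schedule project update"],
       ["spam", "spam", "ham", "ham"])
```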

  13. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2016-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.
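    The core MBIR idea, minimizing a data-fit term plus a prior over the unknown image, can be sketched on a toy linear model. This gradient-descent version with a simple quadratic (Tikhonov) prior is a drastic simplification of the paper's acoustic forward model, meant only to show the iterative structure.

```python
def mbir(A, y, lam=0.1, steps=500, lr=0.05):
    """Toy model-based iterative reconstruction: minimize
    ||A x - y||^2 + lam * ||x||^2 by gradient descent, where the
    matrix A (list of rows) plays the role of the forward model."""
    n = len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        # residual r = A x - y  (forward-project current estimate)
        r = [sum(a[j] * x[j] for j in range(n)) - yi
             for a, yi in zip(A, y)]
        # gradient = 2 A^T r + 2 lam x  (data fit + prior)
        g = [2 * sum(A[i][j] * r[i] for i in range(len(A)))
             + 2 * lam * x[j] for j in range(n)]
        x = [xj - lr * gj for xj, gj in zip(x, g)]
    return x

# With A = I the minimizer has the closed form x = y / (1 + lam)
x = mbir([[1.0, 0.0], [0.0, 1.0]], [1.0, 2.0], lam=0.1)
```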

  14. Development of acoustic model-based iterative reconstruction technique for thick-concrete imaging

    Science.gov (United States)

    Almansouri, Hani; Clayton, Dwight; Kisner, Roger; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2016-02-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.

  15. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2015-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.

  16. Bio-inspired computational techniques based on advanced condition monitoring

    Institute of Scientific and Technical Information of China (English)

    Su Liangcheng; He Shan; Li Xiaoli; Li Xinglin

    2011-01-01

    The application of bio-inspired computational techniques to the field of condition monitoring is addressed. First, the bio-inspired computational techniques are briefly described, and the advantages and disadvantages of these computational methods are made clear. Then, the roles of condition monitoring in predictive maintenance and failure prediction, and the development trends of condition monitoring, are discussed. Finally, a case study on the condition monitoring of a grinding machine is described, which shows the application of bio-inspired computational techniques to a practical condition monitoring system.

  17. An Agent-based Extensible Climate Control System for Sustainable Greenhouse Production

    DEFF Research Database (Denmark)

    Sørensen, Jan Corfixen; Jørgensen, Bo Nørregaard; Klein, Mark;

    2011-01-01

    The slow adoption pace of new control strategies for sustainable greenhouse climate control by industrial growers is mainly due to the complexity of identifying and resolving potentially conflicting climate control requirements. In this paper, we present a multi-agent-based climate control system. Negotiation is done using a novel multi-issue negotiation protocol that uses a generic algorithm to find an optimized solution within the search space. The multi-agent control system has been empirically evaluated in an ornamental floriculture research facility in Denmark. The evaluation showed that it is realistic to implement the climate control requirements as individual agents, thereby opening greenhouse climate control systems to the integration of independently produced control strategies.

  18. Size measurement of gold and silver nanostructures based on their extinction spectrum: limitations and extensions

    Directory of Open Access Journals (Sweden)

    A A Ashkarran

    2013-09-01

    Full Text Available This paper reports on the physical principles of, and the relations between, the extinction cross section and geometrical properties of silver and gold nanostructures. We introduce simple relations for determining geometrical properties of silver and gold nanospheres based on the position of their plasmonic peak. We also applied, investigated and compared the accuracy of these relations against other published works, in order to clarify the effects of shape, size distribution and the refractive index of the particles' embedding medium. Finally, we extended the equations to non-spherical particles and investigated their accuracy. We found that modified forms of the equations may lead to more exact results for non-spherical metal particles, but for better results the modified equations should depend on the shape and size distribution of the particles. These equations do not appear applicable to particles with corners sharper than a cube's corners, i.e. nanostructures with spatial angles less than π/2 sr.

  19. AN EXTENSION OF TOPSIS FOR FUZZY MCDM BASED ON VAGUE SET THEORY

    Institute of Scientific and Technical Information of China (English)

    Jue WANG; San-Yang LIU; Jie ZHANG

    2005-01-01

    This paper extends the TOPSIS method to fuzzy MCDM based on vague set theory, where the characteristics of the alternatives are represented by vague sets. A novel score function is proposed in order to determine the vague positive-ideal solution (VPIS) and vague negative-ideal solution (VNIS). We present a weighted difference index to calculate the distance between vague values, by means of which the distances of alternatives to the VPIS and VNIS can be calculated. Finally, the relative closeness values of the alternatives to the positive-ideal solution are ranked to determine the best alternative. An example illustrates the procedure of the proposed method at the end of this paper.
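    For reference, classical crisp TOPSIS proceeds by normalizing, weighting, finding the ideal and anti-ideal solutions, and ranking by relative closeness. The sketch below is the standard crisp method, not the paper's vague-set extension, which replaces the crisp scores and distances with vague-value counterparts.

```python
import math

def topsis(matrix, weights, benefit):
    """Classical crisp TOPSIS. matrix: alternatives x criteria;
    benefit[j] is True for benefit criteria, False for cost criteria.
    Returns the relative closeness of each alternative to the ideal."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
             for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    # Positive-ideal and negative-ideal solutions per criterion
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        dp = math.dist(row, ideal)    # distance to positive ideal
        dn = math.dist(row, anti)     # distance to negative ideal
        scores.append(dn / (dp + dn))
    return scores

scores = topsis([[5.0, 5.0], [1.0, 1.0], [3.0, 2.0]],
                [0.5, 0.5], [True, True])
```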

  20. Spatial extension of excitons in triphenylene based polymers given by range-separated functionals

    CERN Document Server

    Kociper, B

    2013-01-01

    Motivated by an experiment in which the singlet-triplet gap in triphenylene-based copolymers was effectively tuned, we used time-dependent density functional theory (TDDFT) to reproduce the main results. By means of conventional and long-range corrected exchange-correlation functionals, the luminescence energies and the exciton localization were calculated for a triphenylene homopolymer and several different copolymers. The phosphorescence energy of the pure triphenylene chain is predicted accurately by the optimally tuned long-range corrected LC-PBE functional and slightly less accurately by the global hybrid B3LYP. However, the experimentally observed fixed phosphorescence energy could not be reproduced, because the localization pattern differs from expectations: instead of localizing on the triphenylene moiety, which is present in all types of polymers, the triplet state localizes on the different bridging units in the TDDFT calculations. This leads to different triplet emission energies for...

  1. Research Extension and Education Programs on Bio-based Energy Technologies and Products

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Sam [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station; Harper, David [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station; Womac, Al [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station

    2010-03-02

    The overall objectives of this project were to provide enhanced educational resources for the general public, educational and development opportunities for University faculty in the Southeast region, and enhanced research knowledge concerning biomass preprocessing and deconstruction. All of these efforts combine to create a research and education program that enhances the biomass-based industries of the United States. This work was broken into five primary objective areas: • Task A - Technical research in the area of biomass preprocessing, analysis, and evaluation. • Tasks B&C - Technical research in the areas of Fluidized Beds for the Chemical Modification of Lignocellulosic Biomass and Biomass Deconstruction and Evaluation. • Task D - Analyses for the non-scientific community that provide a comprehensive analysis of the current state of biomass supply, demand, technologies, markets and policies; identify a set of feasible alternative paths for biomass industry development; and quantify the impacts associated with each alternative path. • Task E - Efforts to build research capacity and develop partnerships through faculty fellowships with DOE national labs. The research and education programs conducted through this grant have led to three primary results. They include: • A better knowledge base related to and understanding of biomass deconstruction, through both mechanical size reduction and chemical processing • A better source of information related to biomass, bioenergy, and bioproducts for researchers and general public users through the BioWeb system • Stronger research ties between land-grant universities and DOE National Labs through the faculty fellowship program. In addition to the scientific knowledge and resources developed, funding through this program produced a minimum of eleven (11) scientific publications and contributed to the research behind at least one patent.

  2. The Statistical methods of Pixel-Based Image Fusion Techniques

    CERN Document Server

    Al-Wassai, Firouz Abdullah; Al-Zaky, Ali A

    2011-01-01

    There are many image fusion methods that can be used to produce high-resolution multispectral images from a high-resolution panchromatic (PAN) image and low-resolution multispectral (MS) remotely sensed images. This paper undertakes a study of image fusion with different statistical techniques: Local Mean Matching (LMM), Local Mean and Variance Matching (LMVM), Regression Variable Substitution (RVS), and Local Correlation Modeling (LCM), which are compared with one another so as to choose the best technique to apply to multi-resolution satellite images. The paper also concentrates on analytical techniques for evaluating the quality of image fusion (F) using various measures, including Standard Deviation (SD), Entropy (En), Correlation Coefficient (CC), Signal-to-Noise Ratio (SNR), Normalized Root Mean Square Error (NRMSE) and Deviation Index (DI), to estimate the quality and degree of information improvement of a fused image quantitatively...
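
The evaluation measures named above are standard statistics. As a rough illustration, a pure-Python sketch of three of them (SD, CC, and NRMSE), assuming 8-bit bands flattened to lists and NRMSE normalized by the 255-level range, might look like:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def std_dev(xs):
    # SD: population standard deviation of a band
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def correlation(a, b):
    # CC: Pearson correlation between reference and fused bands
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den

def nrmse(ref, fused, levels=255):
    # NRMSE: root mean square error normalised by the grey-level range
    mse = mean([(r - f) ** 2 for r, f in zip(ref, fused)])
    return math.sqrt(mse) / levels

ref = [10, 20, 30, 40, 50, 60]    # reference band, flattened (toy data)
fused = [12, 19, 33, 38, 52, 61]  # fused band, flattened (toy data)
cc = correlation(ref, fused)
```

The same functions apply per band; real evaluations run them over whole images rather than toy vectors.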

  3. FPGA IMPLEMENTATION OF RSEPD TECHNIQUE BASED IMPULSE NOISE REMOVAL

    Directory of Open Access Journals (Sweden)

    M. Rajadurai

    2013-05-01

    Full Text Available In the process of signal transmission and acquisition, image signals may be corrupted by impulse noise. Impulse noises are short-duration noises that degrade an image and are randomly distributed over it. An efficient FPGA implementation for removing impulse noise from an image is presented in this paper. Existing techniques use a standard median filter, which changes the pixel values of both noise-free and noisy pixels, so the image may be blurred. To avoid changes to noise-free pixels, efficient FPGA implementations of a Simple Edge Preserved De-noising technique (SEPD) and a Reduced Simple Edge Preserved De-noising technique (RSEPD) are presented in this paper. In this technique, separate noise detection and noise removal operations are performed. The VLSI design gives better image quality: for an image with 10% added noise, the PSNR of the de-noised image is 31.68.
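
The core idea of detect-then-correct de-noising — leave noise-free pixels untouched and replace only detected impulses — can be sketched in software. The actual SEPD/RSEPD designs are FPGA/VLSI circuits; this toy model simply assumes salt-and-pepper extremes (0 and 255) mark noisy pixels:

```python
def denoise_impulse(img, lo=0, hi=255):
    """Replace only pixels detected as impulse noise (extreme values)
    with the median of their non-noisy 4-neighbours; noise-free
    pixels are left untouched, preserving edges."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] not in (lo, hi):
                continue  # noise-free pixel: keep as-is
            neigh = [img[ny][nx]
                     for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                     if 0 <= ny < h and 0 <= nx < w and img[ny][nx] not in (lo, hi)]
            if neigh:
                neigh.sort()
                out[y][x] = neigh[len(neigh) // 2]  # median of clean neighbours
    return out
```

In hardware this per-pixel decision maps naturally onto a small comparator and sorting network per clock cycle.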

  4. A New Image Steganography Based On First Component Alteration Technique

    Directory of Open Access Journals (Sweden)

    Amanpreet Kaur

    2009-12-01

    Full Text Available In this paper, a new image steganography scheme is proposed which is a kind of spatial domain technique. In order to hide secret data in a cover-image, the first component alteration technique is used. Techniques used so far focus only on two or four bits of a pixel in an image (at most five bits at the edge of an image), which results in a lower peak signal-to-noise ratio and a higher root mean square error. In this technique, the 8 bits of the blue component of each pixel are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To prove this, several experiments are performed, and the experimental results are compared with related previous works. Keywords: image; mean square error; peak signal-to-noise ratio; steganography.
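
A minimal software sketch of the embedding described — writing a secret byte into the full blue component of successive pixels. The pixel layout and function names are illustrative, not taken from the paper:

```python
def embed(pixels, secret):
    """Hide secret bytes by replacing the blue component of successive
    RGB pixels (a toy sketch of component-alteration embedding)."""
    if len(secret) > len(pixels):
        raise ValueError("cover image too small for the secret data")
    stego = list(pixels)
    for i, byte in enumerate(secret):
        r, g, _b = stego[i]
        stego[i] = (r, g, byte)  # all 8 bits of blue carry secret data
    return stego

def extract(stego, n):
    """Read n secret bytes back out of the blue components."""
    return bytes(stego[i][2] for i in range(n))
```

Red and green components are untouched, which is what keeps the distortion (and hence RMSE) confined to one channel.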

  5. A New Image Steganography Based On First Component Alteration Technique

    CERN Document Server

    Kaur, Amanpreet; Sikka, Geeta

    2010-01-01

    In this paper, a new image steganography scheme is proposed which is a kind of spatial domain technique. In order to hide secret data in a cover-image, the first component alteration technique is used. Techniques used so far focus only on two or four bits of a pixel in an image (at most five bits at the edge of an image), which results in a lower peak signal-to-noise ratio and a higher root mean square error. In this technique, the 8 bits of the blue component of each pixel are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To prove this, several experiments are performed, and the experimental results are compared with related previous works.

  6. A local technique based on vectorized surfaces for craniofacial reconstruction.

    Science.gov (United States)

    Tilotta, Françoise M; Glaunès, Joan A; Richard, Frédéric J P; Rozenholc, Yves

    2010-07-15

    In this paper, we focus on the automation of facial reconstruction. Since they consider the whole head as the object of interest, usual reconstruction techniques are global and involve a large number of parameters to be estimated. We present a local technique which aims at reaching a good trade-off between bias and variance, following the paradigm of non-parametric statistics. The estimation is localized on patches delimited by surface geodesics between anatomical points of the skull. The technique relies on a continuous representation of the individual surfaces embedded in the vectorial space of extended normal vector fields, which allows computing deformations and averages of surfaces; it consists of estimating the soft-tissue surface over patches. Using a homogeneous database described in [31], we obtain results on the chin and nasal regions with an average error below 1 mm, outperforming global reconstruction techniques.

  7. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically. The effici...

  8. Evaluation of a school-based diabetes education intervention, an extension of Program ENERGY

    Science.gov (United States)

    Conner, Matthew David

    Background: The prevalence of both obesity and type 2 diabetes in the United States has increased over the past two decades and rates remain high. The latest data from the National Center for Health Statistics estimates that 36% of adults and 17% of children and adolescents in the US are obese (CDC Adult Obesity, CDC Childhood Obesity). Being overweight or obese greatly increases one's risk of developing several chronic diseases, such as type 2 diabetes. Approximately 8% of adults in the US have diabetes, type 2 diabetes accounts for 90-95% of these cases. Type 2 diabetes in children and adolescents is still rare, however clinical reports suggest an increase in the frequency of diagnosis (CDC Diabetes Fact Sheet, 2011). Results from the Diabetes Prevention Program show that the incidence of type 2 diabetes can be reduced through the adoption of a healthier lifestyle among high-risk individuals (DPP, 2002). Objectives: This classroom-based intervention included scientific coverage of energy balance, diabetes, diabetes prevention strategies, and diabetes management. Coverage of diabetes management topics were included in lesson content to further the students' understanding of the disease. Measurable short-term goals of the intervention included increases in: general diabetes knowledge, diabetes management knowledge, and awareness of type 2 diabetes prevention strategies. Methods: A total of 66 sixth grade students at Tavelli Elementary School in Fort Collins, CO completed the intervention. The program consisted of nine classroom-based lessons; students participated in one lesson every two weeks. The lessons were delivered from November of 2005 to May of 2006. Each bi-weekly lesson included a presentation and interactive group activities. Participants completed two diabetes knowledge questionnaires at baseline and post intervention. 
A diabetes survey developed by Program ENERGY measured general diabetes knowledge and awareness of type 2 diabetes prevention strategies

  9. Hiding of Speech based on Chaotic Steganography and Cryptography Techniques

    OpenAIRE

    Abbas Salman Hameed

    2015-01-01

    Steganography is the technique of embedding secret information into cover media, like image, video, audio and text, so that only the sender and an authorized recipient who holds a key can detect the presence of the secret information. In this paper, steganography and cryptography techniques for speech are presented using chaos. Fractional-order Lorenz and Chua systems, which provide an expanded key space, are used to encrypt the speech message. The large key space, in addition to all ...
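
A chaotic keystream cipher of the kind described can be illustrated with a much simpler one-dimensional map. Here the logistic map stands in for the paper's fractional-order Lorenz/Chua systems, and the parameters are arbitrary, so this is a structural sketch only:

```python
def logistic_keystream(x0, r, n):
    """Generate n keystream bytes from the logistic map x -> r*x*(1-x).
    (A stand-in for the fractional-order Lorenz/Chua generators; the
    initial condition x0 and parameter r act as the secret key.)"""
    x, stream = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)
    return stream

def xor_cipher(data, x0=0.3141, r=3.99):
    """Encrypt or decrypt bytes by XOR with the chaotic keystream;
    applying the function twice with the same key recovers the input."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))
```

Because XOR is its own inverse, the same call both encrypts and decrypts, provided sender and recipient share (x0, r).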

  10. SNMP Based Network Optimization Technique Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    M. Mohamed Surputheen

    2012-03-01

    Full Text Available Genetic Algorithms (GAs) have innumerable applications through optimization techniques, and network optimization is one of them. SNMP (Simple Network Management Protocol) is used as the basic network protocol for monitoring network activity and the health of systems. This paper deals with adding intelligence to various aspects of SNMP through optimization techniques derived from genetic algorithms, which enhances the performance of SNMP processes such as routing.
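
The abstract does not give the paper's GA encoding, so as a generic illustration of the optimization machinery involved, here is a minimal GA with tournament selection, one-point crossover, and bit-flip mutation, applied to the toy OneMax problem rather than to SNMP routing:

```python
import random

def genetic_optimize(fitness, length=16, pop_size=30, generations=60,
                     mutation=0.02, seed=1):
    """Minimal generational GA: binary tournament selection, one-point
    crossover, per-bit mutation. A generic sketch, not the paper's
    SNMP-specific chromosome encoding."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)           # binary tournament
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (rng.random() < mutation) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = genetic_optimize(sum)  # OneMax: maximize the number of 1-bits
```

For a routing application, the bit string would instead encode a candidate route and the fitness function a cost derived from SNMP measurements.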

  11. EDM COLLABORATIVE MANUFACTURING SYSTEM BASED ON MULTI-AGENT TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    Zhao Wansheng; Zhao Jinzhi; Song Yinghui; Yang Xiaodong

    2003-01-01

    A framework for building an EDM collaborative manufacturing system using multi-agent technology, supporting physically distributed, enterprise-wide, heterogeneous intelligent manufacturing over the Internet, is proposed. Expert system theory is introduced. Design, manufacturing and technological knowledge are shared using artificial intelligence and web techniques by the EDM-CADagent, EDM-CAMagent and EDM-CAPPagent. System structure, design process, network conditions, realization methods and other key techniques are discussed. Instances are also introduced to verify feasibility.

  12. Proposing a wiki-based technique for collaborative essay writing

    OpenAIRE

    Mabel Ortiz Navarrete; Anita Ferreira Cabrera

    2014-01-01

    This paper aims at proposing a technique for students learning English as a foreign language when they collaboratively write an argumentative essay in a wiki environment. A wiki environment and collaborative work play an important role within the academic writing task. Nevertheless, an appropriate and systematic work assignment is required in order to make use of both. In this paper the proposed technique when writing a collaborative essay mainly attempts to provide the most effective way to ...

  13. Towards a theoretically based Group Facilitation Technique for Project Teams

    OpenAIRE

    Witte, E.H.; Engelhardt, Gabriele

    2004-01-01

    A theoretical framework for developing the group facilitation technique PROMOD is presented here. The efficiency of this technique in improving group decision quality is supported by the results of three experimental studies involving different kinds of problem solving tasks. The author points towards the importance of integrating theoretical assumptions, theory testing and basic research with empirical application. Such a compelling strategy can lead to new insights in group performance dyna...

  14. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    Science.gov (United States)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy has to change because the current imitational gravitational frontal isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water - a global deficit of up to 4-15 times - and adverse effects on soils and landscapes. Current methods of irrigation do not control the spread of water throughout the soil continuum. Preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost into the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for Sustainable Natural Resources Management. The new paradigm of irrigation is based on an intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. An individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder is positioned in the soil at a depth of 10 to 30 cm, with a diameter of 1-2 cm. Within 5-10 min after injection, the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The soil watering cylinder is positioned in the soil profile at a depth of 5-50 cm, with a diameter of 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection, the soil structure in the cylinder restores quickly because there is no compression from the stable adjoining volume of soil, and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa.
At this potential

  15. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings

    Directory of Open Access Journals (Sweden)

    Hemant Ghayvat

    2015-05-01

    Full Text Available Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home environment and performs his or her daily routine in order to live a long and healthy life. We initiate the research with the development of the smart home approach and implement it in different home conditions (different houses to monitor the activity of an inhabitant for wellness detection. Additionally, our research extends the smart home system to smart buildings and models the design issues related to the smart building environment; these design issues are linked with system performance and reliability. This research paper also discusses and illustrates the possible mitigation to handle the ISM band interference and attenuation losses without compromising optimum system performance.

  16. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings.

    Science.gov (United States)

    Ghayvat, Hemant; Mukhopadhyay, Subhas; Gui, Xiang; Suryadevara, Nagender

    2015-05-04

    Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home environment and performs his or her daily routine in order to live a long and healthy life. We initiate the research with the development of the smart home approach and implement it in different home conditions (different houses) to monitor the activity of an inhabitant for wellness detection. Additionally, our research extends the smart home system to smart buildings and models the design issues related to the smart building environment; these design issues are linked with system performance and reliability. This research paper also discusses and illustrates the possible mitigation to handle the ISM band interference and attenuation losses without compromising optimum system performance.

  17. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    Science.gov (United States)

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them with a retention apparatus. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique, with the least retention seen in the conventional molding technique. PMID:27382542

  18. Assessing the Utility of the Nominal Group Technique as a Consensus-Building Tool in Extension-Led Avian Influenza Response Planning

    Science.gov (United States)

    Kline, Terence R.

    2013-01-01

    The intent of the project described was to apply the Nominal Group Technique (NGT) to achieve a consensus on Avian Influenza (AI) planning in Northeastern Ohio. The Nominal Group Technique is a process first developed by Delbecq, Van de Ven, and Gustafson (1975) to allow all participants to have an equal say in an open forum setting. A very diverse…

  19. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. The method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. Like all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, so upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as the LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will then not be observed).
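
The LCL calculation used here for validation is commonly approximated with Espy's rule of thumb: roughly 125 m of lift per degree Celsius of dewpoint depression at the surface. A one-line sketch of that approximation (not necessarily the authors' exact formulation):

```python
def lcl_height_m(temp_c, dewpoint_c):
    """Approximate lifted condensation level above the surface using
    Espy's rule of thumb: ~125 m per degree C of dewpoint depression.
    Valid only as a rough estimate for boundary layer clouds."""
    return 125.0 * (temp_c - dewpoint_c)
```

For example, a surface temperature of 25 °C with a dewpoint of 17 °C puts the estimated cloud base near 1000 m, which is the kind of figure the thermal-imaging retrieval would be compared against.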

  20. An Empirical Comparative Study of Checklist-based and Ad Hoc Code Reading Techniques in a Distributed Groupware Environment

    Directory of Open Access Journals (Sweden)

    Adenike O. Osofisan

    2009-09-01

    Full Text Available Software inspection is a necessary and important tool for software quality assurance. Since it was introduced by Fagan at IBM in 1976, arguments have existed as to which method should be adopted to carry out the exercise, whether it should be paper-based or tool-based, and what reading technique should be used on the inspection document. Extensive work has been done to determine the effectiveness of reviewers in a paper-based environment when using ad hoc and checklist reading techniques. In this work, we take software inspection research further by examining whether there is any significant difference in the defect detection effectiveness of reviewers when they use either ad hoc or checklist reading techniques in a distributed groupware environment. Twenty final-year undergraduate students of computer science, divided into ad hoc and checklist reviewer groups of ten members each, were employed to inspect a medium-sized Java code synchronously on groupware deployed on the Internet. The data obtained were subjected to tests of hypotheses using the independent t-test and correlation coefficients. Results from the study indicate that there are no significant differences in the defect detection effectiveness, effort in terms of time taken in minutes, and false positives reported by reviewers using either ad hoc or checklist-based reading techniques in the distributed groupware environment studied. Key words: Software Inspection, Ad hoc, Checklist, groupware.

  1. RP-based Abrading Technique for Graphite EDM Electrode

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Traditional processes for machining mold cavities are lengthy and costly. EDM (electro-discharge machining) is the most commonly used technique to obtain complex mold cavities. However, some electrodes are difficult to fabricate because of their complexity. By applying RP (rapid prototyping) technology to fabricate an abrading tool that is used to abrade graphite EDM electrodes, the cost and cycle time can be greatly reduced. The paper describes the work being conducted in this area by the authors. This technique will find widespread application in rapid steel mold manufacturing.

  2. FUZZY ENTROPY BASED OPTIMAL THRESHOLDING TECHNIQUE FOR IMAGE ENHANCEMENT

    Directory of Open Access Journals (Sweden)

    U.Sesadri

    2015-06-01

    Full Text Available Soft computing is likely to play a progressively important role in many applications, including image enhancement. The paradigm for soft computing is the human mind, and the soft computing critique has been particularly strong with fuzzy logic. Fuzzy logic represents facts as rules for the management of uncertainty. In this paper, the multi-dimensional optimization problem is addressed by discussing optimal thresholding using fuzzy entropy for image enhancement. The technique is compared with bi-level and multi-level thresholding, and optimal threshold values are obtained for different levels of speckle-noisy and low-contrast images. The fuzzy method produced better results compared to the bi-level and multi-level thresholding techniques.
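
For comparison, the classic (non-fuzzy) entropy-based bi-level thresholding of Kapur et al. — choose the threshold that maximizes the summed entropies of the two histogram classes — can be sketched as follows. The paper's fuzzy-entropy formulation differs in how class membership is defined:

```python
import math

def kapur_threshold(hist):
    """Entropy-based bi-level threshold: pick t maximising the sum of
    the Shannon entropies of the below-t and above-t histogram classes
    (the classic crisp baseline the fuzzy method is compared against)."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(1, len(hist)):
        w0 = sum(p[:t])          # probability mass of the dark class
        w1 = 1.0 - w0            # probability mass of the bright class
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(pi / w0 * math.log(pi / w0) for pi in p[:t] if pi > 0)
        h1 = -sum(pi / w1 * math.log(pi / w1) for pi in p[t:] if pi > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t
```

On a clearly bimodal histogram the maximizing threshold falls in the valley between the two modes.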

  3. A Predicate Based Fault Localization Technique Based On Test Case Reduction

    Directory of Open Access Journals (Sweden)

    Rohit Mishra

    2015-08-01

    Full Text Available ABSTRACT In today's world, software testing with statistical fault localization techniques is among the most tedious, expensive, and time-consuming activities. In a faulty program, the contrasting dynamic spectra of program elements estimate the location of the fault. Coincidental correctness can have a negative impact on these techniques, because the fault can also be triggered in a non-failed run, and if so it disturbs the assessment of the fault location. In this paper, coincidental correctness, which is an effective interference, is addressed as a factor in the success of fault localization. We find fault predicates through the distributional overlap of dynamic spectra in failed and non-failed runs, and narrow the area by referencing the inter-class distances of spectra to prune the less suspicious candidates. After that, we apply a coverage-matrix-based reduction approach to reduce the test cases of the program and locate the fault. Finally, empirical results show that our technique outperforms previously existing predicate-based fault localization techniques with test case reduction.
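
Spectrum-based fault localization of this kind scores program elements by contrasting their coverage in failed and passed runs. As a sketch, here is the classic Tarantula suspiciousness formula over a coverage matrix; the paper's own predicate ranking and reduction steps differ in detail:

```python
def tarantula(coverage, outcomes):
    """Spectrum-based suspiciousness (Tarantula): statements covered
    mostly by failing runs rank highest. coverage[i][s] == 1 if test i
    executed statement s; outcomes[i] is True for a failing test."""
    n_fail = sum(outcomes)
    n_pass = len(outcomes) - n_fail
    scores = []
    for s in range(len(coverage[0])):
        fail = sum(cov[s] for cov, o in zip(coverage, outcomes) if o)
        passed = sum(cov[s] for cov, o in zip(coverage, outcomes) if not o)
        fr = fail / n_fail if n_fail else 0.0      # failing coverage ratio
        pr = passed / n_pass if n_pass else 0.0    # passing coverage ratio
        scores.append(fr / (fr + pr) if fr + pr > 0 else 0.0)
    return scores
```

A statement executed only by failing tests scores 1.0; one executed equally by both classes scores 0.5, matching the intuition that contrasting spectra point at the fault.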

  4. Genotyping human ancient mtDNA control and coding region polymorphisms with a multiplexed Single-Base-Extension assay: the singular maternal history of the Tyrolean Iceman

    Directory of Open Access Journals (Sweden)

    Egarter-Vigl Eduard

    2009-06-01

    Full Text Available Abstract Background Progress in the field of human ancient DNA studies has been severely restricted due to the myriad sources of potential contamination, and because of the pronounced difficulty in identifying authentic results. Improving the robustness of human aDNA results is a necessary pre-requisite to vigorously testing hypotheses about human evolution in Europe, including possible admixture with Neanderthals. This study approaches the problem of distinguishing between authentic and contaminating sequences from common European mtDNA haplogroups by applying a multiplexed Single-Base-Extension assay, containing both control and coding region sites, to DNA extracted from the Tyrolean Iceman. Results The multiplex assay developed for this study was able to confirm that the Iceman's mtDNA belongs to a new European mtDNA clade with a very limited distribution amongst modern data sets. Controlled contamination experiments show that the correct results are returned by the multiplex assay even in the presence of substantial amounts of exogenous DNA. The overall level of discrimination achieved by targeting both control and coding region polymorphisms in a single reaction provides a methodology capable of dealing with most cases of homoplasy prevalent in European haplogroups. Conclusion The new genotyping results for the Iceman confirm the extreme fallibility of human aDNA studies in general, even when authenticated by independent replication. The sensitivity and accuracy of the multiplex Single-Base-Extension methodology forms part of an emerging suite of alternative techniques for the accurate retrieval of ancient DNA sequences from both anatomically modern humans and Neanderthals. The contamination of laboratories remains a pressing concern in aDNA studies, both in the pre and post-PCR environments, and the adoption of a forensic style assessment of a priori risks would significantly improve the credibility of results.
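
Conceptually, a single-base-extension call reads out the one ddNTP complementary to the template base at each queried site, with the extension primer ending one base short of the site. A toy model of a multiplexed readout (not the study's assay chemistry or primer design):

```python
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def multiplex_sbe(template, snp_positions):
    """Toy model of a multiplexed single-base-extension assay: for each
    queried position on the template strand, polymerase incorporates
    the single ddNTP complementary to the template base at that site,
    yielding one genotype call per position."""
    return {pos: COMPLEMENT[template[pos]] for pos in snp_positions}
```

A real multiplex assay distinguishes the sites by primer length and the four ddNTPs by fluorescent label; here the dictionary keys simply stand in for that addressing.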

  5. Response Time Comparisons among Four Base Running Starting Techniques in Slow Pitch Softball.

    Science.gov (United States)

    Israel, Richard G.; Brown, Rodney L.

    1981-01-01

    Response times among four starting techniques (cross-over step, jab step, standing sprinter's start, and momentum start) were compared. The results suggest that the momentum start was the fastest starting technique for optimum speed in running bases. (FG)

  6. MR-based field-of-view extension in MR/PET: B0 homogenization using gradient enhancement (HUGE).

    Science.gov (United States)

    Blumhagen, Jan O; Ladebeck, Ralf; Fenchel, Matthias; Scheffler, Klaus

    2013-10-01

    In whole-body MR/PET, the human attenuation correction can be based on the MR data. However, an MR-based field-of-view (FoV) is limited due to physical restrictions such as B0 inhomogeneities and gradient nonlinearities. Therefore, for large patients, the MR image and the attenuation map might be truncated and the attenuation correction might be biased. The aim of this work is to explore extending the MR FoV through B0 homogenization using gradient enhancement in which an optimal readout gradient field is determined to locally compensate B0 inhomogeneities and gradient nonlinearities. A spin-echo-based sequence was developed that computes an optimal gradient for certain regions of interest, for example, the patient's arms. A significant distortion reduction was achieved outside the normal MR-based FoV. This FoV extension was achieved without any hardware modifications. In-plane distortions in a transaxially extended FoV of up to 600 mm were analyzed in phantom studies. In vivo measurements of the patient's arms lying outside the normal specified FoV were compared with and without the use of B0 homogenization using gradient enhancement. In summary, we designed a sequence that provides data for reducing the image distortions due to B0 inhomogeneities and gradient nonlinearities and used the data to extend the MR FoV. PMID:23203976

  7. An Image Inpainting Technique Based on the Fast Marching Method

    NARCIS (Netherlands)

    Telea, Alexandru

    2004-01-01

    Digital inpainting provides a means for reconstruction of small damaged portions of an image. Although the inpainting basics are straightforward, most inpainting techniques published in the literature are complex to understand and implement. We present here a new algorithm for digital inpainting bas

  8. A novel image inpainting technique based on median diffusion

    Indian Academy of Sciences (India)

    Rajkumar L Biradar; Vinayadatt V Kohir

    2013-08-01

    Image inpainting is the technique of filling in missing regions and removing unwanted objects from an image by diffusing pixel information from neighbourhood pixels. Image inpainting techniques have long been in use for applications such as removal of scratches, restoring damaged/missing portions, or removal of objects from images. In this study, we present a simple, yet unexplored, (digital) image inpainting technique using the median filter, one of the most popular nonlinear (order statistics) filters. The median is the maximum likelihood estimate of location for the Laplacian distribution; hence, the proposed algorithm diffuses the median value of pixels from the exterior area into the inner area to be inpainted. The median filter preserves edges, an important property needed to inpaint edges, and the technique is stable. Experimental results show remarkable improvements, and the method works for homogeneous as well as heterogeneous backgrounds. PSNR (quantitative assessment) is used to compare inpainting results.
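
A toy version of the median-diffusion idea — iteratively filling unknown pixels with the median of their already-known neighbours, so values diffuse inward from the region boundary. Images are assumed to be nested lists with a binary mask marking the region to inpaint; this is a sketch of the principle, not the paper's algorithm:

```python
def inpaint_median(img, mask):
    """Iteratively fill masked pixels with the median of their already
    known 4-neighbours, diffusing values inward from the boundary of
    the masked region."""
    h, w = len(img), len(img[0])
    unknown = {(y, x) for y in range(h) for x in range(w) if mask[y][x]}
    out = [row[:] for row in img]
    while unknown:
        filled = []
        for y, x in unknown:
            neigh = [out[ny][nx]
                     for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                     if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in unknown]
            if neigh:  # at least one known neighbour this sweep
                neigh.sort()
                out[y][x] = neigh[len(neigh) // 2]
                filled.append((y, x))
        if not filled:
            break  # region has no known boundary to diffuse from
        unknown -= set(filled)
    return out
```

Because the median (rather than the mean) of the neighbours is propagated, sharp edges at the region boundary are not smeared, which is the property the abstract emphasizes.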

  9. DSPI system based on spatial carrier phase shifting technique

    Science.gov (United States)

    Wang, Yonghong; Li, Junrui; Sun, Jianfei; Yang, Lianxiang

    2013-10-01

    Digital Speckle Pattern Interferometry (DSPI) is an optical method for measuring small displacements and deformations. It allows whole-field, non-contacting measurement of micro deformation. Traditionally, temporal phase shifting has been used for quantitative analysis in DSPI. That technique requires the recording of at least three phase-shifted interferograms, which must be taken sequentially. This can lead to disturbances from thermal and mechanical fluctuations during the required recording time; in addition, fast object deformations cannot be detected. In this paper a DSPI system using the Spatial Carrier Phase Shifting (SCPS) technique is introduced, which extracts quantitative displacement data from only two interferograms. The sensitive direction of this system is determined by the illumination and observation directions. The frequency of the spatial carrier relates to the angle between the reference light and the observation direction. A Fourier transform is adopted in the digital evaluation to filter out the frequencies linked to the deformation of the test object. The phase is obtained from the complex matrix formed by the inverse Fourier transform, and the phase difference and deformation are calculated subsequently. Compared with conventional temporal phase shifting, the technique can measure vibration and transient deformation of the test object. Experimental set-ups and results are presented in this paper, and the experimental results show the effectiveness and advantages of the SCPS technique.
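
The demodulation step — isolating the carrier term of the spectrum and reading off its phase — can be illustrated with synchronous demodulation of a 1-D fringe signal. This is a simplified stand-in for the 2-D FFT filtering the paper describes; the carrier frequency is assumed to be an integer number of cycles over the record:

```python
import cmath
import math

def carrier_phase(signal, carrier_cycles):
    """Recover phi from a fringe pattern I(x) = cos(2*pi*f*x/n + phi)
    by synchronous demodulation: multiply by exp(-i*2*pi*f*x/n) and
    keep the mean, i.e. the DC term of the frequency-shifted spectrum
    (equivalent to FFT filtering around the carrier)."""
    n = len(signal)
    acc = sum(signal[x] * cmath.exp(-2j * math.pi * carrier_cycles * x / n)
              for x in range(n))
    return cmath.phase(acc / n)
```

Subtracting the phases recovered before and after loading the object gives the deformation-induced phase difference, just as in the paper's evaluation chain.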

  10. Chain extension and branching of poly(L-lactic acid) produced by reaction with a DGEBA-based epoxy resin

    Directory of Open Access Journals (Sweden)

    2007-11-01

    Full Text Available Dicarboxylated poly(L-lactic acid) (PLLA) was synthesized by reacting succinic anhydride with an L-lactic acid prepolymer prepared by melt polycondensation. Copolymers of PLLA and an epoxy resin based on diglycidyl ether of bisphenol A (DGEBA) were prepared by chain extension of the dicarboxylated PLLA with DGEBA. Infrared spectra confirmed the formation of the dicarboxylated PLLA and the PLLA/DGEBA copolymer. The influences of reaction temperature, reaction time, and the amount of DGEBA on the molecular weight and gel content of the PLLA/DGEBA copolymer were studied. The viscosity-average molecular weight of the PLLA/DGEBA copolymer reached 87 900 when the reaction temperature, reaction time, and molar ratio of dicarboxylated PLLA to DGEBA were 150°C, 30 min, and 1:1, respectively, while the gel content of the copolymer was almost zero.

  11. A Monte-Carlo based extension of the Meteor Orbit and Trajectory Software (MOTS) for computations of orbital elements

    Science.gov (United States)

    Albin, T.; Koschny, D.; Soja, R.; Srama, R.; Poppe, B.

    2016-01-01

    The Canary Islands Long-Baseline Observatory (CILBO) is a double-station meteor camera system (Koschny et al., 2013; Koschny et al., 2014) that consists of 5 cameras. The two cameras considered in this report, ICC7 and ICC9, are installed on Tenerife and La Palma. They point to the same atmospheric volume between both islands, allowing stereoscopic observation of meteors. Since its installation in 2011 and the start of operation in 2012, CILBO has detected over 15000 simultaneously observed meteors. Koschny and Diaz (2002) developed the Meteor Orbit and Trajectory Software (MOTS) to compute the trajectory of such meteors. The software uses the astrometric data from the detection software MetRec (Molau, 1998) and determines the trajectory in geodetic coordinates. This work presents a Monte-Carlo based extension of the MOTS code to compute the orbital elements of meteors simultaneously detected by CILBO.
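
    Stripped of the meteor-trajectory geometry, the Monte-Carlo idea is to resample the astrometric inputs within their error bars and push each draw through the solver, giving an empirical distribution for each derived orbital quantity. The sketch below uses a hypothetical stand-in function for the MOTS solver, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# measured apparent direction (deg) with an assumed 1-sigma astrometric error
az_obs, alt_obs, sigma = 110.0, 35.0, 0.05

def orbit_quantity(az, alt):
    """Hypothetical stand-in for the MOTS trajectory/orbit solver, which maps
    astrometric angles to an orbital quantity; the real solver is geometric."""
    return 40.0 + 0.1 * np.sin(np.radians(az)) * np.cos(np.radians(alt))

# Monte-Carlo extension: perturb the astrometry N times and re-solve each draw
n = 10000
draws = orbit_quantity(az_obs + sigma * rng.standard_normal(n),
                       alt_obs + sigma * rng.standard_normal(n))
estimate, uncertainty = draws.mean(), draws.std()
```

    The spread of `draws` directly yields the uncertainty of the derived orbital element without linearizing the solver.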

  12. The Assessment of Comprehensive Vulnerability of Chemical Industrial Park Based on Entropy Method and Matter-element Extension Model

    Directory of Open Access Journals (Sweden)

    Yan Jingyi

    2016-01-01

    Full Text Available This paper focuses on the connotative meaning, evaluation methods, and models for chemical industrial parks, based on an in-depth analysis of relevant research results in China and abroad. It summarizes the features of menacing vulnerability and structural vulnerability, and identifies detailed influence factors such as personnel vulnerability, infrastructural vulnerability, environmental vulnerability, and the vulnerability of safety management failure. Using a vulnerability scoping diagram, 21 evaluation indexes and an index system for the vulnerability evaluation of chemical industrial parks are established. Comprehensive weights are calculated with the entropy method and combined with a matter-element extension model to make the quantitative evaluation, which is then successfully applied to a chemical industrial park. This method provides new ideas and ways for enhancing the overall safety of chemical industrial parks.
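
    The entropy method mentioned above assigns larger weights to indexes whose values differ more across the evaluated objects. Below is a sketch of the standard entropy-weight calculation; the paper's 21-index data are not reproduced, so the decision matrix is invented:

```python
import numpy as np

def entropy_weights(X):
    """Entropy-method weights for a decision matrix X (rows = parks, cols = indexes)."""
    P = X / X.sum(axis=0)                      # normalise each index column
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)  # treat 0*log(0) as 0
    e = -(P * logs).sum(axis=0) / np.log(n)    # entropy of each index in [0, 1]
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()                         # weights sum to 1

# three hypothetical parks scored on four vulnerability indexes
X = np.array([[0.6, 0.2, 0.8, 0.4],
              [0.5, 0.9, 0.7, 0.4],
              [0.4, 0.1, 0.9, 0.4]])
w = entropy_weights(X)
```

    An index on which all parks score identically (the last column) carries maximum entropy and therefore zero weight, while the most discriminating index (the second column) receives the largest weight.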

  13. Lepton mass and mixing in a simple extension of the Standard Model based on T7 flavor symmetry

    CERN Document Server

    Vien, V V

    2016-01-01

    A simple Standard Model extension based on $T_7$ flavor symmetry which accommodates lepton mass and mixing with non-zero $\\theta_{13}$ and CP violation phase is proposed. At tree level, the realistic lepton mass and mixing pattern is derived through spontaneous symmetry breaking by just one vacuum expectation value ($v$), the same as in the Standard Model. Neutrinos get small masses from one $SU(2)_L$ doublet and two $SU(2)_L$ singlets, one transforming as $\\underline{1}$ and the other two as $\\underline{3}$ and $\\underline{3}^*$ under $T_7$, respectively. The model also gives a remarkable prediction of the Dirac CP violation phase $\\delta_{CP}=172.598^\\circ$ in both the normal and inverted hierarchies, which is still missing in the neutrino mixing matrix.

  14. Lagrangian study of surface transport in the Kuroshio Extension area based on simulation of propagation of Fukushima-derived radionuclides

    CERN Document Server

    Prants, S V; Uleysky, M Yu

    2013-01-01

    A Lagrangian approach is applied to study near-surface large-scale transport in the Kuroshio Extension area, using a simulation with synthetic particles advected by the AVISO altimetric velocity field. A material-line technique is applied to find the origin of water masses in cold-core cyclonic rings pinched off from the jet in summer 2011. Tracking and Lagrangian maps provide evidence of cross-jet transport. Fukushima-derived caesium isotopes are used as Lagrangian tracers to study transport and mixing in the area a few months after the March 2011 tsunami that caused heavy damage to the Fukushima nuclear power plant (FNPP). Tracking maps are computed to trace the origin of water parcels with measured levels of Cs-134 and Cs-137 concentration collected in two R/V cruises in June and July 2011 in a large area of the Northwest Pacific. It is shown that Lagrangian simulation is useful for finding surface areas that are potentially dangerous due to the risk of radioactive contamination. The results of sim...

  15. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    Science.gov (United States)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory, and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction, with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP and has attributes of universality, as it holds for a broad

  16. Wavelet-based techniques for the gamma-ray sky

    Science.gov (United States)

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias; Lee, Samuel K.

    2016-07-01

    We demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
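
    The scale separation described above can be illustrated with the simplest wavelet, the 1-D Haar transform: decompose, zero the fine-scale coefficients, and reconstruct to keep only large-scale structure. This is a self-contained sketch of the general idea, not the authors' pipeline:

```python
import numpy as np

def haar_decompose(signal, levels):
    """Orthonormal 1-D Haar decomposition: returns (coarse, [details per level])."""
    coarse, details = signal.astype(float), []
    for _ in range(levels):
        a = (coarse[0::2] + coarse[1::2]) / np.sqrt(2)   # local averages
        d = (coarse[0::2] - coarse[1::2]) / np.sqrt(2)   # local differences
        details.append(d)
        coarse = a
    return coarse, details

def haar_reconstruct(coarse, details):
    for d in reversed(details):
        up = np.empty(2 * coarse.size)
        up[0::2] = (coarse + d) / np.sqrt(2)
        up[1::2] = (coarse - d) / np.sqrt(2)
        coarse = up
    return coarse

# a broad "background" plus a narrow "point source"
x = np.linspace(0, 1, 256)
sky = np.sin(np.pi * x)
sky[128] += 5.0

coarse, details = haar_decompose(sky, 4)
exact = haar_reconstruct(coarse, details)            # lossless round trip
# zeroing the fine-scale details suppresses the point source, keeping the background
smooth = haar_reconstruct(coarse, [np.zeros_like(d) for d in details])
```

    Keeping only coefficients at selected scales is the same mechanism that lets a wavelet analysis separate extended emission from point-source populations.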

  18. EFFECTIVENESS OF TEST CASE PRIORITIZATION TECHNIQUES BASED ON REGRESSION TESTING

    Directory of Open Access Journals (Sweden)

    Thillaikarasi Muthusamy

    2014-12-01

    Full Text Available Regression testing concentrates on finding defects after a major code change has occurred; specifically, it exposes software regressions, or old bugs that have reappeared. It is an expensive testing process that has been estimated to account for almost half of the cost of software maintenance. To improve the regression testing process, test case prioritization techniques organize the execution order of test cases. Further, prioritization gives an improved rate of fault detection when test suites cannot run to completion.

  19. FUZZY ENTROPY BASED OPTIMAL THRESHOLDING TECHNIQUE FOR IMAGE ENHANCEMENT

    OpenAIRE

    U.Sesadri; B. Siva Sankar; C. Nagaraju

    2015-01-01

    Soft computing is likely to play a progressively important role in many applications, including image enhancement. The paradigm for soft computing is the human mind. The soft computing critique has been particularly strong with fuzzy logic. Fuzzy logic represents facts as rules for the management of uncertainty. In this paper the multi-dimensional optimization problem is addressed by discussing optimal thresholding using fuzzy entropy for image enhancement. This technique is compared with bi...

  20. Failure Mechanism of Rock Bridge Based on Acoustic Emission Technique

    OpenAIRE

    Guoqing Chen; Yan Zhang; Runqiu Huang; Fan Guo; Guofeng Zhang

    2015-01-01

    The acoustic emission (AE) technique is widely used in various fields as a reliable nondestructive examination technology. Two experimental tests were carried out in a rock mechanics laboratory: (1) small-scale direct shear tests of rock bridges with different lengths and (2) a large-scale landslide model with a locked section. The relationship between AE event count and record time was analyzed during the tests. The AE source location technology and comparative analysis with its actual failu...

  1. The Influence of an Extensive Inquiry-Based Field Experience on Pre-Service Elementary Student Teachers' Science Teaching Beliefs

    Science.gov (United States)

    Bhattacharyya, Sumita; Volk, Trudi; Lumpe, Andrew

    2009-06-01

    This study examined the effects of an extensive inquiry-based field experience on pre-service elementary teachers’ personal agency beliefs, a composite measure of context beliefs and capability beliefs related to teaching science. The research combined quantitative and qualitative approaches and included an experimental group that utilized the inquiry method and a control group that used traditional teaching methods. Pre- and post-test scores for the experimental and control groups were compared. The context beliefs of both groups showed no significant change as a result of the experience. However, the control group’s capability belief scores, lower than those of the experimental group to start with, declined significantly; the experimental group’s scores remained unchanged. Thus, the inquiry-based field experience led to an increase in personal agency beliefs. The qualitative data suggested a new hypothesis that there is a spiral relationship among teachers’ ability to establish communicative relationships with students, desire for personal growth and improvement, ability to implement multiple instructional strategies, and possession of substantive content knowledge. The study concludes that inquiry-based student teaching should be encouraged in the training of elementary school science teachers. However, the meaning and practice of the inquiry method should be clearly delineated to ensure its correct implementation in the classroom.

  2. Activities of colistin- and minocycline-based combinations against extensive drug resistant Acinetobacter baumannii isolates from intensive care unit patients

    Directory of Open Access Journals (Sweden)

    Li Jian

    2011-04-01

    Full Text Available Abstract Background Extensive drug resistance of Acinetobacter baumannii is a serious problem in the clinical setting. It is therefore important to find active antibiotic combinations that could be effective in the treatment of infections caused by this problematic 'superbug'. In this study, we analyzed the in vitro activities of three colistin-based combinations and a minocycline-based combination against clinically isolated extensively drug-resistant Acinetobacter baumannii (XDR-AB) strains. Methods Fourteen XDR-AB clinical isolates were collected. The clonotypes were determined by polymerase chain reaction-based fingerprinting. Susceptibility testing was carried out according to the standards of the Clinical and Laboratory Standards Institute. Activities of drug combinations were investigated against four selected strains and analyzed by mean survival time over 12 hours (MST12 h) in a time-kill study. Results The time-kill studies indicated that colistin at the minimum inhibitory concentration (MIC, 0.5 or 0.25 μg/mL) completely killed all strains at 2 to 4 hours, but 0.5×MIC colistin showed no bactericidal activity. Meropenem (8 μg/mL), minocycline (1 μg/mL) and rifampicin (0.06 μg/mL) did not show bactericidal activity either. However, combinations of colistin at 0.5×MIC (0.25 or 0.125 μg/mL) with each of the above were synergistic and showed bactericidal activity against all test isolates. A combination of meropenem (16 μg/mL) with minocycline (0.5×MIC, 4 or 2 μg/mL) was synergistic against all test isolates, although neither showed bactericidal activity alone. The MST12 h values of the drug combinations (either colistin- or minocycline-based) were significantly shorter than those of the single drugs (p Conclusions This study indicates that combinations of colistin/meropenem, colistin/rifampicin, colistin/minocycline and minocycline/meropenem are synergistic in vitro against XDR-AB strains.

  3. A Novel Threshold Estimation Based Face Recognition Technique

    Directory of Open Access Journals (Sweden)

    Aparna Tiwari

    2015-12-01

    Full Text Available In recent times biometric-based authentication has gained a lot of attention due to its advantages. Traditional approaches are PIN- or password-based, which are no longer so secure due to hacking; moreover, a password or PIN can be stolen. To counteract such problems, a biometric identifier that is unique to each user can be used. In this context, face and fingerprint are the most preferred biometrics. In this paper, Principal Component Analysis (PCA), an algorithm well known in face recognition, is discussed. The method is based on eigenvalues and eigenvectors: only the principal components are retained, and components far from the principal axes are discarded, so the dimensionality reduces significantly. A major problem with biometrics is the selection of a proper threshold, which is in general chosen heuristically. In this paper a formula based on the eigenvalues is derived, and the obtained results improve significantly.
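
    A minimal sketch of the PCA (eigenface) recognition pipeline described above, with random arrays standing in for face images and a nearest-neighbour match rejected beyond a distance threshold; the paper's eigenvalue-based threshold formula is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "face" data: 20 flattened images of 64 pixels (stand-in data)
faces = rng.normal(size=(20, 64))
mean_face = faces.mean(axis=0)
A = faces - mean_face

# principal components via SVD of the centred data; rows of Vt are eigenfaces
U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 5
eigenfaces = Vt[:k]                           # keep only k principal components

def project(img):
    """Project an image onto the eigenface subspace: a k-dim feature vector."""
    return (img - mean_face) @ eigenfaces.T

def recognize(img, gallery_feats, threshold):
    """Nearest-neighbour match in eigenface space; reject if beyond threshold."""
    d = np.linalg.norm(gallery_feats - project(img), axis=1)
    best = int(d.argmin())
    return best if d[best] <= threshold else -1

gallery = np.vstack([project(f) for f in faces])
match = recognize(faces[7], gallery, threshold=1e-6)
```

    The threshold decides whether the closest gallery entry is accepted as a genuine match or the probe is rejected as unknown, which is exactly the parameter the paper derives from the eigenvalues.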

  4. Estimations of One Repetition Maximum and Isometric Peak Torque in Knee Extension Based on the Relationship Between Force and Velocity.

    Science.gov (United States)

    Sugiura, Yoshito; Hatanaka, Yasuhiko; Arai, Tomoaki; Sakurai, Hiroaki; Kanada, Yoshikiyo

    2016-04-01

    We aimed to investigate whether a linear regression formula based on the relationship between joint torque and angular velocity measured using a high-speed video camera and image measurement software is effective for estimating 1 repetition maximum (1RM) and isometric peak torque in knee extension. Subjects comprised 20 healthy men (mean ± SD; age, 27.4 ± 4.9 years; height, 170.3 ± 4.4 cm; and body weight, 66.1 ± 10.9 kg). The exercise load ranged from 40% to 150% 1RM. Peak angular velocity (PAV) and peak torque were used to estimate 1RM and isometric peak torque. To elucidate the relationship between force and velocity in knee extension, the relationship between the relative proportion of 1RM (% 1RM) and PAV was examined using simple regression analysis. The concordance rate between the estimated value and actual measurement of 1RM and isometric peak torque was examined using intraclass correlation coefficients (ICCs). Reliability of the regression line of PAV and % 1RM was 0.95. The concordance rate between the actual measurement and estimated value of 1RM resulted in an ICC(2,1) of 0.93 and that of isometric peak torque had an ICC(2,1) of 0.87 and 0.86 for 6 and 3 levels of load, respectively. Our method for estimating 1RM was effective for decreasing the measurement time and reducing patients' burden. Additionally, isometric peak torque can be estimated using 3 levels of load, as we obtained the same results as those reported previously. We plan to expand the range of subjects and examine the generalizability of our results. PMID:26382131
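
    One common way to operationalize such a force-velocity estimate is sketched below: measure peak velocity at several submaximal loads, fit a linear load-velocity relation, and extrapolate to an assumed minimal velocity at a true 1RM effort. All numbers, including the minimal-velocity value, are illustrative and are not the paper's regression constants:

```python
import numpy as np

# load-velocity profile: absolute test loads (kg) and measured peak velocities
loads = np.array([20.0, 30.0, 40.0, 50.0])     # submaximal loads
pav = np.array([5.2, 4.1, 3.0, 1.9])           # peak angular velocity, rad/s (toy)

# fit load as a linear function of velocity: load = a + b * velocity
b, a = np.polyfit(pav, loads, 1)               # polyfit returns slope first

v_min = 0.2                                     # assumed velocity at a 1RM effort
one_rm = a + b * v_min                          # extrapolated 1RM estimate
```

    Because only a few submaximal trials are needed, this is how a regression-based estimate reduces measurement time and patient burden compared with an actual maximal lift.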

  6. Implementation of Agent Extension Based on NET-SNMP

    Institute of Scientific and Technical Information of China (English)

    黄岩涛; 阮军洲

    2014-01-01

    Aiming at the efficient management of different devices in a network, this paper briefly introduces NET-SNMP technology. NET-SNMP is an open-source software package; choosing NET-SNMP for developing a network management system is convenient both for migration and for agent extension. The paper introduces the operating principle of the NET-SNMP development tool, and describes in detail the process of, and precautions for, extending an agent with the NET-SNMP development tool in a Linux environment. The agent software uses a modular structure, so the supported management information base modules can be extended according to requirements in order to realize new applications. The conclusion is that NET-SNMP technology is an effective and feasible way to implement remote management functions.

  7. FP-Growth Based New Normalization Technique for Subgraph Ranking

    Directory of Open Access Journals (Sweden)

    E.R.Naganathan

    2011-03-01

    Full Text Available Various problems in large-volume data have been solved using frequent-itemset discovery algorithms. As data mining techniques are introduced and widely applied to non-traditional itemsets, existing approaches for finding frequent itemsets cannot satisfy the requirements of these domains. Hence, an alternative way of modeling the objects in such data sets is the graph. Modeling objects using graphs allows us to represent arbitrary relations among entities, and the graph is used to model the database objects. Within that model, the problem of finding frequent patterns becomes that of finding subgraphs that occur frequently over the entire set of graphs. In this paper, we present an efficient algorithm for ranking such frequent subgraphs. The proposed ranking method is applied to the FP-growth method for discovering frequent subgraphs. In order to rank the subgraphs, we present a new normalization technique: a modified normalization applied at each position for a chosen value of the Discounted Cumulative Gain (DCG) of a subgraph. In place of the DCG, a modified approach called the Modified Discounted Cumulative Gain (MDCG), calculated using "lift", is introduced. The MDCG alone cannot be used to compare performance from one query to the next in a search engine's algorithm; to obtain the new normalization, an ideal ordering of the MDCG (IMDCG) at each position is found. Finally, the values for all rules can be averaged to give the average performance of the ranking algorithm, and the ordering of the obtained values at each position provides the order of evaluation of rules, which in turn gives an efficient ranking of mined subgraphs.
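
    For reference, the standard Discounted Cumulative Gain and its ideal-ordering normalization are sketched below; the paper's MDCG replaces the relevance scores with "lift" values, and its exact modification is not reproduced here:

```python
import math

def dcg(scores):
    """Discounted cumulative gain of a ranked list of relevance scores."""
    return sum(s / math.log2(i + 2) for i, s in enumerate(scores))

def ndcg(scores):
    """Normalise DCG by the ideal (descending) ordering, as the IMDCG step does."""
    ideal = dcg(sorted(scores, reverse=True))
    return dcg(scores) / ideal if ideal > 0 else 0.0

ranked = [3.0, 1.0, 2.0]      # e.g. lift values of mined subgraphs, in rank order
score = ndcg(ranked)          # 1.0 only if the ranking is already ideal
```

    Normalizing by the ideal ordering is what makes scores comparable across queries (or here, across sets of mined subgraphs) of different sizes and score ranges.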

  8. A Robust Non-Blind Watermarking Technique for Color Video Based on Combined DWT-DFT Transforms and SVD Technique

    Directory of Open Access Journals (Sweden)

    Nandeesh B

    2014-08-01

    Full Text Available The popularity of digital video has risen tremendously in the past decade, leading to malicious copying and distribution, so the need to preserve ownership and tackle copyright issues has become imminent. Digital video watermarking exists as a solution to this problem. The paper proposes a non-blind watermarking technique based on combined DWT-DFT transforms using the singular values of the SVD matrix in the YCbCr color space. The technique uses the Fibonacci series for the selection of frames to enhance security while maintaining the quality of the original video. Watermark encryption is done by scrambling the watermark using the Arnold transform. Geometric and non-geometric attacks on the watermarked video have been performed to test the robustness of the proposed technique. The quality of the watermarked video is measured using PSNR, and NC gives the similarity between the extracted and the original watermark.
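
    The Arnold transform used above for watermark scrambling is the cat map (x, y) → (x + y, x + 2y) mod N on a square image. The map is periodic, so the watermark is recovered by iterating up to its period. A small sketch, with grey values standing in for the watermark:

```python
import numpy as np

def arnold(img, iterations=1):
    """Arnold cat-map scramble of a square N x N image:
    pixel (x, y) moves to ((x + y) mod N, (x + 2y) mod N)."""
    N = img.shape[0]
    out = img
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(N):
            for y in range(N):
                scrambled[(x + y) % N, (x + 2 * y) % N] = out[x, y]
        out = scrambled
    return out

# for N = 4 the map has period 3, so 1 + 2 iterations restore the original
w = np.arange(16).reshape(4, 4)
scrambled = arnold(w, 1)
restored = arnold(scrambled, 2)
```

    The period depends on N, so in practice the iteration count acts as part of the secret key: without it, the embedded watermark stays scrambled.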

  9. Fast Multigrid Techniques in Total Variation-Based Image Reconstruction

    Science.gov (United States)

    Oman, Mary Ellen

    1996-01-01

    Existing multigrid techniques are used to effect an efficient method for reconstructing an image from noisy, blurred data. Total Variation minimization yields a nonlinear integro-differential equation which, when discretized using cell-centered finite differences, yields a full matrix equation. A fixed point iteration is applied with the intermediate matrix equations solved via a preconditioned conjugate gradient method which utilizes multi-level quadrature (due to Brandt and Lubrecht) to apply the integral operator and a multigrid scheme (due to Ewing and Shen) to invert the differential operator. With effective preconditioning, the method presented seems to require O(n) operations. Numerical results are given for a two-dimensional example.

  10. Voltage Stabilizer Based on SPWM technique Using Microcontroller

    Directory of Open Access Journals (Sweden)

    K. N. Tarchanidis

    2013-01-01

    Full Text Available This paper presents an application of the well-known SPWM technique in a voltage stabilizer using a microcontroller. The stabilizer is of the AC/DC/AC type: the system rectifies the input AC voltage to a suitable DC level, and the intelligent control of an embedded microcontroller regulates the pulse width of the output voltage in order to produce, through a filter, a near-perfect sinusoidal AC voltage. The control program on the microcontroller can change the FET transistor firing in order to compensate for any input voltage variation. The software, using the microcontroller's interrupts, achieves concurrency in the running program.
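
    The SPWM principle behind the stabilizer compares a sinusoidal reference against a high-frequency triangular carrier; the comparator output drives the FET gates. A sketch with assumed frequencies (the paper's microcontroller timings are not given):

```python
import numpy as np

fs = 20000                        # samples per second (simulation resolution)
f_ref, f_carrier = 50, 1000       # 50 Hz output, 1 kHz switching (assumed)
t = np.arange(0, 0.02, 1 / fs)    # one reference period

reference = 0.8 * np.sin(2 * np.pi * f_ref * t)       # modulation index 0.8
# triangular carrier in [-1, 1]
carrier = 2 / np.pi * np.arcsin(np.sin(2 * np.pi * f_carrier * t))
gate = (reference > carrier).astype(int)              # FET gate drive signal

duty = gate.mean()   # over a full period the average duty cycle is near 50 %
```

    After low-pass filtering, the local average of `gate` tracks the sinusoidal reference, which is how the stabilizer synthesizes a clean AC output; changing the reference amplitude compensates input voltage variation.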

  11. GPU-Based Techniques for Global Illumination Effects

    CERN Document Server

    Szirmay-Kalos, László; Sbert, Mateu

    2008-01-01

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. This book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make this book self-contained, the most important c

  12. HPS Electronic Ballast Based on CIC-CPPFC Technique

    Institute of Scientific and Technical Information of China (English)

    王卫; 苏勤; 高国安

    2002-01-01

    Investigates the application of CIC-CPPFC techniques to high-pressure sodium (HPS) lamp electronic ballast. In order to ensure a unity power factor, different power electronic ballasts are studied by PSpice simulation. A dynamic model of the HPS lamp with simple and accurate features is proposed for further study of its characteristics. Experimental results verify the feasibility of HPS lamp operation at high frequency. It is shown that the presented electronic ballast has a 0.99 power factor and 9% total harmonic distortion (THD).

  13. Hiding of Speech based on Chaotic Steganography and Cryptography Techniques

    Directory of Open Access Journals (Sweden)

    Abbas Salman Hameed

    2015-04-01

    Full Text Available Steganography is the technique of embedding secret information into cover media, like image, video, audio and text, so that only the sender and the authorized recipient who have a key can detect the presence of the secret information. In this paper steganography and cryptography techniques for speech are presented with chaos. Fractional-order Lorenz and Chua systems, which provide an expanded key space, are used to encrypt the speech message. The large key space, in addition to the randomness and nonlinearity properties possessed by these chaotic systems, ensures high robustness and security for the cryptography process. As well, a Modified Arnold Cat Map (MACM) offers additional space and security for the steganography process. The irregular outputs of the MACM are used in this paper to embed the secret message in a digital cover image. The results show a large sensitivity to a small change in the secret key or the parameters of the MACM; therefore, highly secure hiding of speech is guaranteed by using this system.

  14. A novel fast full inversion based breast ultrasound elastography technique

    International Nuclear Information System (INIS)

    Cancer detection and classification have been the focus of many imaging and therapeutic research studies. Elastography is a non-invasive technique for visualizing suspicious soft tissue areas, in which tissue stiffness is used as the image contrast mechanism. In this study, a breast ultrasound elastography system, including software and hardware, is proposed. Unlike current elastography systems, which image tissue strain and present it as an approximation of relative tissue stiffness, this system is capable of quickly imaging the breast's absolute Young's modulus. To improve the quality of elastography images, a novel system consisting of two load cells has been attached to the ultrasound probe. The load cells measure the breast surface forces, which are used to calculate the tissue stress distribution throughout the breast. To facilitate fast imaging, this stress calculation is conducted by an accelerated finite element method. Acquired tissue displacements and surface force data are used as input to the proposed Young's modulus reconstruction technique. Numerical and tissue-mimicking phantom studies were conducted to validate the proposed system. These studies indicated that fast imaging of the breast tissue's absolute Young's modulus using the proposed ultrasound elastography system is feasible. The tissue-mimicking phantom study indicated that the system is capable of providing reliable absolute Young's modulus values for both normal tissue and tumour, as the maximum Young's modulus reconstruction error was less than 6%. This demonstrates that the proposed system has good potential for use in clinical breast cancer assessment. (paper)

  15. Computer-vision-based registration techniques for augmented reality

    Science.gov (United States)

    Hoff, William A.; Nguyen, Khoi; Lyon, Torsten

    1996-10-01

    Augmented reality is a term used to describe systems in which computer-generated information is superimposed on top of the real world; for example, through the use of a see-through head-mounted display. A human user of such a system could still see and interact with the real world, but have valuable additional information, such as descriptions of important features or instructions for performing physical tasks, superimposed on the world. For example, the computer could identify objects and overlay them with graphic outlines, labels, and schematics. The graphics are registered to the real-world objects and appear to be 'painted' onto those objects. Augmented reality systems can be used to make productivity aids for tasks such as inspection, manufacturing, and navigation. One of the most critical requirements for augmented reality is to recognize and locate real-world objects with respect to the person's head; accurate registration is necessary in order to overlay graphics accurately on top of the real-world objects. At the Colorado School of Mines, we have developed a prototype augmented reality system that uses head-mounted cameras and computer vision techniques to accurately register the head to the scene. The current system locates and tracks a set of pre-placed passive fiducial targets on the real-world objects. The system computes the pose of the objects and displays graphics overlays using a see-through head-mounted display. This paper describes the architecture of the system and outlines the computer vision techniques used.

  16. Bioluminescence-based imaging technique for pressure measurement in water

    Science.gov (United States)

    Watanabe, Yasunori; Tanaka, Yasufumi

    2011-07-01

    The dinoflagellate Pyrocystis lunula emits light in response to water motion. We developed a new imaging technique for measuring pressure using plankton that emits light in response to mechanical stimulation. The bioluminescence emitted by P. lunula was used to measure impact water pressure produced using weight-drop tests. The maximum mean luminescence intensity correlated with the maximum impact pressure that the cells receive when the circadian and diurnal biological rhythms are appropriately controlled. Thus, with appropriate calibration of experimentally determined parameters, the dynamic impact pressure can be estimated by measuring the cell-flash distribution. Statistical features of the evolution of flash intensity and the probability distribution during the impacting event, which are described by both biological and mechanical response parameters, are also discussed in this paper. The practical applicability of this bioluminescence imaging technique is examined through a water drop test. The maximum dynamic pressure, occurring at the impact of a water jet against a wall, was estimated from the flash intensity of the dinoflagellate.

  17. On-Line Hydrogen-Isotope Measurements of Organic Samples Using Elemental Chromium : An Extension for High Temperature Elemental-Analyzer Techniques

    NARCIS (Netherlands)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B.; Meijer, Harro A. J.; Brand, Willi A.; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry.

  18. Study of systems and techniques for data base management

    Science.gov (United States)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low cost computer systems for information retrieval and analysis, the testing of minicomputer based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  19. Surgical technique for repair of complex anterior skull base defects

    Directory of Open Access Journals (Sweden)

    Kevin Reinard

    2015-01-01

    Conclusion: The layered reconstruction of large anterior cranial fossa defects resulted in postoperative CSF leak in only 5% of the patients and represents a simple and effective closure option for skull base surgeons.

  20. A Secured Communication Based On Knowledge Engineering Technique

    Directory of Open Access Journals (Sweden)

    M. W. Youssef

    2012-10-01

    Full Text Available Communication security has become the keynote of the "e" world. Industries like eComm and eGov were built on the technology of computer networks. Those industries cannot afford security breaches. This paper presents a methodology for securing computer communication by identifying the typical communication behavior of each system user, based on the dominant set of protocols utilized between the network nodes.

  1. A fast Stokes inversion technique based on quadratic regression

    Science.gov (United States)

    Teng, Fei; Deng, Yuan-Yong

    2016-05-01

    Stokes inversion calculation is a key process in resolving polarization information on radiation from the Sun and obtaining the associated vector magnetic fields. Even in the cases of simple local thermodynamic equilibrium (LTE) and where the Milne-Eddington approximation is valid, the inversion problem may not be easy to solve. The initial values for the iterations are important in handling the case with multiple minima. In this paper, we develop a fast inversion technique without iterations. The time taken for computation is only 1/100 the time that the iterative algorithm takes. In addition, it can provide available initial values even in cases with lower spectral resolutions. This strategy is useful for a filter-type Stokes spectrograph, such as SDO/HMI and the developed two-dimensional real-time spectrograph (2DS).
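    The core idea of an iteration-free inversion can be illustrated with a toy example: sample the misfit function on a coarse grid, fit a quadratic by least squares, and read the minimum directly off the vertex formula x* = -b/(2a). The sketch below uses hypothetical data and shows only the principle, not the authors' actual Stokes inversion code:

```python
import numpy as np

def quadratic_minimum(x, chi2):
    """Fit chi2 ~ a*x^2 + b*x + c by least squares and return the vertex -b/(2a)."""
    a, b, c = np.polyfit(x, chi2, 2)
    return -b / (2.0 * a)

# Sample a misfit curve on a coarse grid -- no iterative refinement needed.
x = np.linspace(-1.0, 1.0, 11)
chi2 = 3.0 * (x - 0.25) ** 2 + 0.1   # true minimum at x = 0.25
print(quadratic_minimum(x, chi2))    # ≈ 0.25
```

Because the minimum is obtained in closed form, the cost is one small least-squares fit per pixel, which is the kind of speed-up (and iteration-free initial value) the abstract describes.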

  2. Tornado wind-loading requirements based on risk assessment techniques

    International Nuclear Information System (INIS)

    Regulations require that nuclear power plants be protected from tornado winds. If struck by a tornado, a plant must be capable of safely shutting down and removing decay heat. Probabilistic techniques are used to show that risk to the public from the US Department of Energy (DOE) SP-100 reactor is acceptable without tornado hardening parts of the secondary system. Relaxed requirements for design wind loadings will result in significant cost savings. To demonstrate an acceptable level of risk, this document examines tornado-initiated accidents. The two tornado-initiated accidents examined in detail are loss of cooling resulting in core damage and loss of secondary system boundary integrity leading to sodium release. Loss of core cooling is analyzed using fault/event tree models. Loss of secondary system boundary integrity is analyzed by comparing the consequences to acceptance criteria for the release of radioactive material or alkali metal aerosol

  4. Quartile Clustering: A quartile based technique for Generating Meaningful Clusters

    CERN Document Server

    Goswami, Saptarsi

    2012-01-01

    Clustering is one of the main tasks in exploratory data analysis and descriptive statistics, where the main objective is partitioning observations into groups. Clustering has a broad range of applications in varied domains like climate, business, information retrieval, biology, and psychology, to name a few. A variety of methods and algorithms have been developed for clustering tasks in the last few decades. We observe that most of these algorithms define a cluster in terms of attribute values, density, distance, etc. However, these definitions fail to attach a clear meaning/semantics to the generated clusters. We argue that clusters having understandable and distinct semantics defined in terms of quartiles/halves are more appealing to business analysts than clusters defined by data boundaries or prototypes. On the same premise, we propose our new algorithm, named the quartile clustering technique. Through a series of experiments we establish the efficacy of this algorithm. We demonstrate that the quartile clusteri...
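    A rough sketch of the quartile idea (illustrative only, not the authors' algorithm) is to label each observation by the quartile its value falls into on every attribute, so a cluster reads directly as, e.g., "bottom quartile on both attributes":

```python
import numpy as np

def quartile_labels(X):
    """Assign each row a cluster label: for every attribute, the quartile
    (0-3) its value falls into; the label is the tuple of quartile indices."""
    X = np.asarray(X, dtype=float)
    labels = []
    for j in range(X.shape[1]):
        q1, q2, q3 = np.percentile(X[:, j], [25, 50, 75])
        labels.append(np.searchsorted([q1, q2, q3], X[:, j]))
    return [tuple(row) for row in np.column_stack(labels)]

data = [[1, 10], [2, 20], [3, 30], [4, 40]]
print(quartile_labels(data))   # [(0, 0), (1, 1), (2, 2), (3, 3)]
```

Unlike prototype-based labels, each tuple here has a fixed business-readable meaning independent of the data boundaries of any particular run.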

  5. Innovative instrumentation for VVERs based on non-invasive techniques

    International Nuclear Information System (INIS)

    Nuclear power plants such as VVERs can greatly benefit from innovative instrumentation to improve plant safety and efficiency. In recent years innovative instrumentation has been developed for PWRs with the aim of providing additional measurements of physical parameters on the primary and secondary circuits: the addition of new instrumentation is made possible by using non-invasive techniques such as ultrasonics and radiation detection. These innovations can be adapted for upgrading VVERs presently in operation and also in future VVERs. The following innovative instrumentation for the control, monitoring or testing at VVERs is described: 1. instrumentation for more accurate primary side direct measurements (for a better monitoring of the primary circuit); 2. instrumentation to monitor radioactivity leaks (for a safer plant); 3. instrumentation-related systems to improve the plant efficiency (for a cheaper kWh)

  6. Adaptive Landmark-Based Navigation System Using Learning Techniques

    DEFF Research Database (Denmark)

    Zeidan, Bassel; Dasgupta, Sakyasingha; Wörgötter, Florentin;

    2014-01-01

    The goal-directed navigational ability of animals is an essential prerequisite for them to survive. They can learn to navigate to a distal goal in a complex environment. During this long-distance navigation, they exploit environmental features, like landmarks, to guide them towards their goal. Inspired by this, we develop an adaptive landmark-based navigation system based on sequential reinforcement learning. In addition, correlation-based learning is also integrated into the system to improve learning performance. The proposed system has been applied to simulated simple wheeled and more complex hexapod robots. As a result, it allows the robots to successfully learn to navigate to distal goals in complex environments.

  7. Wavelet packet transform-based robust video watermarking technique

    Indian Academy of Sciences (India)

    Gaurav Bhatnagar; Balasubramanian Raman

    2012-06-01

    In this paper, a wavelet packet transform (WPT)-based robust video watermarking algorithm is proposed. A visible meaningful binary image is used as the watermark. First, sequential frames are extracted from the video clip. Then, WPT is applied on each frame and from each orientation one sub-band is selected based on block mean intensity value, called the robust sub-band. The watermark is embedded in the robust sub-bands based on the relationship between a wavelet packet coefficient and its 8-neighbour $(D_8)$ coefficients, considering robustness and invisibility. Experimental results and comparison with existing algorithms show the robustness and the better performance of the proposed algorithm.
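    The embedding rule based on a coefficient's relation to its 8 neighbours can be sketched in isolation. The pure-NumPy toy below works on a generic coefficient array; the actual scheme operates on WPT sub-bands selected by block mean intensity, which is omitted here, and the margin value is a made-up parameter:

```python
import numpy as np

def embed_bit(coeffs, i, j, bit, margin=2.0):
    """Force coefficient (i, j) above (bit=1) or below (bit=0) the mean of
    its 8 neighbours; the sign of the difference carries the bit."""
    nb = coeffs[i - 1:i + 2, j - 1:j + 2].copy()
    nb[1, 1] = np.nan                     # exclude the centre itself
    m = np.nanmean(nb)
    coeffs[i, j] = m + margin if bit else m - margin
    return coeffs

def extract_bit(coeffs, i, j):
    nb = coeffs[i - 1:i + 2, j - 1:j + 2].copy()
    nb[1, 1] = np.nan
    return int(coeffs[i, j] > np.nanmean(nb))

c = np.arange(25, dtype=float).reshape(5, 5)
c = embed_bit(c, 2, 2, 1)
print(extract_bit(c, 2, 2))  # 1
```

Because detection only compares a coefficient to its local neighbourhood, the mark survives global intensity changes that shift the coefficient and its neighbours together, which is one reason such relational rules are favoured for robustness.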

  8. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600-750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis, and a fluorometric discrimination technique for determining phytoplankton population was developed. For simulated laboratory mixtures of the 43 algal species (the algae of one division accounted for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the division level were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for samples mixed from 32 red tide algal species (the dominant species accounted for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the genus level were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the division and genus levels, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algae fluorescence auto-analyzer for

  9. Efficient Identification Using a Prime-Feature-Based Technique

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Haq, Shaiq A.; Valente, Andrea

    2011-01-01

    , which are called minutiae points. Designing a reliable automatic fingerprint matching algorithm for minimal platform is quite challenging. In real-time systems, efficiency of the matching algorithm is of utmost importance. To achieve this goal, a prime-feature-based indexing algorithm is proposed in...

  10. Customer requirements based ERP customization using AHP technique

    NARCIS (Netherlands)

    Parthasarathy, S.; Daneva, Maya

    2014-01-01

    Purpose– Customization is a difficult task for many organizations implementing enterprise resource planning (ERP) systems. The purpose of this paper is to develop a new framework based on customers’ requirements to examine the ERP customization choices for the enterprise. The analytical hierarchy process (AHP)

  11. Problem-Based Learning Supported by Semantic Techniques

    Science.gov (United States)

    Lozano, Esther; Gracia, Jorge; Corcho, Oscar; Noble, Richard A.; Gómez-Pérez, Asunción

    2015-01-01

    Problem-based learning has been applied over the last three decades to a diverse range of learning environments. In this educational approach, different problems are posed to the learners so that they can develop different solutions while learning about the problem domain. When applied to conceptual modelling, and particularly to Qualitative…

  12. Orientation precision of TEM-based orientation mapping techniques

    Energy Technology Data Exchange (ETDEWEB)

    Morawiec, A., E-mail: nmmorawi@cyf-kr.edu.pl [Institute of Metallurgy and Materials Science, Polish Academy of Sciences, Kraków (Poland); Bouzy, E. [Laboratoire d' Etude des Microstructures et de Mécanique des Matériaux, Université de Metz, Metz (France); Paul, H. [Institute of Metallurgy and Materials Science, Polish Academy of Sciences, Kraków (Poland); Fundenberger, J.J. [Laboratoire d' Etude des Microstructures et de Mécanique des Matériaux, Université de Metz, Metz (France)

    2014-01-15

    Automatic orientation mapping is an important addition to standard capabilities of conventional transmission electron microscopy (TEM) as it facilitates investigation of crystalline materials. A number of different such mapping systems have been implemented. One of their crucial characteristics is the orientation resolution. The precision in determination of orientations and misorientations reached in practice by TEM-based automatic mapping systems is the main subject of the paper. The analysis is focused on two methods: first, using spot diffraction patterns and ‘template matching’, and second, using Kikuchi patterns and detection of reflections. In simple terms, for typical mapping conditions, their precisions in orientation determination with the confidence of 95% are, respectively, 1.1° and 0.3°. The results are illustrated by example maps of cellular structure in deformed Al, the case for which high orientation sensitivity matters. For more direct comparison, a novel approach to mapping is used: the same patterns are solved by each of the two methods. Proceeding from a classification of the mapping systems, the obtained results may serve as indicators of precisions of other TEM-based orientation mapping methods. The findings are of significance for selection of methods adequate to investigated materials. - Highlights: • Classification of the existing TEM-based orientation mapping systems. • Reliable data on orientation precision in TEM-based orientation maps. • Orientation precisions in spot and Kikuchi based maps estimated to be 1.1° and 0.3°. • New method of mapping by using spot and Kikuchi components of the same patterns.

  13. Calculation of free fall trajectories based on numerical optimization techniques

    Science.gov (United States)

    1972-01-01

    The development of a means of computing free-fall (nonthrusting) trajectories from one specified point in the solar system to another specified point in the solar system in a given amount of time was studied. The problem is that of solving a two-point boundary value problem for which the initial slope is unknown. Two standard methods of attack exist for solving two-point boundary value problems. The first method is known as the initial value or shooting method. The second method of attack for two-point boundary value problems is to approximate the nonlinear differential equations by an appropriate linearized set. Parts of both boundary value problem solution techniques described above are used. A complete velocity history is guessed such that the corresponding position history satisfies the given boundary conditions at the appropriate times. An iterative procedure is then followed until the last guessed velocity history and the velocity history obtained from integrating the acceleration history agree to some specified tolerance everywhere along the trajectory.
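    The "guess, integrate, compare, correct" loop described above is the classic shooting method. A minimal sketch on a toy boundary value problem (y'' = -y with y(0) = 0, y(π/2) = 1, whose exact initial slope is 1) illustrates the iteration on the unknown initial slope; this is illustrative only, not the study's trajectory code:

```python
import numpy as np

def integrate(s, t_end=np.pi / 2, n=1000):
    """RK4 integration of y'' = -y from y(0)=0, y'(0)=s; returns y(t_end)."""
    h = t_end / n
    y, v = 0.0, s
    f = lambda y, v: (v, -y)          # first-order system (y' = v, v' = -y)
    for _ in range(n):
        k1 = f(y, v)
        k2 = f(y + h / 2 * k1[0], v + h / 2 * k1[1])
        k3 = f(y + h / 2 * k2[0], v + h / 2 * k2[1])
        k4 = f(y + h * k3[0], v + h * k3[1])
        y += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        v += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return y

def shoot(target=1.0, s0=0.5, s1=2.0, tol=1e-10):
    """Secant iteration on the unknown initial slope until y(t_end) hits target."""
    f0, f1 = integrate(s0) - target, integrate(s1) - target
    while abs(f1) > tol:
        s0, s1 = s1, s1 - f1 * (s1 - s0) / (f1 - f0)
        f0, f1 = f1, integrate(s1) - target
    return s1

print(shoot())  # ≈ 1.0 (the exact solution y = sin t has y'(0) = 1)
```

The trajectory problem in the study replaces the scalar slope with a full guessed velocity history and iterates until the integrated positions match the boundary conditions to tolerance, but the correction structure is the same.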

  14. Cost-optimal power system extension under flow-based market coupling and high shares of photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Hagspiel, Simeon; Jaegemann, Cosima; Lindenberger, Dietmar [Koeln Univ. (Germany). Inst. of Energy Economics; Cherevatskiy, Stanislav; Troester, Eckehard; Brown, Tom [Energynautics GmbH, Langen (Germany)

    2012-07-01

    Electricity market models, implemented as dynamic programming problems, have been applied widely to identify possible pathways towards a cost-optimal and low carbon electricity system. However, the joint optimization of generation and transmission remains challenging, mainly due to the fact that different characteristics and rules apply to commercial and physical exchanges of electricity in meshed networks. This paper presents a methodology that allows to optimize power generation and transmission infrastructures jointly through an iterative approach based on power transfer distribution factors (PTDFs). As PTDFs are linear representations of the physical load flow equations, they can be implemented in a linear programming environment suitable for large scale problems such as the European power system. The algorithm iteratively updates PTDFs when grid infrastructures are modified due to cost-optimal extension and thus yields an optimal solution with a consistent representation of physical load flows. The method is demonstrated on a simplified three-node model where it is found to be stable and convergent. It is then scaled to the European level in order to find the optimal power system infrastructure development under the prescription of strongly decreasing CO{sub 2} emissions in Europe until 2050 with a specific focus on photovoltaic (PV) power. (orig.)
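    A PTDF matrix of the kind iterated in the paper can be written down directly from the DC load flow equations. The 3-node network below is a hypothetical stand-in (unit susceptances, node 0 as slack), not the paper's European model:

```python
import numpy as np

# Hypothetical 3-node network: lines (0,1), (1,2), (0,2), susceptance 1 p.u. each.
lines = [(0, 1), (1, 2), (0, 2)]
b = np.ones(len(lines))
n = 3
slack = 0

# Branch-node incidence and nodal susceptance matrices of the DC load flow.
A = np.zeros((len(lines), n))
for k, (i, j) in enumerate(lines):
    A[k, i], A[k, j] = 1.0, -1.0
Bf = np.diag(b) @ A                 # branch flows = Bf @ theta
Bbus = A.T @ np.diag(b) @ A         # nodal balance: P = Bbus @ theta

# PTDF: sensitivity of each line flow to a 1 p.u. injection at each non-slack
# node (withdrawn at the slack). Remove the slack row/column before inverting.
keep = [i for i in range(n) if i != slack]
ptdf = np.zeros((len(lines), n))
ptdf[:, keep] = Bf[:, keep] @ np.linalg.inv(Bbus[np.ix_(keep, keep)])

print(np.round(ptdf, 3))
```

For this symmetric triangle, 1 p.u. injected at node 1 sends 2/3 over the direct line to the slack and 1/3 around via node 2, which is the linear flow pattern a PTDF row encodes. In the paper's iterative scheme, this matrix is recomputed whenever the grid (here `lines` and `b`) is modified by a transmission extension.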

  15. Compressive spectrum sensing of radar pulses based on photonic techniques.

    Science.gov (United States)

    Guo, Qiang; Liang, Yunhua; Chen, Minghua; Chen, Hongwei; Xie, Shizhong

    2015-02-23

    We present a photonic-assisted compressive sampling (CS) system which can acquire about 10^6 radar pulses per second spanning from 500 MHz to 5 GHz with a 520-MHz analog-to-digital converter (ADC). A rectangular pulse, a linear frequency modulated (LFM) pulse and a pulse stream are each reconstructed faithfully through this system with a sliding window-based recovery algorithm, demonstrating the feasibility of the proposed photonic-assisted CS system in spectral estimation for radar pulses.
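    The recovery side of a CS system can be sketched with a greedy pursuit. The toy below (random Gaussian measurements, a 3-sparse signal, and plain orthogonal matching pursuit rather than the paper's sliding window algorithm; all sizes are made up) shows sub-Nyquist acquisition and reconstruction:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily add the column most correlated
    with the residual, then re-fit the coefficients on the chosen support."""
    residual, support, coef = y.copy(), [], None
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 256, 64, 3                             # length, measurements, sparsity
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
x = np.zeros(n)
x[[10, 100, 200]] = [2.0, -1.5, 1.0]             # 3-sparse "spectrum"
y = Phi @ x                                      # 64 sub-Nyquist measurements
x_hat = omp(Phi, y, k)
print(np.abs(x_hat - x).max())
```

With only 64 measurements of a length-256 signal, exact support recovery succeeds because the signal is sparse, which is the same trade the photonic front end exploits to cover 5 GHz with a 520-MHz ADC.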

  16. A novel Communication Technique for Nanobots based on acoustic signals

    OpenAIRE

    Loscri, Valeria; Natalizio, Enrico; Mannara, Valentina; Gianluca ALOI

    2012-01-01

    In this work we present the simulation of a swarm of nanobots that behave in a distributed fashion and communicate through vibrations, permitting decentralized control to treat endogenous diseases of the brain. Each nanobot is able to recognize a cancer cell, eliminate it, and announce the presence of the cancer to the other nanobots through a communication based on acoustic signals. We assume that our nano-devices vibrate and these vibrations cause acoustic waves ...

  17. DATA MINING BASED TECHNIQUE FOR IDS ALERT CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    Hany Nashat Gabra

    2015-06-01

    Full Text Available Intrusion detection systems (IDSs have become a widely used measure for security systems. The main problem for such systems is the irrelevant alerts. We propose a data mining based method for classification to distinguish serious and irrelevant alerts with a performance of 99.9%, which is better in comparison with the other recent data mining methods that achieved 97%. A ranked alerts list is also created according to the alert’s importance to minimize human interventions.

  18. Shorter window DFT based technique for fault current filtering

    Energy Technology Data Exchange (ETDEWEB)

    Yu, C.S. [National Defence Univ., Taiwan (China); Lee, S.Y. [Northern Taiwan Inst. of Science and Technology, Taipei, Taiwan (China); Wang, S.C. [Lung Hwa Univ. of Science and Technology, Taoyuan, Taiwan (China); Chen, Y.L. [MingChi Univ. of Technology, Taipei, Taiwan (China)

    2006-07-01

    In the design of computer protection relaying, fault current filtering is one of the most important considerations. To overcome the overreach problem caused by the decaying direct current (DC) component, researchers have focused on finding algorithms to remove this effect. However, for series compensated lines, the algorithms developed for the decaying DC component are not suitable for the subsynchronous frequency component. In addition, several accurate fault location algorithms have been proposed that rely on an accurate fundamental frequency phasor, but slow convergence severely degrades the accuracy and response time of the relaying scheme. An accurate fundamental frequency phasor is therefore essential. This paper presented a damping filter design based on a reiterative discrete Fourier transform (DFT) algorithm for fault current filtering in series compensated lines. To damp the measurement, a shorter window DFT based mimic filter was developed. To reconstruct the damped measurement and achieve further damping, a reiterative scheme was then proposed. A recursive form was developed to reduce the computational burden. It was concluded that the algorithm significantly reduced the time needed to obtain the accurate fundamental phasor and performed better than the conventional DFT algorithm. 9 refs., 14 figs.
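    The fundamental-phasor estimate that the mimic/DFT scheme refines can be written down directly. This minimal full-cycle DFT sketch (synthetic 50 Hz waveform with hypothetical amplitude and phase) recovers the phasor from one cycle of samples; it is exactly this estimate that a decaying DC offset or subsynchronous component corrupts, motivating the filters in the paper:

```python
import numpy as np

def dft_phasor(samples):
    """Full-cycle DFT estimate of the fundamental phasor from exactly one
    cycle of N samples: X1 = (2/N) * sum x[n] * exp(-j*2*pi*n/N)."""
    n_samp = len(samples)
    n = np.arange(n_samp)
    return (2.0 / n_samp) * np.sum(samples * np.exp(-2j * np.pi * n / n_samp))

# One cycle of a 50 Hz current: amplitude 10, phase 30 degrees, 32 samples/cycle.
N = 32
t = np.arange(N) / N
x = 10.0 * np.cos(2 * np.pi * t + np.radians(30))
X = dft_phasor(x)
print(abs(X), np.degrees(np.angle(X)))  # ≈ 10.0, ≈ 30.0
```

On a pure sinusoid the estimate is exact; adding a decaying DC term to `x` biases both magnitude and angle, which is what the shorter-window mimic filter is designed to damp before the DFT is applied.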

  19. Study of hydrogen in coals, polymers, oxides, and muscle water by nuclear magnetic resonance; extension of solid-state high-resolution techniques. [Hydrogen molybdenum bronze

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, L.M.

    1981-10-01

    Nuclear magnetic resonance (NMR) spectroscopy has been an important analytical and physical research tool for several decades. One area of NMR which has undergone considerable development in recent years is high resolution NMR of solids. In particular, high resolution solid state ¹³C NMR spectra exhibiting features similar to those observed in liquids are currently achievable using sophisticated pulse techniques. The work described in this thesis develops analogous methods for high resolution ¹H NMR of rigid solids. Applications include characterization of hydrogen aromaticities in fossil fuels, and studies of hydrogen in oxides and bound water in muscle.

  20. [A Terahertz Spectral Database Based on Browser/Server Technique].

    Science.gov (United States)

    Zhang, Zhuo-yong; Song, Yue

    2015-09-01

    With the solution of key scientific and technical problems and the development of instrumentation, the application of terahertz technology in various fields has received more and more attention. Owing to its unique advantages, terahertz technology has been showing a broad future in the fields of fast, non-damaging detection, as well as many other fields. Terahertz technology combined with other complementary methods can be used to cope with many difficult practical problems which could not be solved before. One of the critical points for further development of practical terahertz detection methods is a good and reliable terahertz spectral database. We recently developed a browser/server (B/S) based terahertz spectral database. We designed the main structure and main functions to fulfill practical requirements. The terahertz spectral database now includes more than 240 items, and the spectral information was collected from three sources: (1) collection and citation from other terahertz spectral databases; (2) collection from published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the terahertz spectral database developed in our laboratory. One of the key functions of this THz database is the calculation of optical parameters. Some optical parameters, including the absorption coefficient, refractive index, etc., can be calculated from the input THz time-domain spectra. The other main functions and searching methods of the browser/server-based terahertz spectral database are also discussed. The database search system provides users with convenient functions including user registration, inquiry, display of spectral figures and molecular structures, spectral matching, etc. The THz database system provides an on-line searching function for registered users. Registered users can compare the input THz spectrum with the spectra of the database, according to
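    The optical-parameter calculation such a database offers can be sketched for the standard thick-sample transmission case: the phase of the sample/reference transfer function yields the refractive index and its magnitude the absorption coefficient. All numbers below (thickness, index, pulse shapes) are synthetic stand-ins for measured traces, and etalon echoes are ignored:

```python
import numpy as np

c0 = 3.0e8        # speed of light, m/s
d = 1.0e-3        # sample thickness: 1 mm (hypothetical)

# Synthetic time-domain traces standing in for measured THz pulses: the
# "sample" is the reference delayed by (n-1)*d/c0 and attenuated.
n_true, amp = 1.5, 0.8
dt = 1e-14
t = np.arange(4096) * dt
pulse = lambda t0: np.exp(-((t - t0) / 5e-13) ** 2)
ref = pulse(5e-12)
sam = amp * pulse(5e-12 + (n_true - 1) * d / c0)

# Transfer function H(w) = S(w)/R(w); phase -> refractive index,
# magnitude -> absorption coefficient (thick-sample approximation).
freq = np.fft.rfftfreq(len(t), dt)
band = (freq > 0.2e12) & (freq < 3e12)   # stay where the spectrum is strong
with np.errstate(invalid="ignore", divide="ignore"):
    H = np.fft.rfft(sam) / np.fft.rfft(ref)
w = 2 * np.pi * freq[band]
phase = -np.unwrap(np.angle(H))[band]
n = 1 + phase * c0 / (w * d)
alpha = -(2 / d) * np.log(np.abs(H)[band] * (n + 1) ** 2 / (4 * n))

i = np.argmin(np.abs(freq[band] - 1e12))  # inspect the result near 1 THz
print(round(float(n[i]), 3))              # ≈ 1.5
```

The recovered index matches the value built into the synthetic delay, illustrating how a database can compute and store n(ω) and α(ω) alongside the raw time-domain spectra.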

  1. Grid Based Techniques for Visualization in the Geosciences

    Science.gov (United States)

    Bollig, E. F.; Sowell, B.; Lu, Z.; Erlebacher, G.; Yuen, D. A.

    2005-12-01

    As experiments and simulations in the geosciences grow larger and more complex, it has become increasingly important to develop methods of processing and sharing data in a distributed computing environment. In recent years, the scientific community has shown growing interest in exploiting the powerful assets of Grid computing to this end, but the complexity of the Grid has prevented many scientists from converting their applications and embracing this possibility. We are investigating methods for development and deployment of data extraction and visualization services across the NaradaBrokering [1] Grid infrastructure. With the help of gSOAP [2], we have developed a series of C/C++ services for wavelet transforms, earthquake clustering, and basic 3D visualization. We will demonstrate the deployment and collaboration of these services across a network of NaradaBrokering nodes, concentrating on the challenges faced in inter-service communication, service/client division, and particularly web service visualization. Renderings in a distributed environment can be handled in three ways: 1) the data extraction service computes and renders everything locally and sends results to the client as a bitmap image, 2) the data extraction service sends results to a separate visualization service for rendering, which in turn sends results to a client as a bitmap image, and 3) the client itself renders images locally. The first two options allow for large visualizations in a distributed and collaborative environment, but limit interactivity of the client. To address this problem we are investigating the advantages of the JOGL OpenGL library [3] to perform renderings on the client side using the client's hardware for increased performance. We will present benchmarking results to ascertain the relative advantage of the three aforementioned techniques as a function of data size and visualization task. 
[1] The NaradaBrokering Project, http://www.naradabrokering.org [2] gSOAP: C/C++ Web

  2. An RSA-Based Leakage-Resilient Authenticated Key Exchange Protocol Secure against Replacement Attacks, and Its Extensions

    Science.gov (United States)

    Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki

    Secure channels can be realized by an authenticated key exchange (AKE) protocol that generates authenticated session keys between the involved parties. In [32], Shin et al. proposed a new kind of AKE (RSA-AKE) protocol whose goal is to provide high efficiency and security against leakage of stored secrets as much as possible. Let us consider more powerful attacks where an adversary completely controls the communications and the stored secrets (the latter is denoted by “replacement” attacks). In this paper, we first show that the RSA-AKE protocol [32] is no longer secure against such an adversary. The main contributions of this paper are as follows: (1) we propose an RSA-based leakage-resilient AKE (RSA-AKE2) protocol that is secure against active attacks as well as replacement attacks; (2) we prove that the RSA-AKE2 protocol is secure against replacement attacks based on number theory results; (3) we show that it is provably secure in the random oracle model, by showing the reduction to the RSA one-wayness, under an extended model that covers active attacks and replacement attacks; (4) in terms of efficiency, the RSA-AKE2 protocol is comparable to [32] in the sense that the client needs to compute only one modular multiplication with pre-computation; and (5) we also discuss extensions of the RSA-AKE2 protocol for several security properties (i.e., synchronization of stored secrets, privacy of client and solution to server compromise-impersonation attacks).

  3. A DCT And SVD based Watermarking Technique To Identify Tag

    OpenAIRE

    Ji, Ke; Lin, Jianbiao; Li, Hui; Wang, Ao; Tang, Tianjing

    2015-01-01

    With the rapid development of multimedia, the security of multimedia content has drawn increasing concern. Digital watermarking is an effective way to protect copyright. The watermark must generally be hidden so that it does not affect the quality of the original image. In this paper, a novel method based on the discrete cosine transform (DCT) and singular value decomposition (SVD) is proposed. In the proposed method, we decompose the image into 8*8 blocks, next we use the DCT to get the transformed block, then we c...

  4. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems.... On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described...

  5. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, due to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied for islanding detection of distributed generation. Moreover, the paper compares the accuracies of computational intelligence based techniques with those of existing techniques, providing useful information for industry and utility researchers to determine the best method for their respective systems.

  6. Image processing technique based on image understanding architecture

    Science.gov (United States)

    Kuvychko, Igor

    2000-12-01

    The effectiveness of image applications depends directly on their ability to resolve ambiguity and uncertainty in real images. That requires tight integration of low-level image processing with high-level knowledge-based reasoning, which is the essence of the image understanding problem. This article presents a generic computational framework for the solution of the image understanding problem -- the Spatial Turing Machine. Instead of a tape of symbols, it works with hierarchical networks dually represented as discrete and continuous structures. Dual representation provides a natural transformation of continuous image information into discrete structures, making it available for analysis. Such structures are data and algorithms at the same time and can perform the graph and diagrammatic operations that are the basis of intelligence. They can create derivative structures that play the role of context, or 'measurement device,' giving the ability to analyze and to run top-down algorithms. Symbols naturally emerge there, and symbolic operations work in combination with new simplified methods of computational intelligence. That makes images and scenes self-describing, and provides flexible ways of resolving uncertainty. Classification of images truly invariant to any transformation can be done by matching their derivative structures. The proposed architecture does not require supercomputers, opening the way to new image technologies.

  7. Design, Implementation and Demonstration of the Android Version of Agro-Technique Extension Information Platform

    Institute of Scientific and Technical Information of China (English)

    尹国伟; 王文生; 孙志国; 王曦光

    2015-01-01

    The Android version of the agro-technique extension information platform exploits the characteristics of mobile intelligent devices to provide agro-technique extension workers with targeted functions for serving farmers, submitting information, and sharing experience, further enriching the informatization of agro-technique extension work. The Android mobile application extends and complements the Web version; it organically combines function blocks for character processing, image processing, audio processing, terminal control and communication interaction to meet extension workers' needs. In the demonstration phase, the selection of demonstration sites and demonstration extension workers, the provisioning of mobile terminals, the setting of communication data allowances, training in the use of the mobile application, and response to user feedback are all indispensable links that deserve sufficient attention.

  8. On-line hydrogen-isotope measurements of organic samples using elemental chromium: an extension for high temperature elemental-analyzer techniques.

    Science.gov (United States)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B; Meijer, Harro A J; Brand, Willi A; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ(2)H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ(2)H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while
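
    The VSMOW-SLAP normalization mentioned above is a two-point linear rescaling of raw measurements against the two reference waters. A minimal sketch follows; the accepted δ(2)H value of SLAP relative to VSMOW is -427.5 mUr, while the example raw readings are hypothetical.

```python
def normalize_d2h(delta_raw, raw_vsmow, raw_slap,
                  true_vsmow=0.0, true_slap=-427.5):
    """Two-point normalization of a raw delta-2H value (in mUr) to the
    VSMOW-SLAP scale, given the raw measured values obtained for the
    two reference waters in the same analytical sequence."""
    slope = (true_slap - true_vsmow) / (raw_slap - raw_vsmow)
    return true_vsmow + slope * (delta_raw - raw_vsmow)

# Hypothetical example: the references read +2.0 and -420.0 mUr raw.
print(round(normalize_d2h(-100.0, 2.0, -420.0), 2))  # -103.33
```

The sealed silver-tube reference liquids described in the abstract serve exactly this purpose: they anchor `raw_vsmow` and `raw_slap` without evaporative loss.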

  9. Developing Visualization Techniques for Semantics-based Information Networks

    Science.gov (United States)

    Keller, Richard M.; Hall, David R.

    2003-01-01

    Information systems incorporating complex network-structured information spaces with a semantic underpinning - such as hypermedia networks, semantic networks, topic maps, and concept maps - are being deployed to solve some of NASA's critical information management problems. This paper describes some of the human interaction and navigation problems associated with complex semantic information spaces and describes a set of new visual interface approaches to address these problems. A key strategy is to leverage the semantic knowledge represented within these information spaces to construct abstractions and views that will be meaningful to the human user. Human-computer interaction methodologies will guide the development and evaluation of these approaches, which will benefit deployed NASA systems and also apply to information systems based on the emerging Semantic Web.

  10. Quadrant Based WSN Routing Technique By Shifting Of Origin

    Directory of Open Access Journals (Sweden)

    Nandan Banerji

    2013-04-01

    Full Text Available A sensor is a miniaturized, low-powered (typically battery-powered), limited-storage device that can sense natural phenomena and convert them into electrical energy, or vice versa, using a transduction process. A Wireless Sensor Network (WSN) is a wireless network built from such sensors, which communicate with each other over a wireless medium. They can be deployed in environments that are inaccessible to humans or difficult to reach. There are vast applications in the automated world, such as robotics, avionics, oceanographic study, space and satellites. The routing of a packet from a source node to a destination must be efficient in terms of energy, communication overhead and the number of intermediate hops. The proposed scheme helps route a packet through fewer intermediate nodes, as neighbors are selected based on their quadrant position.
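
    The quadrant-based next-hop selection described above can be sketched as follows. The origin is shifted to the current node, neighbors falling in the same quadrant as the sink are kept, and one of them is forwarded to; the tie-breaking rule of picking the in-quadrant neighbor closest to the sink is an assumption for illustration, not necessarily the paper's exact rule.

```python
def quadrant(origin, point):
    """Quadrant (1-4) of `point` after shifting the origin to `origin`."""
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    if dx >= 0 and dy >= 0:
        return 1
    if dx < 0 and dy >= 0:
        return 2
    if dx < 0:
        return 3
    return 4

def next_hop(current, neighbors, sink):
    """Among neighbors lying in the same quadrant (relative to the
    current node) as the sink, pick the one closest to the sink."""
    q = quadrant(current, sink)
    candidates = [n for n in neighbors if quadrant(current, n) == q]
    if not candidates:
        return None
    return min(candidates,
               key=lambda n: (n[0] - sink[0]) ** 2 + (n[1] - sink[1]) ** 2)

hop = next_hop((0, 0), [(1, 1), (-2, 3), (4, 0)], (5, 5))
print(hop)  # (4, 0): in quadrant 1 and nearest to the sink
```

Restricting candidates to one quadrant prunes roughly three quarters of the neighbor set at each hop, which is where the reduction in intermediate nodes comes from.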

  11. Image restoration techniques based on fuzzy neural networks

    Institute of Scientific and Technical Information of China (English)

    刘普寅; 李洪兴

    2002-01-01

    By establishing some suitable partitions of input and output spaces, a novel fuzzy neural network (FNN), called the selection-type FNN, is developed. Such a system is a multilayer feedforward neural network, which can be a universal approximator with maximum norm. Based on a family of fuzzy inference rules that are of real senses, a simple and useful inference-type FNN is constructed. As a result, the fusion of the selection-type FNN and the inference-type FNN yields a novel filter, the FNN filter. It is simple in structure, and it is convenient to design the learning algorithm for its structural parameters. Further, the FNN filter can efficiently suppress impulse noise superimposed on an image while simultaneously preserving fine image structure. Some examples are simulated to confirm the advantages of the FNN filter over other filters, such as the median filter and the adaptive weighted fuzzy mean (AWFM) filter, in the suppression of noise and the preservation of image structure.

  12. Techniques of Image Processing Based on Artificial Neural Networks

    Institute of Scientific and Technical Information of China (English)

    LI Wei-qing; WANG Qun; WANG Cheng-biao

    2006-01-01

    This paper presents an online quality inspection system based on artificial neural networks. Chromatism classification and edge detection are two difficult problems in glass steel surface quality inspection; two artificial neural networks were built to solve them. The first solved chromatism classification: hue, saturation and the appearing probabilities of the three colors occurring most frequently in the color histogram were selected as input parameters, and the number of output nodes can be adjusted as requirements change. The second solved edge detection: edge detection of a gray-scale image can be performed with neural networks trained on a binary image, which avoids the difficulty that the number of required training samples would be too large if gray-scale images were used directly as training samples. This system can be applied not only to glass steel fault inspection but also to online quality inspection and classification of other products.

  13. Symbolic document image compression based on pattern matching techniques

    Science.gov (United States)

    Shiah, Chwan-Yi; Yen, Yun-Sheng

    2011-10-01

    In this paper, a novel compression algorithm for Chinese document images is proposed. Initially, documents are segmented into readable components such as characters and punctuation marks. Similar patterns within the text are found by shape context matching and grouped to form a set of prototype symbols. Text redundancies can be removed by replacing repeated symbols by their corresponding prototype symbols. To keep the compression visually lossless, we use a multi-stage symbol clustering procedure to group similar symbols and to ensure that there is no visible error in the decompressed image. In the encoding phase, the resulting data streams are encoded by adaptive arithmetic coding. Our results show that the average compression ratio is better than the international standard JBIG2 and the compressed form of a document image is suitable for a content-based keyword searching operation.

  14. Ultrasound-based technique for intrathoracic surgical guidance

    Science.gov (United States)

    Huang, Xishi; Hill, Nicholas A.; Peters, Terry M.

    2005-04-01

    Image-guided procedures within the thoracic cavity require accurate registration of a pre-operative virtual model to the patient. Currently, surface landmarks are used for thoracic cavity registration; however, this approach is unreliable due to skin movement relative to the ribs. An alternative method for providing surgeons with image feedback in the operating room is to integrate images acquired during surgery with images acquired pre-operatively. This integration process is required to be automatic, fast, accurate and robust; however inter-modal image registration is difficult due to the lack of a direct relationship between the intensities of the two image sets. To address this problem, Computed Tomography (CT) was used to acquire pre-operative images and Ultrasound (US) was used to acquire peri-operative images. Since bone has a high electron density and is highly echogenic, the rib cage is visualized as a bright white boundary in both datasets. The proposed approach utilizes the ribs as the basis for an intensity-based registration method -- mutual information. We validated this approach using a thorax phantom. Validation results demonstrate that this approach is accurate and shows little variation between operators. The fiducial registration error, the registration error between the US and CT images, was < 1.5mm. We propose this registration method as a basis for precise tracking of minimally invasive thoracic procedures. This method will permit the planning and guidance of image-guided minimally invasive procedures for the lungs, as well as for both catheter-based and direct trans-mural interventions within the beating heart.
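
    The mutual-information measure used above for CT-US registration can be estimated from the joint intensity histogram of the two images. A minimal sketch on 1-D intensity sequences (real registration pipelines also quantize intensities and optimize over spatial transforms, which is omitted here):

```python
import math
from collections import Counter

def mutual_information(a, b):
    """Mutual information (in bits) between two equal-length intensity
    sequences, estimated from their joint histogram."""
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    mi = 0.0
    for (x, y), c in pab.items():
        pxy = c / n  # joint probability of intensity pair (x, y)
        mi += pxy * math.log2(pxy * n * n / (pa[x] * pb[y]))
    return mi

# MI of an image with itself equals its entropy: here H = 1 bit.
img = [0, 0, 1, 1]
print(mutual_information(img, img))  # 1.0
```

Registration then searches for the rigid transform of the US volume that maximizes this quantity against the CT volume; the echogenic rib boundaries give the measure a sharp peak at the correct alignment.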

  15. Risk-based evaluation of allowed outage time and surveillance test interval extensions for nuclear power plants

    International Nuclear Information System (INIS)

    The main goal of this work is, through the use of Probabilistic Safety Analysis (PSA), to evaluate Technical Specification (TS) Allowed Outage Time (AOT) and Surveillance Test Interval (STI) extensions for the Angra 1 nuclear power plant. PSA has been incorporated as an additional tool required as part of the NPP licensing process. The risk measure used in this work is the Core Damage Frequency (CDF), obtained from the Angra 1 PSA Level 1. AOT and STI extensions are calculated for the Safety Injection System (SIS), Service Water System (SAS) and Auxiliary Feedwater System (AFS) through the use of the SAPHIRE code. In order to compensate for the risk increase caused by the extensions, compensatory measures such as testing the redundant train prior to entering maintenance and a staggered test strategy are proposed. Results have shown that the proposed AOT extensions are acceptable for the SIS and SAS with the implementation of compensatory measures, but the proposed AOT extension is not acceptable for the AFS. The STI extensions are acceptable for all three systems. (author)
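
    The risk impact of an AOT extension is commonly quantified as the incremental conditional core damage probability (ICCDP): the CDF increase while the equipment is out of service, integrated over the outage duration. A minimal sketch of this standard measure (not necessarily the exact formulation used in the Angra 1 study, and with illustrative numbers):

```python
def incremental_risk(cdf_baseline, cdf_conditional, aot_hours):
    """ICCDP for a single AOT: the conditional CDF increase while the
    equipment is unavailable, times the outage duration.
    CDF values are per reactor-year; AOT is in hours."""
    return (cdf_conditional - cdf_baseline) * aot_hours / 8760.0

# Illustrative: baseline 2.0e-5/yr rises to 8.0e-5/yr with one train
# out of service, over a 72-hour AOT.
risk = incremental_risk(2.0e-5, 8.0e-5, 72.0)
print(f"{risk:.2e}")  # 4.93e-07
```

An extended AOT is then judged acceptable when this increment stays below a regulatory threshold, or when compensatory measures (such as the pre-maintenance test of the redundant train mentioned above) reduce `cdf_conditional` enough to offset the longer exposure.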

  16. A Simulated System for Traffic Signal Management Based on Integrating GIS & WSN Techniques

    Directory of Open Access Journals (Sweden)

    Ahmed S. Elmotelb

    2016-01-01

    Full Text Available Traffic signal management systems (TSMS) are traffic systems based on cameras, infrared sensors and satellite systems. Such systems have lacked the ability to collect and support real-time data. This paper proposes a solution to the traffic signal management problem using a combined technique that integrates GIS information with WSN-based techniques. This combination provides appropriate techniques and tools to enhance the capabilities of traffic jam prevention, early detection, efficient surveillance, efficient spread control, and fast termination of possible hazards. Consequently, this work proposes a new methodology that merges WSN and GIS techniques to produce valuable information for traffic signal management purposes.

  17. Arithmetic and Frequency Filtering Methods of Pixel-Based Image Fusion Techniques

    CERN Document Server

    Al-Wassai, Firouz Abdullah; Al-Zuky, Ali A

    2011-01-01

    In remote sensing, image fusion is a useful tool for fusing high spatial resolution panchromatic images (PAN) with lower spatial resolution multispectral images (MS) to create a high spatial resolution fused multispectral image (F) while preserving the spectral information in the multispectral image (MS). Many PAN-sharpening, or pixel-based image fusion, techniques have been developed to enhance the spatial resolution while preserving the spectral properties of the MS. This paper undertakes a study of image fusion using two types of pixel-based image fusion techniques: arithmetic combination and frequency filtering methods. The first type includes the Brovey Transform (BT), Color Normalized Transformation (CN) and the Multiplicative Method (MLT). The second type includes the High-Pass Filter Additive Method (HPFA), the High-Frequency-Addition Method (HFA), the High Frequency Modulation Method (HFM) and the Wavelet transform-base...
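
    Of the arithmetic-combination methods listed, the Brovey Transform is the simplest to state. One common per-pixel formulation (a sketch, not necessarily the exact variant studied in the paper) scales each MS band by the ratio of the PAN intensity to the sum of the MS bands, injecting PAN spatial detail while preserving the ratios between bands:

```python
def brovey(ms_bands, pan):
    """Brovey Transform fusion for one pixel: each fused band is
    F_i = MS_i * PAN / (MS_1 + ... + MS_n)."""
    total = sum(ms_bands)
    return [b * pan / total for b in ms_bands]

# Illustrative pixel: three MS band values and one PAN value.
fused = brovey([40.0, 60.0, 100.0], 220.0)
print(fused)  # [44.0, 66.0, 110.0] -- band ratios 2:3:5 preserved
```

This ratio preservation is exactly the "spectral property preservation" criterion by which the paper compares the methods; the frequency-filtering methods instead add or modulate a high-pass version of the PAN image.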

  18. Measurement of particle size based on digital imaging technique

    Institute of Scientific and Technical Information of China (English)

    CHEN Hong; TANG Hong-wu; LIU Yun; WANG Hao; LIU Gui-ping

    2013-01-01

    To improve the analysis methods for measuring sediment particles with a wide size distribution and irregular shapes, a sediment particle image measurement and analysis system is proposed, together with an algorithm for extracting the optimal threshold based on the peak values of the gray histogram. Recording the pixels of the sediment particles by labeling them, the algorithm can effectively separate the sediment particle images from the background, using equivalent pixel circles with the same diameters to represent the sediment particles. Compared with a laser analyzer for the case of blue plastic sands, the measurement results of the system are reasonably similar. The errors are mainly due to the small size of the particles and the limitations of the apparatus. The measurement accuracy can be improved by increasing the resolution of the Charge-Coupled Device (CCD) camera. This image-based analysis method can provide technical support for the rapid measurement of sediment particle size and its distribution.
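
    The equivalent-pixel-circle idea, representing each labeled particle region by the circle of equal area, reduces to the formula d = 2*sqrt(A/pi). A sketch with a hypothetical pixel scale (the segmentation and labeling steps are assumed already done):

```python
import math

def equivalent_diameter(pixel_count, mm_per_pixel):
    """Diameter (mm) of the circle with the same area as a labeled
    particle region, converting the pixel count to physical units."""
    area_mm2 = pixel_count * mm_per_pixel ** 2
    return 2.0 * math.sqrt(area_mm2 / math.pi)

# A labeled region of 314 pixels imaged at 0.05 mm/pixel comes out
# at very nearly 1 mm equivalent diameter.
d = equivalent_diameter(314, 0.05)
print(round(d, 3))
```

Since the diameter scales with `mm_per_pixel`, a higher-resolution CCD directly tightens the size estimate, which is the accuracy-improvement route the abstract points to.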

  19. Damage identification in beams by a response surface based technique

    Directory of Open Access Journals (Sweden)

    Teidj S.

    2014-01-01

    Full Text Available In this work, identification of damage in uniform homogeneous metallic beams was considered through the propagation of non-dispersive elastic torsional waves. The proposed damage detection procedure consists of the following sequence. Given a localized torque excitation in the form of a short half-sine pulse, the first step is to calculate the transient solution of the resulting torsional wave. In practice, this torque could be generated by asymmetric laser irradiation of the beam surface. Then, a localized defect, assumed to be characterized by an abrupt reduction of the beam section area with a given height and extent, was placed at a known location of the beam. Next, the response in terms of transverse section rotation rate was obtained at a point situated beyond the defect, where the sensor was positioned; in practice, this could use laser vibrometry. A parametric study was then conducted using a full factorial design-of-experiments table and numerical simulations based on a finite difference characteristic scheme. This enabled the derivation of a response surface model that was shown to represent adequately the response of the system in terms of the following factors: defect extent and severity. The final step was solving the inverse problem in order to identify the defect characteristics from the measurements.

  20. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  1. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  2. Laser-based techniques for living cell pattern formation

    Science.gov (United States)

    Hopp, Béla; Smausz, Tomi; Papdi, Bence; Bor, Zsolt; Szabó, András; Kolozsvári, Lajos; Fotakis, Costas; Nógrádi, Antal

    2008-10-01

    In the production of biosensors or artificial tissues, a basic step is the immobilization of living cells along a required pattern. In this paper the ability of some promising laser-based methods to influence the interaction between cells and various surfaces is presented. In the first set of experiments, laser-induced patterned photochemical modification of polymer foils was used to achieve guided adherence and growth of cells on the modified areas: (a) polytetrafluoroethylene was irradiated with an ArF excimer laser (λ=193 nm, FWHM=20 ns, F=9 mJ/cm2) in the presence of triethylene tetramine liquid photoreagent; (b) a thin carbon layer was produced by KrF excimer laser (λ=248 nm, FWHM=30 ns, F=35 mJ/cm2) irradiation of a polyimide surface to influence cell adherence. It was found that the incorporation of amine groups into the PTFE polymer chain in place of the fluorine atoms can either promote or prevent the adherence of living cells on the treated surfaces (depending on the cell type), while the laser-generated carbon layer on the polyimide surface did not effectively improve adherence. Our attempts to influence cell adherence by morphological modifications created by ArF laser irradiation of a polyethylene terephthalate surface showed a dependence on surface roughness: this method was effective only when the Ra roughness parameter of the developed structure did not exceed 0.1 micrometer. Pulsed laser deposition with femtosecond KrF excimer lasers (F=2.2 J/cm2) was effectively used to deposit structured thin films of biomaterials (endothelial cell growth supplement and collagen embedded in a starch matrix) to promote the adherence and growth of cells. These results present evidence that some surfaces can be successfully altered to induce guided cell growth.

  3. The efficacy and toxicity of individualized intensity-modulated radiotherapy based on the tumor extension patterns of nasopharyngeal carcinoma

    Science.gov (United States)

    Zhou, Guan-Qun; Guo, Rui; Zhang, Fan; Zhang, Yuan; Xu, Lin; Zhang, Lu-Lu; Lin, Ai-Hua; Ma, Jun; Sun, Ying

    2016-01-01

    Background: To evaluate the efficacy and toxicity of intensity-modulated radiotherapy (IMRT) using individualized clinical target volumes (CTVs) based on the loco-regional extension patterns of nasopharyngeal carcinoma (NPC). Methods: From December 2009 to February 2012, 220 patients with histologically proven, non-disseminated NPC were prospectively treated with IMRT according to an individualized delineation protocol. CTV1 encompassed the gross tumor volume, the entire nasopharyngeal mucosa and structures within the pharyngobasilar fascia with a margin. CTV2 encompassed bilateral high-risk anatomic sites and downstream anatomic sites adjacent to the primary tumor, the bilateral retropharyngeal regions, and levels II, III and Va; prophylactic irradiation was given to one or two levels beyond clinically involved lymph node levels. Clinical outcomes and toxicities were evaluated. Results: Median follow-up was 50.8 (range, 1.3–68.0) months; four-year local relapse-free, regional relapse-free, distant metastasis-free, disease-free and overall survival rates were 94.7%, 97.0%, 91.7%, 87.2% and 91.9%, respectively. Acute severe (≥ grade 3) mucositis, dermatitis and xerostomia were observed in 27.6%, 3.6% and zero patients, respectively. At 1 year, xerostomia was mild, with frequencies of Grade 0, 1, 2 and 3 xerostomia of 27.9%, 63.3%, 8.3% and 0.5%, respectively. Conclusions: IMRT using individualized CTVs provided high rates of local and regional control and a favorable toxicity profile in NPC. The individualized CTV delineation strategy is a promising approach that may effectively avoid unnecessary or missed irradiation, and deserves further optimization to define more precise individualized CTVs. PMID:26980744

  4. Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.

    Science.gov (United States)

    Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José

    2016-04-01

    The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators-e.g. health appliances, routers-in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators-e.g. smartphones, tablets-, the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), IHE profiles related with these applications and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value.

  5. A content-based image retrieval method for optical colonoscopy images based on image recognition techniques

    Science.gov (United States)

    Nosato, Hirokazu; Sakanashi, Hidenori; Takahashi, Eiichi; Murakawa, Masahiro

    2015-03-01

    This paper proposes a content-based image retrieval method for optical colonoscopy images that can find images similar to ones being diagnosed. Optical colonoscopy is a method of direct observation for colons and rectums to diagnose bowel diseases. It is the most common procedure for screening, surveillance and treatment. However, diagnostic accuracy for intractable inflammatory bowel diseases, such as ulcerative colitis (UC), is highly dependent on the experience and knowledge of the medical doctor, because there is considerable variety in the appearances of colonic mucosa within inflammations with UC. In order to solve this issue, this paper proposes a content-based image retrieval method based on image recognition techniques. The proposed retrieval method can find similar images from a database of images diagnosed as UC, and can potentially furnish the medical records associated with the retrieved images to assist the UC diagnosis. Within the proposed method, color histogram features and higher order local auto-correlation (HLAC) features are adopted to represent the color information and geometrical information of optical colonoscopy images, respectively. Moreover, considering various characteristics of UC colonoscopy images, such as vascular patterns and the roughness of the colonic mucosa, we also propose an image enhancement method to highlight the appearances of colonic mucosa in UC. In an experiment using 161 UC images from 32 patients, we demonstrate that our method improves the accuracy of retrieving similar UC images.
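
    The color-histogram features adopted above are the simplest of the two descriptor families named. A minimal sketch of a quantized, normalized color histogram over (r, g, b) pixels (the bin count and the flat pixel list are illustrative assumptions; the HLAC features and the mucosa-enhancement step are not reproduced here):

```python
def color_histogram(pixels, bins_per_channel=4):
    """Normalized color histogram feature vector for a list of (r, g, b)
    pixels with 8-bit channels, quantized to a few bins per channel."""
    step = 256 // bins_per_channel
    hist = [0] * bins_per_channel ** 3
    for r, g, b in pixels:
        # Flat index of the 3-D bin this pixel falls into.
        idx = ((r // step) * bins_per_channel
               + g // step) * bins_per_channel + b // step
        hist[idx] += 1
    n = len(pixels)
    return [c / n for c in hist]

# Two near-red pixels land in one bin; one blue pixel in another.
feat = color_histogram([(255, 0, 0), (250, 10, 5), (0, 0, 255)])
print(sum(feat), max(feat))  # sums to 1.0; dominant bin holds 2/3
```

Retrieval then ranks database images by a distance (e.g. histogram intersection or chi-square) between such feature vectors, so visually similar mucosa produces nearby vectors.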

  6. Extension and Utilization of Seedling Breeding Technique for Banana

    Institute of Scientific and Technical Information of China (English)

    王永壮; 符运柳; 刘以道; 覃和业

    2011-01-01

    Based on many years of practical experience in banana seedling production, this paper introduces techniques for cultivating healthy, high-quality grade-one and grade-two banana tissue-culture seedlings, in order to promote the healthy development of the banana industry.

  7. Whole genome sequencing-based characterization of extensively drug resistant (XDR) strains of Mycobacterium tuberculosis from Pakistan

    KAUST Repository

    Hasan, Zahra

    2015-03-01

    Objectives: The global increase in drug resistance in Mycobacterium tuberculosis (MTB) strains increases the focus on improved molecular diagnostics for MTB. Extensively drug-resistant (XDR) TB is caused by MTB strains resistant to rifampicin, isoniazid, fluoroquinolone and aminoglycoside antibiotics. Resistance to anti-tuberculous drugs has been associated with single nucleotide polymorphisms (SNPs) in particular MTB genes. However, there is regional variation between MTB lineages and the SNPs associated with resistance. Therefore, there is a need to identify common resistance-conferring SNPs so that effective molecular-based diagnostic tests for MTB can be developed. This study used whole genome sequencing (WGS) to characterize 37 XDR MTB isolates from Pakistan and investigated SNPs related to drug resistance. Methods: XDR-TB strains were selected. DNA was extracted from MTB strains, and samples underwent WGS with 76-base paired-end fragments using Illumina paired-end HiSeq2000 technology. Raw sequence data were mapped uniquely to the H37Rv reference genome. The mappings allowed SNPs and small indels to be called using SAMtools/BCFtools. Results: This study found that in all XDR strains, rifampicin resistance was attributable to SNPs in the rpoB RDR region. Isoniazid resistance-associated mutations were primarily related to katG codon 315, followed by inhA S94A. Fluoroquinolone resistance was attributable to gyrA codons 91-94 in most strains, while one had no SNPs in either gyrA or gyrB. Aminoglycoside resistance was mostly associated with SNPs in rrs, except in 6 strains. Ethambutol-resistant strains had embB codon 306 mutations, but many strains did not have this mutation. The SNPs were compared with those present in commercial assays such as LiPA Hain MDRTBsl, and the sensitivity of the assays for these strains was evaluated. Conclusions: If common drug resistance associated with SNPs evaluated the concordance between phenotypic and
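
The drug-to-locus associations reported above can be restated as a simple lookup table. The sketch below only re-encodes the markers named in this abstract; the key format is an illustrative assumption, and real genotype-based resistance calling requires validated assays:

```python
# Resistance-associated loci named in the abstract, keyed as (gene, site).
# Illustrative only -- not a diagnostic rule set.
RESISTANCE_MARKERS = {
    ("rpoB", "RDR"): "rifampicin",
    ("katG", "codon 315"): "isoniazid",
    ("inhA", "S94A"): "isoniazid",
    ("gyrA", "codons 91-94"): "fluoroquinolone",
    ("rrs", "SNP"): "aminoglycoside",
    ("embB", "codon 306"): "ethambutol",
}

def drugs_flagged(observed):
    """Map a strain's observed markers to the drugs they flag as resistant."""
    return {RESISTANCE_MARKERS[m] for m in observed if m in RESISTANCE_MARKERS}
```

For example, a strain carrying both a katG codon 315 mutation and an rpoB RDR mutation would be flagged for both first-line drugs.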

  8. Wood lens design philosophy based on a binary additive manufacturing technique

    Science.gov (United States)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  9. Differential Cyclic Voltammetry - a Novel Technique for Selective and Simultaneous Detection using Redox Cycling Based Sensors

    OpenAIRE

    Odijk, M.; Wiedemair, J.; Megen, M.J.J; Olthuis, W.; Van den Berg, A.

    2010-01-01

    Redox cycling (RC) is an effect that is used to amplify electrochemical signals. However, traditional techniques such as cyclic voltammetry (CV) do not provide clear insight for a mixture of multiple redox couples while RC is applied. Thus, we have developed a new measurement technique which delivers electrochemical spectra of all reversible redox couples present based on concentrations and standard potentials. This technique has been named differential cyclic voltammetry (DCV). We have fabri...

  10. MVClustViz: A Novice Yet Simple Multivariate Cluster Visualization Technique for Centroid-based Clusters

    OpenAIRE

    Sagar S. De; Minati Mishra; Satchidananda Dehuri

    2013-01-01

    In visual data mining, visualization of clusters is a challenging task. Although many techniques have already been developed, challenges remain in representing large volumes of data with multiple dimensions and overlapping clusters. In this paper, a multivariate cluster visualization technique (MVClustViz) is presented to visualize centroid-based clusters. The geographic projection technique supports multi-dimension, large-volume, and both crisp and fuzzy clusters visual...

  11. Intrusion Detection Systems Based on Artificial Intelligence Techniques in Wireless Sensor Networks

    OpenAIRE

    Nabil Ali Alrajeh; Lloret, J

    2013-01-01

    Intrusion detection system (IDS) is regarded as the second line of defense against network anomalies and threats. IDS plays an important role in network security. There are many techniques which are used to design IDSs for specific scenario and applications. Artificial intelligence techniques are widely used for threats detection. This paper presents a critical study on genetic algorithm, artificial immune, and artificial neural network (ANN) based IDSs techniques used in wireless sensor netw...

  12. INTELLIGENT CAR STYLING TECHNIQUE AND SYSTEM BASED ON A NEW AERODYNAMIC-THEORETICAL MODEL

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A car styling technique based on a new theoretical model of automotive aerodynamics is introduced, which is proved feasible and effective by wind tunnel tests. The development of a multi-module software system from this technique, including modules for knowledge processing, referential styling and ANN-based aesthetic evaluation, capable of assisting car styling work in an intelligent way, is also presented and discussed.

  13. A new approach of binary addition and subtraction by non-linear material based switching technique

    Indian Academy of Sciences (India)

    Archan Kumar Das; Partha Partima Das; Sourangshu Mukhopadhyay

    2005-02-01

    Here, we present a new proposal for binary addition as well as subtraction in the all-optical domain through the exploitation of a proper non-linear material-based switching technique. In this communication, the authors extend this technique to both an adder and a subtractor accommodating the spatial input encoding system.

  14. An extensively hydrolysed casein-based formula for infants with cows' milk protein allergy : tolerance/hypo-allergenicity and growth catch-up

    NARCIS (Netherlands)

    Dupont, Christophe; Hol, Jeroen; Nieuwenhuis, Edward E. S.

    2015-01-01

    Children with cows' milk protein allergy (CMPA) are at risk of insufficient length and weight gain, and the nutritional efficacy of hypo-allergenic formulas should be carefully assessed. In 2008, a trial assessed the impact of probiotic supplementation of an extensively hydrolysed casein-based formula.

  15. Maternal mortality in rural south Ethiopia: outcomes of community-based birth registration by health extension workers.

    Directory of Open Access Journals (Sweden)

    Yaliso Yaya

    Full Text Available Rural communities in low-income countries lack vital registrations to track birth outcomes. We aimed to examine the feasibility of community-based birth registration and to measure the maternal mortality ratio (MMR) in rural south Ethiopia. In 2010, health extension workers (HEWs) registered births and maternal deaths among 421,639 people in three districts (Derashe, Bonke, and Arba Minch Zuria). One nurse-supervisor per district provided administrative and technical support to HEWs. The primary outcomes were the feasibility of registering a high proportion of births and measuring MMR. The secondary outcome was the proportion of skilled birth attendance. We validated the completeness of the registry and the MMR by conducting a house-to-house survey in 15 randomly selected villages in Bonke. We registered 10,987 births (81.4% of the expected 13,492 births) with an annual crude birth rate of 32 per 1,000 population. The validation study showed that, of 2,401 births that occurred in the surveyed households within eight months of the initiation of the registry, 71.6% (1,718) were registered, with similar MMRs (474 vs. 439) between the registered and unregistered births. Overall, we recorded 53 maternal deaths; the MMR was 489 per 100,000 live births, and 83% (44 of 53) of maternal deaths occurred at home. Ninety percent (9,863) of births were at home, 4% (430) at health posts, 2.5% (282) at health centres, and 3.5% (412) in hospitals. The MMR increased if the male partners were illiterate (609 vs. 346; p = 0.051) and if the villages had no road access (946 vs. 410; p = 0.039). The validation helped to increase the registration coverage by 10% through feedback discussions. It is possible to obtain high-coverage birth registration and measure MMR in rural communities where a functional system of community health workers exists. The MMR was high in rural south Ethiopia, and most births and maternal deaths occurred at home.
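
The headline figure above follows the standard MMR definition, maternal deaths per 100,000 live births; a one-line helper makes the arithmetic explicit (the example numbers below are illustrative, since the abstract reports the ratio rather than the live-birth denominator):

```python
def maternal_mortality_ratio(maternal_deaths, live_births):
    """Maternal deaths per 100,000 live births."""
    return maternal_deaths / live_births * 100_000

# Illustrative: 5 maternal deaths among 1,000 live births -> MMR of 500.
example_mmr = maternal_mortality_ratio(5, 1_000)
```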

  16. AQA-PM: Extension of the Air-Quality Model For Austria with Satellite based Particulate Matter Estimates

    Science.gov (United States)

    Hirtl, Marcus; Mantovani, Simone; Krüger, Bernd C.; Triebnig, Gerhard; Flandorfer, Claudia

    2013-04-01

    Air quality is a key element for the well-being and quality of life of European citizens. Air pollution measurements and modeling tools are essential for assessment of air quality according to EU legislation. The responsibilities of ZAMG as the national weather service of Austria include the support of the federal states and the public in questions connected to the protection of the environment in the frame of advisory and counseling services as well as expert opinions. The Air Quality model for Austria (AQA) is operated at ZAMG in cooperation with the University of Natural Resources and Life Sciences in Vienna (BOKU) by order of the regional governments since 2005. AQA conducts daily forecasts of gaseous and particulate (PM10) air pollutants over Austria. In the frame of the project AQA-PM (funded by FFG), satellite measurements of the Aerosol Optical Thickness (AOT) and ground-based PM10-measurements are combined to highly-resolved initial fields using regression- and assimilation techniques. For the model simulations WRF/Chem is used with a resolution of 3 km over the alpine region. Interfaces have been developed to account for the different measurements as input data. The available local emission inventories provided by the different Austrian regional governments were harmonized and used for the model simulations. An episode in February 2010 is chosen for the model evaluation. During that month exceedances of PM10-thresholds occurred at many measurement stations of the Austrian network. Different model runs (only model/only ground stations assimilated/satellite and ground stations assimilated) are compared to the respective measurements. The goal of this project is to improve the PM10-forecasts for Austria with the integration of satellite based measurements and to provide a comprehensive product-platform.

  17. Measuring glioma volumes: A comparison of linear measurement based formulae with the manual image segmentation technique

    Directory of Open Access Journals (Sweden)

    Sanjeev A Sreenivasan

    2016-01-01

    Conclusions: Manual region-of-interest-based image segmentation is the standard technique for measuring glioma volumes. For routine clinical use, the simple formula v = abc/2 (or the formula for the volume of an ellipsoid) could be used as an alternative.
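
The v = abc/2 rule is the usual bedside approximation of the ellipsoid volume: with a, b, c as the three orthogonal diameters, the exact ellipsoid volume is (4/3)π(a/2)(b/2)(c/2) = πabc/6 ≈ 0.524·abc, so abc/2 undershoots it by roughly 4.5%. A short check:

```python
import math

def volume_abc_over_2(a, b, c):
    """Linear-measurement approximation used clinically: v = abc/2."""
    return a * b * c / 2

def volume_ellipsoid(a, b, c):
    """Exact ellipsoid volume, with a, b, c taken as the three diameters."""
    return math.pi * a * b * c / 6

# abc/2 sits slightly below the ellipsoid value (1/2 vs pi/6 ~ 0.5236):
ratio = volume_abc_over_2(3, 4, 5) / volume_ellipsoid(3, 4, 5)  # 3/pi ~ 0.955
```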

  18. Photoacoustic Techniques for Trace Gas Sensing Based on Semiconductor Laser Sources

    Directory of Open Access Journals (Sweden)

    Vincenzo Spagnolo

    2009-12-01

    Full Text Available The paper provides an overview on the use of photoacoustic sensors based on semiconductor laser sources for the detection of trace gases. We review the results obtained using standard, differential and quartz enhanced photoacoustic techniques.

  19. Arrayed primer extension in the "array of arrays" format: a rational approach for microarray-based SNP genotyping

    DEFF Research Database (Denmark)

    Klitø, Niels G F; Tan, Qihua; Nyegaard, Mette;

    2007-01-01

    This study provides a new version of the arrayed primer extension (APEX) protocol adapted to the 'array of arrays' platform using an instrumental setup for microarray processing not previously described. The primary aim of the study is to implement a system for rational cost-efficient genotyping ...
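
The readout principle behind arrayed primer extension (and single base extension generally) can be simulated in a few lines: the primer is designed to end one base before the variant site, and the labelled terminator that gets incorporated is the complement of the next template base. A toy sketch (sequences and orientation conventions are assumptions for illustration):

```python
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def sbe_readout(template_3to5, primer_5to3):
    """Simulate a single-base-extension readout on a toy template.

    `template_3to5` is the template strand written 3'->5' so it aligns
    base-for-base with the 5'->3' primer. The extension product is the
    complement of the first template base past the primer's 3' end.
    """
    # Sanity check: the primer must anneal perfectly to the template.
    for p, t in zip(primer_5to3, template_3to5):
        assert COMPLEMENT[t] == p, "primer does not anneal to template"
    return COMPLEMENT[template_3to5[len(primer_5to3)]]

# A template whose variant site carries G reports incorporation of C.
```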

  20. Comparative Study of Retinal Vessel Segmentation Based on Global Thresholding Techniques

    Directory of Open Access Journals (Sweden)

    Temitope Mapayi

    2015-01-01

    Full Text Available Due to noise from uneven contrast and illumination during the acquisition process of retinal fundus images, the use of efficient preprocessing techniques is highly desirable to produce good retinal vessel segmentation results. This paper develops and compares the performance of different vessel segmentation techniques based on global thresholding, using phase congruency and contrast limited adaptive histogram equalization (CLAHE) for the preprocessing of the retinal images. The results obtained show that the combination of preprocessing technique, global thresholding, and postprocessing techniques must be carefully chosen to achieve good segmentation performance.
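
The global-thresholding step itself reduces to a single cutoff applied over the whole (preprocessed) image. A minimal sketch, using a mean-based threshold as a stand-in for the thresholds the paper derives from phase-congruency and CLAHE output:

```python
import numpy as np

def global_threshold(img, offset=0.0):
    """Segment by a single global threshold: the image mean plus an offset.

    `img` is a 2-D float array (a preprocessed retinal image in the paper);
    returns a boolean vessel mask. The mean-based rule here is only an
    illustrative choice of global threshold.
    """
    t = img.mean() + offset
    return img > t
```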

  1. A quality control technique based on UV-VIS absorption spectroscopy for tequila distillery factories

    Science.gov (United States)

    Barbosa Garcia, O.; Ramos Ortiz, G.; Maldonado, J. L.; Pichardo Molina, J.; Meneses Nava, M. A.; Landgrave, Enrique; Cervantes, M. J.

    2006-02-01

    A low-cost technique based on UV-VIS absorption spectroscopy is presented for the quality control of the spirit drink known as tequila. It is shown that such spectra offer enough information to discriminate a given spirit drink from a group of bottled commercial tequilas. The technique was applied to white tequilas. Contrary to the reference analytic methods, such as chromatography, this technique requires neither special personnel training nor sophisticated instrumentation. By using hand-held instrumentation, the technique can be applied in situ during the production process.

  2. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    Energy Technology Data Exchange (ETDEWEB)

    Palit, Mousumi [Department of Electronics and Telecommunication Engineering, Central Calcutta Polytechnic, Kolkata 700014 (India); Tudu, Bipan, E-mail: bt@iee.jusl.ac.in [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Bhattacharyya, Nabarun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Dutta, Ankur; Dutta, Pallab Kumar [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Jana, Arun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Bandyopadhyay, Rajib [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Chatterjee, Anutosh [Department of Electronics and Communication Engineering, Heritage Institute of Technology, Kolkata 700107 (India)

    2010-08-18

    In an electronic tongue, preprocessing on raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored, and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.
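
Typical candidates in such a preprocessing comparison include mean-centering and autoscaling of the sensor channels before PCA; a brief numpy sketch (illustrative; the paper's actual set of techniques is not reproduced here):

```python
import numpy as np

def mean_center(X):
    """Subtract each sensor channel's mean (columns = channels)."""
    return X - X.mean(axis=0)

def autoscale(X):
    """Mean-center, then divide by each channel's standard deviation,
    giving every channel unit variance before PCA / classification."""
    Xc = mean_center(X)
    return Xc / Xc.std(axis=0)
```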

  4. Novel Metaknowledge-based Processing Technique for Multimedia Big Data clustering challenges

    OpenAIRE

    Bari, Nima; Vichr, Roman; Kowsari, Kamran; Berkovich, Simon Y.

    2015-01-01

    Past research has challenged us with the task of showing relational patterns between text-based data and then clustering for predictive analysis using Golay Code technique. We focus on a novel approach to extract metaknowledge in multimedia datasets. Our collaboration has been an on-going task of studying the relational patterns between datapoints based on metafeatures extracted from metaknowledge in multimedia datasets. Those selected are significant to suit the mining technique we applied, ...

  5. DEVELOPMENT OF OBSTACLE AVOIDANCE TECHNIQUE IN WEB-BASED GEOGRAPHIC INFORMATION SYSTEM FOR TRAFFIC MANAGEMENT USING OPEN SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    Nik Mohd Ramli Nik Yusoff

    2014-01-01

    Full Text Available Shortest-path routing is one of the well-known network analysis techniques implemented in road management systems. pgRouting, an extension of the PostgreSQL/PostGIS database, is an open source library that implements the Dijkstra shortest path algorithm. However, its functionality for avoiding obstacles in that analysis is still limited. Therefore, this study was conducted to enable an obstacle avoidance function in the existing pgRouting algorithm using the OpenStreetMap road network. Implementing this function enhances the Dijkstra algorithm's ability in network analysis. In this study a dynamic restriction feature is added at the program level to represent obstacles on the road. With this modification the algorithm is now able to generate an alternative route by avoiding obstacles on the roads. Using OpenLayers and PHP, a web-based GIS platform was developed to ease the system's usability.
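
The dynamic restriction described above boils down to skipping flagged edges during Dijkstra's relaxation step. pgRouting implements this in SQL/C; the idea can be sketched in Python (the graph encoding and function signature are illustrative, not pgRouting's API):

```python
import heapq

def dijkstra_avoiding(graph, start, goal, blocked=frozenset()):
    """Shortest path that skips edges listed in `blocked`.

    `graph` maps node -> {neighbor: cost}; `blocked` holds (u, v) pairs
    for road segments currently obstructed, mirroring the dynamic
    restriction added to the Dijkstra routine at the program level.
    """
    dist, prev, seen = {start: 0.0}, {}, set()
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, {}).items():
            if (u, v) in blocked:
                continue  # the obstacle: this segment is never relaxed
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if goal not in dist:
        return None, float("inf")
    path, n = [goal], goal
    while n != start:
        n = prev[n]
        path.append(n)
    return path[::-1], dist[goal]

# g = {"A": {"B": 1, "C": 5}, "B": {"C": 1}, "C": {}}
# Blocking the direct segment ("A", "B") forces the detour via "C".
```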

  6. Finite element modelling of non-bonded piezo sensors for biomedical health monitoring of bones based on EMI technique

    Science.gov (United States)

    Srivastava, Shashank; Bhalla, Suresh; Madan, Alok; Gupta, Ashok

    2016-04-01

    Extensive research is currently underway across the world on employing piezo sensors for biomedical health monitoring in view of their obvious advantages such as low cost, fast dynamic response and biocompatibility. However, one of the limitations of the piezo sensor in bonded mode based on the electro-mechanical impedance (EMI) technique is that it can cause harmful effects to humans in terms of irritation and bone and skin disease. This paper, which is a continuation of the recent demonstration of the non-bonded configuration, is a step towards simulating and analyzing the non-bonded configuration of the piezo sensor to gauge its effectiveness using FEA software. It has been noted that the conductance signatures obtained in non-bonded mode are significantly close to those of the conventional bonded configuration, thus giving a positive indication of its suitability for field use.

  7. Multi technique amalgamation for enhanced information identification with content based image data.

    Science.gov (United States)

    Das, Rik; Thepade, Sudeep; Ghosh, Saurav

    2015-01-01

    Image data has emerged as a resourceful foundation for information with the proliferation of image capturing devices and social media. Diverse applications of images in areas including biomedicine, military, commerce and education have resulted in huge image repositories. Semantically analogous images can be fruitfully recognized by means of content-based image identification. However, the success of the technique has been largely dependent on the extraction of robust feature vectors from the image content. The paper introduces three different techniques of content-based feature extraction, based on image binarization, image transform and morphological operators respectively. The techniques were tested with four public datasets, namely the Wang Dataset, Oliva Torralba (OT Scene) Dataset, Corel Dataset and Caltech Dataset. The multi-technique feature extraction process was further integrated for decision fusion of image identification to boost the recognition rate. The classification result with the proposed technique has shown an average increase of 14.5% in Precision compared to the existing techniques, and the retrieval result with the introduced technique has shown an average increase of 6.54% in Precision over state-of-the-art techniques. PMID:26798574
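
One of the three feature-extraction routes named above, image binarization, can be sketched as thresholding at the mean intensity and describing the image by per-block foreground ratios. The block-grid descriptor below is an assumption standing in for the paper's exact binarization feature:

```python
import numpy as np

def binarization_feature(img, grid=4):
    """Threshold a grayscale image at its mean intensity, then return the
    fraction of foreground pixels in each cell of a grid x grid partition.

    Illustrative mean-threshold block descriptor; the paper's precise
    binarization-based descriptor may differ.
    """
    binary = img > img.mean()
    h, w = binary.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            cell = binary[i * h // grid:(i + 1) * h // grid,
                          j * w // grid:(j + 1) * w // grid]
            feats.append(cell.mean())
    return np.array(feats)
```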

  8. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    Science.gov (United States)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, due to the evaluation of EBIC equipment performance and the numerical optimization of equipment items, the constant acquisition of high contrast images has become possible, improving the reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  9. Application of USP inlet extensions to the TSI impactor system 3306/3320 using HFA 227 based solution metered dose inhalers.

    Science.gov (United States)

    Mogalian, Erik; Myrdal, Paul Brian

    2005-12-01

    The objective of this study was to further evaluate the need for a vertical inlet extension when testing solution metered dose inhalers using the TSI Model 3306 Impactor Inlet in conjunction with the TSI Model 3320 Aerodynamic Particle Sizer (APS). The configurations tested using the TSI system were compared to baseline measurements that were performed using the Andersen Mark II 8-stage cascade impactor (ACI). Seven pressurized solution metered dose inhalers were tested using varied concentrations of beclomethasone dipropionate (BDP), ethanol, and HFA 227 propellant. The inhalers were tested with the cascade impactor, and with the TSI system. The TSI system had three different configurations as the manufacturer provided (0 cm) or with inlet extensions of 20 and 40 cm. The extensions were located between the USP inlet and the Model 3306 Impactor Inlet. There were no practical differences between each system for the stem, actuator, or USP inlet. The fine particle mass (aerodynamic mass < 4.7 microm) was affected by extension length and correlated well with the ACI when an extension was present. APS particle size measurements were unaffected by the extension lengths and correlated well to particle size determined from the ACI analysis. It has been confirmed that an inlet extension may be necessary for the TSI system in order to give mass results that correlate to the ACI, especially for formulations having significant concentrations of low volatility excipients. Additionally, the results generated from this study were used to evaluate the product performance of HFA 227 based solution formulations that contain varying concentrations of ethanol as a cosolvent. PMID:16316853

  10. Image Stitching System Based on ORB Feature-Based Technique and Compensation Blending

    Directory of Open Access Journals (Sweden)

    Ebtsam Adel

    2015-09-01

    Full Text Available The construction of a high-resolution panoramic image from a sequence of overlapping input images of the same scene is called image stitching/mosaicing. It is considered an important, challenging topic in computer vision, multimedia, and computer graphics. The quality of the mosaic image and the time cost are the two primary parameters for measuring stitching performance. Therefore, the main objective of this paper is to introduce a high-quality image stitching system with the least computation time. First, we compare many different feature detectors. We test the Harris corner detector, SIFT, SURF, FAST, GoodFeaturesToTrack, MSER, and ORB techniques to measure the detection rate of corrected keypoints and the processing time. Second, we manipulate the implementation of different common categories of image blending methods to increase the quality of the stitching process. From the experimental results, we conclude that the ORB algorithm is the fastest and most accurate, with the highest performance. In addition, Exposure Compensation is the highest-quality blending method for stitching. Finally, we have generated an image stitching system based on ORB using the Exposure Compensation blending method.
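
ORB produces 256-bit binary descriptors that are matched by Hamming distance; the matching stage (brute force with a nearest/second-nearest ratio test) can be sketched without OpenCV as follows. The descriptor shape and the 0.8 ratio are conventional assumptions, not values from the paper:

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two binary descriptors stored as uint8 bytes."""
    return int(np.unpackbits(a ^ b).sum())

def match_descriptors(desc1, desc2, ratio=0.8):
    """Brute-force matching with a nearest/second-nearest ratio test.

    desc1, desc2: (N, 32) uint8 arrays, i.e. 256-bit ORB-style descriptors.
    Returns (i, j) index pairs of accepted matches.
    """
    matches = []
    for i, d in enumerate(desc1):
        dists = [hamming(d, e) for e in desc2]
        order = np.argsort(dists)
        best = order[0]
        second = order[1] if len(order) > 1 else order[0]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

The accepted pairs would then feed homography estimation (e.g. RANSAC) before warping and blending.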

  11. Simulation-driven design by knowledge-based response correction techniques

    CERN Document Server

    Koziel, Slawomir

    2016-01-01

    Focused on efficient simulation-driven multi-fidelity optimization techniques, this monograph on simulation-driven optimization covers simulations utilizing physics-based low-fidelity models, often based on coarse-discretization simulations or other types of simplified physics representations, such as analytical models. The methods presented in the book exploit as much as possible any knowledge about the system or device of interest embedded in the low-fidelity model with the purpose of reducing the computational overhead of the design process. Most of the techniques described in the book are of response correction type and can be split into parametric (usually based on analytical formulas) and non-parametric, i.e., not based on analytical formulas. The latter, while more complex in implementation, tend to be more efficient. The book presents a general formulation of response correction techniques as well as a number of specific methods, including those based on correcting the low-fidelity model response (out...

  12. Android Access Control Extension

    Directory of Open Access Journals (Sweden)

    Anton Baláž

    2015-12-01

    Full Text Available The main objective of this work is to analyze and extend the security model of mobile devices running Android OS. The provided security extension is a Linux kernel security module that allows the system administrator to restrict a program's capabilities with per-program profiles. Profiles can allow capabilities like network access, raw socket access, and the permission to read, write, or execute files on matching paths. The module supplements the traditional Android capability access control model by providing mandatory access control (MAC) based on path. This extension increases the security of access to system objects in a device and allows creating security sandboxes per application.
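
The per-program, path-based profile idea can be illustrated as a default-deny lookup. In this sketch, fnmatch globs stand in for the kernel module's real path-matching rules, and the profile contents are hypothetical:

```python
from fnmatch import fnmatch

# Hypothetical per-program profiles: path pattern -> allowed operations.
PROFILES = {
    "com.example.app": {
        "/data/data/com.example.app/*": {"read", "write"},
        "/sdcard/Download/*": {"read"},
    },
}

def mac_allows(program, path, op):
    """Mandatory check: deny unless some profile rule for `program`
    matches `path` and grants `op` (default-deny, as in MAC)."""
    for pattern, ops in PROFILES.get(program, {}).items():
        if fnmatch(path, pattern) and op in ops:
            return True
    return False
```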

  13. Nasal base narrowing of the caucasian nose through the cerclage technique

    Directory of Open Access Journals (Sweden)

    Mocellin, Marcos

    2010-06-01

    Full Text Available Introduction: Several techniques can be performed to reduce (narrow) the nasal base, such as vestibular and columellar skin resection, elliptical skin resection at the narial lip, skin undermining and advancement (the V-Y technique of Bernstein), and the use of cerclage sutures in the nasal base. Objective: To evaluate the cerclage technique performed on the nasal base, through endonasal rhinoplasty without delivery of the basic technique, in the Caucasian nose, reducing the inter-alar distance and correcting alar flare, with consequent improvement in nasal harmony with the whole face. Methods: A retrospective analysis of clinical records and photographs of 43 patients in whom cerclage of the nasal base was performed, resecting a skin ellipse in the region of the vestibule and the nasal base (modified Weir technique) using colorless mononylon® 4-0 with a straight cutting needle. The study was conducted in 2008 and 2009 at the Hospital of the Paraná Institute of Otolaryngology - IPO in Curitiba, Paraná, Brazil. Patients had a follow-up ranging from 7 to 12 months. Results: In 100% of cases an improvement in nasal harmony was achieved, by decreasing the inter-alar distance. Conclusion: Cerclage with minimal resection of vestibular skin and of the nasal base is an effective method for narrowing the nasal base in the Caucasian nose, with predictable results, and is easy to perform.

  14. A GENERIC APPROACH TO CONTENT BASED IMAGE RETRIEVAL USING DCT AND CLASSIFICATION TECHNIQUES

    OpenAIRE

    RAMESH BABU DURAI C; Dr.V.DURAISAMY

    2010-01-01

    With the rapid development of technology, traditional keyword-based information retrieval techniques are not sufficient, and content-based image retrieval (CBIR) has become an active research topic. Content Based Image Retrieval (CBIR) technologies provide a method to find images in large databases by using unique descriptors from a trained image. The ability of the system to classify images based on the training set feature extraction is quite challenging. In this paper we propose to extra...

  15. Effectiveness of Agricultural Extension Activities

    Directory of Open Access Journals (Sweden)

    Ali AL-Sharafat

    2012-01-01

    Full Text Available Problem statement: Jordan's agricultural extension service is seriously under-staffed and its effectiveness is consequently compromised. Reservations are being expressed about the performance and capability of the agricultural extension system in Jordan. The performance of this sector has been disappointing and has failed to transfer agricultural technology to the farmers. The main objective of this study is to assess the effectiveness of Jordan's agricultural extension services. Approach: The effect of extension services on olive productivity in the study area was investigated. A total of 60 olive producers were selected to be interviewed for this study. This number was enough to achieve the study objectives. The interviewed producers were distributed almost equally among olive production locations in the study area. The sample was obtained through the simple random sampling technique. The two groups were chosen and distributed randomly into an experimental group (30 farmers; 10 for each source of extension service) and a control group (30 farmers). The experimental group received extension services and the control group received no extension services. Two interview-cum-structured questionnaires were designed and used to collect information and data for this study. The first instrument was designed for farmers who received extension services and the second for farmers who received no extension services. Another questionnaire was designed for administrators of extension organizations concerned with providing extension services to farmers. To find the differences that may exist between the two studied groups, One-Way Analysis of Variance (ANOVA), t-test and LSD test via the Statistical Package for the Social Sciences (SPSS) were used. The average net profit obtained from an area of one dunum of olive farm was the main item considered in determining the effectiveness of agricultural extension activities.
Results and Conclusion: The results of

  16. An Analysis of the Risk in Discretely Rebalanced Option Hedges and Delta-Based Techniques

    OpenAIRE

    Russell P. Robins; Barry Schachter

    1994-01-01

    The stochastic properties of discretely rebalanced option hedges have been studied extensively beginning with Black and Scholes (1973). In each analysis hedges were "delta-neutral" after rebalancing. We argue that the distributional properties of discretely rebalanced hedges are such that delta-based hedging is not the variance minimizing strategy. This paper obtains analytical expressions for the variance minimizing option hedge ratios. We also evaluate the hedge variance to assess the magni...
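The contrast the authors draw, delta-neutral versus variance-minimizing discrete hedges, can be made concrete with a toy simulation. The sketch below is an illustration only, not the paper's analytical hedge ratios: it computes the Black-Scholes delta and tracks the terminal error of a discretely rebalanced delta hedge along one simulated price path (all parameter values are hypothetical).

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call_price(S, K, r, sigma, T):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def bs_call_delta(S, K, r, sigma, T):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return norm_cdf(d1)

def discrete_delta_hedge(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                         steps=252, seed=1):
    """Sell one call, hold delta shares, rebalance at each step.
    Returns the terminal hedging error (portfolio minus option payoff)."""
    rng = random.Random(seed)
    dt = T / steps
    S = S0
    delta = bs_call_delta(S, K, r, sigma, T)
    cash = bs_call_price(S, K, r, sigma, T) - delta * S
    for i in range(1, steps):
        S *= math.exp((r - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        cash *= math.exp(r * dt)
        new_delta = bs_call_delta(S, K, r, sigma, T - i * dt)
        cash -= (new_delta - delta) * S   # self-financing rebalance
        delta = new_delta
    # final step to maturity, no further rebalancing
    S *= math.exp((r - 0.5 * sigma ** 2) * dt
                  + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
    cash *= math.exp(r * dt)
    return delta * S + cash - max(S - K, 0.0)
```

The paper's point is precisely that, under discrete rebalancing, a ratio other than this delta minimizes the variance of the terminal error.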

  17. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  18. Applications of synchrotron-based X-ray techniques in environmental science

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Synchrotron-based X-ray techniques have been widely applied in the fields of environmental science due to their element-specific and nondestructive properties and their unique spectral and spatial resolution advantages. The techniques are capable of investigating in situ the chemical speciation, microstructure and elemental mapping at the molecular or nanometer scale, and thus provide direct evidence of the reaction mechanisms of various environmental processes. In this contribution, the applications of three types of techniques commonly used in environmental research are reviewed, namely X-ray absorption spectroscopy (XAS), X-ray fluorescence (XRF) spectroscopy and scanning transmission X-ray microscopy (STXM). In particular, the recent advances of the techniques in China are elaborated, and a selection of applied examples in the field of environmental science is provided. Finally, the perspectives of synchrotron-based X-ray techniques are discussed. With their great progress and wide application, the techniques have revolutionized our understanding of significant geo- and biochemical processes. It is anticipated that synchrotron-based X-ray techniques will continue to play a significant role in these fields and that significant advances will be obtained in the decades ahead.

  19. SVD-Based Optimal Filtering Technique for Noise Reduction in Hearing Aids Using Two Microphones

    Directory of Open Access Journals (Sweden)

    Jean-Baptiste Maj

    2002-04-01

    Full Text Available We introduce a new SVD-based (singular value decomposition) strategy for noise reduction in hearing aids. This technique is evaluated for noise reduction in a behind-the-ear (BTE) hearing aid where two omnidirectional microphones are mounted in an endfire configuration. The behaviour of the SVD-based technique is compared to a two-stage adaptive beamformer for hearing aids developed by Vanden Berghe and Wouters (1998). The evaluation and comparison are done with a performance metric based on the speech intelligibility index (SII). The speech and noise signals are recorded in reverberant conditions with a signal-to-noise ratio of 0 dB, and the spectrum of the noise signals is similar to the spectrum of the speech signal. The SVD-based technique works without initialization or assumptions about a look direction, unlike the two-stage adaptive beamformer. Still, for different noise scenarios, the SVD-based technique performs as well as the two-stage adaptive beamformer, for a similar filter length and adaptation time for the filter coefficients. In a diffuse noise scenario, the SVD-based technique performs better than the two-stage adaptive beamformer and hence provides a more flexible and robust solution under speaker position variations and reverberant conditions.
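The core idea, projecting noisy data onto a low-rank signal subspace via the SVD and discarding the noise-dominated singular components, can be sketched in a few lines. This is a generic rank-truncation illustration, not the paper's two-microphone speech filter; the synthetic signal and noise level are made up.

```python
import numpy as np

def svd_lowrank_denoise(X, rank):
    """Keep only the `rank` largest singular components of X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[rank:] = 0.0                      # zero out noise-dominated components
    return (U * s) @ Vt

# synthetic example: a rank-1 "signal" matrix buried in white noise
rng = np.random.default_rng(0)
t = np.arange(200)
clean = np.outer(np.sin(2 * np.pi * t / 20), np.ones(30))
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
denoised = svd_lowrank_denoise(noisy, rank=1)
```

Because the clean matrix is rank 1, truncating to one singular component removes most of the noise energy while preserving the signal.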

  20. Repeat Customer Success in Extension

    Science.gov (United States)

    Bess, Melissa M.; Traub, Sarah M.

    2013-01-01

    Four multi-session research-based programs were offered by two Extension specialists in one rural Missouri county. Eleven participants who came to multiple Extension programs could be called "repeat customers." Based on the total number of participants for all four programs, 25% could be deemed repeat customers. Repeat customers had…

  1. PIXEL VS OBJECT-BASED IMAGE CLASSIFICATION TECHNIQUES FOR LIDAR INTENSITY DATA

    Directory of Open Access Journals (Sweden)

    N. El-Ashmawy

    2012-09-01

    Full Text Available Light Detection and Ranging (LiDAR) systems are remote sensing techniques used mainly for terrain surface modelling. LiDAR sensors record the distance between the sensor and the targets (range data), with a capability to record the strength of the backscatter energy reflected from the targets (intensity data). The LiDAR sensors use the near-infrared spectrum range, which provides high separability in the energy reflected by the target. This phenomenon is investigated in order to use the LiDAR intensity data for land-cover classification. The goal of this paper is to investigate and evaluate the use of different image classification techniques applied to LiDAR intensity data for land cover classification. The two techniques proposed are: (a) a maximum likelihood classifier, used as a pixel-based classification technique; and (b) image segmentation, used as an object-based classification technique. A study area covering an urban district in Burnaby, British Columbia, Canada, is selected to test the different classification techniques for extracting four feature classes: buildings, roads and parking areas, trees, and low vegetation (grass areas) from the LiDAR intensity data. Generally, the results show that LiDAR intensity data can be used for land cover classification. An overall accuracy of 63.5% can be achieved using the pixel-based classification technique. The overall accuracy of the results is improved to 68% using the object-based classification technique. Further research is underway to investigate different criteria for the segmentation process and to refine the design of the object-based classification algorithm.
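The pixel-based branch of this comparison is easy to sketch: fit a Gaussian to the training intensities of each land-cover class, then assign every pixel to the class with the highest likelihood. The snippet below is a minimal one-dimensional illustration; the "road" and "grass" class statistics are hypothetical, not values from the paper.

```python
import numpy as np

def fit_gaussians(samples):
    """samples: dict mapping class name -> 1-D array of training intensities.
    Returns per-class (mean, std) parameters."""
    return {c: (v.mean(), v.std() + 1e-9) for c, v in samples.items()}

def ml_classify(pixels, params):
    """Assign each pixel intensity to the class with maximum log-likelihood."""
    classes = sorted(params)
    loglik = np.stack([
        -0.5 * ((pixels - params[c][0]) / params[c][1]) ** 2
        - np.log(params[c][1])
        for c in classes
    ])
    return [classes[i] for i in np.argmax(loglik, axis=0)]

# hypothetical near-infrared training intensities for two classes
rng = np.random.default_rng(1)
train = {"road": rng.normal(30.0, 5.0, 100),
         "grass": rng.normal(120.0, 10.0, 100)}
params = fit_gaussians(train)
labels = ml_classify(np.array([28.0, 125.0]), params)
```

The object-based alternative would first group pixels into segments and classify segment-level statistics instead of individual pixels.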

  2. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    Science.gov (United States)

    Festa, G.; Pietropaolo, A.; Grazzi, F.; Sutton, L. F.; Scherillo, A.; Bognetti, L.; Bini, A.; Barzagli, E.; Schooneveld, E.; Andreani, C.

    2013-09-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigations of materials. The aim of this study is to show the potential of the approach, using multiple and integrated neutron-based techniques, for musical instruments. Such samples, in the broad scenario of cultural heritage, represent an exciting research field, offering an interesting link between different disciplines such as nuclear physics, metallurgy and acoustics.

  3. Review of Fluorescence-Based Velocimetry Techniques to Study High-Speed Compressible Flows

    Science.gov (United States)

    Bathel, Brett F.; Johansen, Craig; Inman, Jennifer A.; Jones, Stephen B.; Danehy, Paul M.

    2013-01-01

    This paper reviews five laser-induced fluorescence-based velocimetry techniques that have been used to study high-speed compressible flows at NASA Langley Research Center. The techniques discussed in this paper include nitric oxide (NO) molecular tagging velocimetry (MTV), nitrogen dioxide photodissociation (NO2-to-NO) MTV, and NO and atomic oxygen (O-atom) Doppler-shift-based velocimetry. Measurements of both single-component and two-component velocity have been performed using these techniques. This paper details the specific application and experiment for which each technique has been used, the facility in which the experiment was performed, the experimental setup, sample results, and a discussion of the lessons learned from each experiment.

  4. Design Technique of I2L Circuits Based on Multi-Valued Logic

    Institute of Scientific and Technical Information of China (English)

    吴训威; 杭国强

    1996-01-01

    This paper proposes the use of the current signal to express logic values and establishes the theory of grounded current switches suitable for I2L circuits. Based on the advantage that current signals are easy to add, a design technique for I2L circuits by means of multi-valued current signals is proposed. It is shown that a simpler structure of I2L circuits can be obtained with this technique.

  5. A Low Cost Vision Based Hybrid Fiducial Mark Tracking Technique for Mobile Industrial Robots

    Directory of Open Access Journals (Sweden)

    Mohammed Y Aalsalem

    2012-07-01

    Full Text Available The field of robotic vision is developing rapidly. Robots can react intelligently and provide assistance to user activities through sentient computing. Since industrial applications pose complex requirements that cannot be handled by humans, an efficient, low-cost and robust technique is required for the tracking of mobile industrial robots. The existing sensor-based techniques for mobile robot tracking are expensive and complex to deploy, configure and maintain. Some of them also demand dedicated and often expensive hardware. This paper presents a low-cost vision-based technique called "Hybrid Fiducial Mark Tracking" (HFMT) for tracking a mobile industrial robot. The HFMT technique requires off-the-shelf hardware (CCD cameras) and printable 2-D circular marks used as fiducials for tracking a mobile industrial robot on a pre-defined path. The proposed technique allows the robot to follow a predefined path by using fiducials for the detection of right and left turns and a white strip for tracking the path. The HFMT technique is implemented and tested on an indoor mobile robot in our laboratory. Experimental results from the robot navigating in real environments have confirmed that our approach is simple and robust and can be adopted in any hostile industrial environment where humans are unable to work.

  6. Interferometric Dynamic Measurement: Techniques Based on High-Speed Imaging or a Single Photodetector

    Directory of Open Access Journals (Sweden)

    Yu Fu

    2014-01-01

    Full Text Available In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured at any instant, and the Nyquist sampling theorem has to be satisfied along the time axis at each measurement point. Two types of techniques were developed for such measurements: one is based on high-speed cameras and the other uses a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capture rate, while the photodetector-based technology can only measure at a single point. In this paper, several aspects of these two technologies are discussed. For camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For detector-based interferometry, the discussion mainly focuses on single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the effort made by researchers to improve the measurement capabilities of interferometry-based techniques to cover the requirements of industrial applications.

  7. An adaptive laser beam shaping technique based on a genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    Ping Yang; Yuan Liu; Wei Yang; Minwu Ao; Shijie Hu; Bing Xu; Wenhan Jiang

    2007-01-01

    A new adaptive beam intensity shaping technique based on the combination of a 19-element piezoelectric deformable mirror (DM) and a global genetic algorithm is presented. This technique can adaptively adjust the voltages of the 19 actuators on the DM to reduce the difference between the target beam shape and the actual beam shape. Numerical simulations and experimental results show that, within the stroke range of the DM, this technique can be used to create given beam intensity profiles on the focal plane.
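The optimization loop described above can be illustrated with a toy genetic algorithm that evolves a small "actuator" vector toward a target intensity profile. Everything here is a simplified stand-in: 7 values instead of 19 actuator voltages, and a quadratic fitness instead of a measured beam shape.

```python
import random

TARGET = [0.0, 0.2, 0.5, 1.0, 0.5, 0.2, 0.0]   # desired intensity profile

def fitness(v):
    """Negative squared error between candidate profile and target."""
    return -sum((a - b) ** 2 for a, b in zip(v, TARGET))

def mutate(v, rate=0.3, scale=0.1, rng=random):
    return [x + rng.gauss(0.0, scale) if rng.random() < rate else x for x in v]

def crossover(a, b, rng=random):
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

def run_ga(pop_size=40, gens=200, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, 1.0) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]           # keep the fittest quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            children.append(mutate(crossover(a, b, rng), rng=rng))
        pop = elite + children
    return max(pop, key=fitness)

best = run_ga()
```

In the actual system, evaluating the fitness means measuring the focal-plane intensity produced by a given set of mirror voltages.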

  8. Scaling up the DBSCAN Algorithm for Clustering Large Spatial Databases Based on Sampling Technique

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Clustering, in data mining, is a useful technique for discovering interesting data distributions and patterns in the underlying data, and has many application fields, such as statistical data analysis, pattern recognition, image processing, etc. We combine a sampling technique with the DBSCAN algorithm to cluster large spatial databases, and two sampling-based DBSCAN (SDBSCAN) algorithms are developed. One algorithm introduces the sampling technique inside DBSCAN, and the other uses a sampling procedure outside DBSCAN. Experimental results demonstrate that our algorithms are effective and efficient in clustering large-scale spatial databases.
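The "sampling outside DBSCAN" variant can be sketched as: cluster a random sample with plain DBSCAN, then attach each unsampled point to the cluster of its nearest sampled neighbour within eps. The code below is a minimal 2-D illustration, not the paper's SDBSCAN algorithms; the parameter values are arbitrary.

```python
import random

def region_query(pts, i, eps):
    """Indices of all points within eps of pts[i] (including itself)."""
    return [j for j, q in enumerate(pts)
            if (pts[i][0] - q[0]) ** 2 + (pts[i][1] - q[1]) ** 2 <= eps * eps]

def dbscan(pts, eps, min_pts):
    """Plain DBSCAN; returns one label per point (-1 = noise)."""
    labels = [None] * len(pts)
    cluster = -1
    for i in range(len(pts)):
        if labels[i] is not None:
            continue
        neigh = region_query(pts, i, eps)
        if len(neigh) < min_pts:
            labels[i] = -1
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neigh)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:          # noise becomes a border point
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            n2 = region_query(pts, j, eps)
            if len(n2) >= min_pts:       # j is a core point: expand from it
                seeds.extend(n2)
    return labels

def sampled_dbscan(pts, eps, min_pts, sample_frac=0.5, seed=0):
    """Cluster a sample, then attach remaining points to nearby sample clusters."""
    rng = random.Random(seed)
    idx = list(range(len(pts)))
    rng.shuffle(idx)
    sample = idx[:max(1, int(sample_frac * len(pts)))]
    slabels = dbscan([pts[i] for i in sample], eps, min_pts)
    labels = [-1] * len(pts)
    for k, i in enumerate(sample):
        labels[i] = slabels[k]
    for i in idx[len(sample):]:
        best, best_d = -1, eps * eps
        for k, s in enumerate(sample):
            d = (pts[i][0] - pts[s][0]) ** 2 + (pts[i][1] - pts[s][1]) ** 2
            if slabels[k] != -1 and d <= best_d:
                best, best_d = slabels[k], d
        labels[i] = best
    return labels
```

Only the sample incurs the quadratic neighbourhood queries, which is the source of the speed-up on large databases.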

  9. Gene expression data clustering using a multiobjective symmetry based clustering technique.

    Science.gov (United States)

    Saha, Sriparna; Ekbal, Asif; Gupta, Kshitija; Bandyopadhyay, Sanghamitra

    2013-11-01

    The invention of microarrays has rapidly changed the state of biological and biomedical research. Clustering algorithms play an important role in clustering microarray data sets, where identifying groups of co-expressed genes is a very difficult task. Here we have posed the problem of clustering microarray data as a multiobjective clustering problem. A new symmetry-based fuzzy clustering technique is developed to solve this problem. The effectiveness of the proposed technique is demonstrated on five publicly available benchmark data sets. Results are compared with some widely used microarray clustering techniques. Statistical and biological significance tests have also been carried out. PMID:24209942

  10. Development of IR-Based Short-Range Communication Techniques for Swarm Robot Applications

    Directory of Open Access Journals (Sweden)

    RAMLI, A. R.

    2010-11-01

    Full Text Available This paper proposes several designs for reliable infrared-based communication techniques for swarm robotic applications. The communication system was deployed on an autonomous miniature mobile robot (AMiR), a swarm robotic platform developed earlier. In swarm applications, all participating robots must be able to communicate and share data. Hence a suitable communication medium and a reliable technique are required. This work uses infrared radiation for the transmission of swarm robot messages. Infrared transmission methods such as amplitude and frequency modulation are presented along with experimental results. Finally, the effects of the modulation techniques and other parameters on the collective behavior of swarm robots are analyzed.

  11. Bandwidth-Tunable Fiber Bragg Gratings Based on UV Glue Technique

    Science.gov (United States)

    Fu, Ming-Yue; Liu, Wen-Feng; Chen, Hsin-Tsang; Chuang, Chia-Wei; Bor, Sheau-Shong; Tien, Chuen-Lin

    2007-07-01

    In this study, we have demonstrated that a uniform fiber Bragg grating (FBG) can be transformed into a chirped fiber grating by a simple UV glue adhesive technique without shifting the reflection band with respect to the center wavelength of the FBG. The technique is based on the strain induced in an FBG by the UV glue adhesive force on the fiber surface, which causes a grating period variation and an effective index change. This technique provides a fast and simple method of obtaining the required chirp value of a grating for applications in dispersion compensators, gain flattening in erbium-doped fiber amplifiers (EDFAs) and optical filters.

  12. A comparative study on change vector analysis based change detection techniques

    Indian Academy of Sciences (India)

    Sartajvir Singh; Rajneesh Talwar

    2014-12-01

    Detection of Earth surface changes is essential to monitor regional climatic conditions, snow avalanche hazards and energy balance variations that occur due to air temperature irregularities. Geographic Information Systems (GIS) enable such research activities to be carried out through change detection analysis. From this viewpoint, different change detection algorithms have been developed for land-use land-cover (LULC) regions. Among them, change vector analysis (CVA) has a level-headed capability of extracting maximum information, in terms of the overall magnitude of change and the direction of change, between multispectral bands from multi-temporal satellite data sets. Over the past two to three decades, many effective CVA-based change detection techniques, e.g., improved change vector analysis (ICVA), modified change vector analysis (MCVA) and change vector analysis in posterior-probability space (CVAPS), have been developed to overcome the difficulties that exist in traditional CVA. Moreover, many integrated techniques, such as cross-correlogram spectral matching (CCSM) based CVA, CVA using enhanced principal component analysis (PCA) and the inverse triangular (IT) function, hyper-spherical direction cosine (HSDC) CVA, and median CVA (m-CVA), have been developed as effective LULC change detection tools. This paper comprises a comparative analysis of CVA-based change detection techniques, namely CVA, MCVA, ICVA and CVAPS. It also summarizes the necessary integrated CVA techniques along with their characteristics, features and shortcomings. Based on experimental outcomes, it has been evaluated that the CVAPS technique has greater potential than other CVA techniques to evaluate the overall transformed information over three different MODerate resolution Imaging Spectroradiometer (MODIS) satellite data sets of different regions. Results of this study are expected to be potentially useful for more accurate analysis of LULC changes which will, in turn
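At its core, CVA computes, for each pixel, the magnitude and direction of the spectral difference vector between two dates; pixels whose magnitude exceeds a threshold are flagged as changed. A minimal two-band sketch follows (the threshold and toy pixel values are arbitrary, not from the paper):

```python
import numpy as np

def change_vector_analysis(date1, date2, threshold):
    """date1, date2: arrays shaped (bands, rows, cols) from two acquisition dates.
    Returns (magnitude, direction_degrees, change_mask)."""
    diff = date2.astype(float) - date1.astype(float)
    magnitude = np.sqrt((diff ** 2).sum(axis=0))           # overall change strength
    direction = np.degrees(np.arctan2(diff[1], diff[0]))   # angle in band space
    return magnitude, direction, magnitude > threshold

# toy example: one pixel of a 2-band 2x2 image changes by (3, 4) in band space
date1 = np.zeros((2, 2, 2))
date2 = date1.copy()
date2[:, 0, 0] = [3.0, 4.0]
mag, ang, mask = change_vector_analysis(date1, date2, threshold=2.5)
```

The variants surveyed above (MCVA, ICVA, CVAPS) mainly differ in how this magnitude threshold is chosen and how the direction information is categorized.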

  13. Performance index based learning controls for the partial non-regular systems using lifting technique

    Institute of Scientific and Technical Information of China (English)

    Shengyue YANG; Xiaoping FAN; Zhihua QU

    2009-01-01

    Deficiencies of the performance-index-based iterative learning control (ILC) for non-regular systems are investigated in detail; then a faster control-input updating and a lifting technique are introduced in the design of performance-index-based ILCs for partial non-regular systems. Two kinds of optimal ILCs based on different performance indices are considered. Finally, simulation examples are given to illustrate the feasibility of the proposed learning controls.

  14. Satellite Angular Velocity Estimation Based on Star Images and Optical Flow Techniques

    OpenAIRE

    Giancarmine Fasano; Giancarlo Rufino; Domenico Accardo; Michele Grassi

    2013-01-01

    An optical flow-based technique is proposed to estimate spacecraft angular velocity based on sequences of star-field images. It does not require star identification and can thus also be used to deliver angular rate information when attitude determination is not possible, as during platform detumbling or slewing. Region-based optical flow calculation is carried out on successive star images preprocessed to remove the background. Sensor calibration parameters, the Poisson equation, and a least-squares...

  15. Iris Recognition Using Modified Hierarchical Phase-Based Matching (HPM) Technique

    OpenAIRE

    C.Anand Deva Durai; M.Karnan

    2010-01-01

    This paper explores an efficient algorithm for iris recognition based on Hierarchical Phase-Based Image Matching (HPM) technique. One of the difficult problems in feature-based iris recognition is that the matching performance is significantly influenced by many parameters in feature extraction process, which may vary depending on environmental factors of image acquisition. The proposed system is designed for applications where the training database contains an iris for each individual. The f...
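Phase-based matching of the kind underlying HPM rests on phase-only correlation (POC): normalize the cross-power spectrum of two images to unit magnitude and inverse-transform; the location of the resulting peak gives their relative displacement. A generic sketch, not the iris-specific hierarchy from the paper:

```python
import numpy as np

def phase_only_correlation(f, g):
    """POC surface of two equal-size 2-D arrays; a sharp peak appears
    at the translation taking f to g."""
    F = np.fft.fft2(f)
    G = np.fft.fft2(g)
    cross = np.conj(F) * G
    cross /= np.abs(cross) + 1e-12     # keep the phase, discard the magnitude
    return np.real(np.fft.ifft2(cross))

def estimate_shift(f, g):
    """Integer (row, col) shift such that rolling f by it best matches g."""
    poc = phase_only_correlation(f, g)
    idx = np.unravel_index(np.argmax(poc), poc.shape)
    # unwrap circular indices to signed shifts
    return tuple(i - s if i > s // 2 else i for i, s in zip(idx, poc.shape))

rng = np.random.default_rng(2)
f = rng.standard_normal((32, 32))
g = np.roll(f, (3, 5), axis=(0, 1))    # shift f down 3 rows, right 5 cols
```

The height of the POC peak also serves as a match score, which is how phase-based methods decide whether two iris images belong to the same individual.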

  16. Content Based Image Retrieval using Hierarchical and K-Means Clustering Techniques

    OpenAIRE

    V.S.V.S. Murthy; E.Vamsidhar; J.N.V.R SWARUP KUMAR; P.Sankara Rao

    2010-01-01

    In this paper we present an image retrieval system that takes an image as the input query and retrieves images based on image content. Content Based Image Retrieval is an approach for retrieving semantically-relevant images from an image database based on automatically-derived image features. The unique aspect of the system is the utilization of hierarchical and k-means clustering techniques. The proposed procedure consists of two stages. First, here we are going to filter most of the images ...

  17. Efficient techniques for wave-based sound propagation in interactive applications

    Science.gov (United States)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. This spherical harmonic-based representation of source directivity can support analytical, data
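The wave equation the dissertation starts from, u_tt = c² u_xx in one dimension, is usually discretized with a leapfrog finite-difference scheme, which also illustrates why wave-based methods are expensive: cost grows with both grid size and step count. A minimal 1-D sketch with reflecting boundaries (grid and pulse parameters are arbitrary, and this is not the equivalent source method the dissertation proposes):

```python
import numpy as np

def simulate_wave_1d(n=200, steps=300, c=1.0, dx=1.0, dt=0.5):
    """Leapfrog finite-difference solution of u_tt = c^2 u_xx with
    fixed (reflecting) ends; stable when the CFL number c*dt/dx <= 1."""
    assert c * dt / dx <= 1.0, "CFL condition violated"
    u_prev = np.zeros(n)
    u = np.zeros(n)
    u[n // 2] = 1.0                      # initial pressure pulse at the centre
    u_prev[:] = u                        # zero initial velocity
    r2 = (c * dt / dx) ** 2
    for _ in range(steps):
        u_next = np.zeros(n)
        u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
        u_prev, u = u, u_next
    return u

field = simulate_wave_1d()
```

In three dimensions the grid must resolve the shortest audible wavelength in every direction, which is the cost blow-up that motivates the equivalent source and spherical harmonic representations described above.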

  18. Extensive vegetated roofs in Sweden

    OpenAIRE

    Emilsson, Tobias

    2006-01-01

    This thesis discusses extensive vegetated roofs, i.e. vegetation systems placed on top of buildings as an aesthetic and/or ecological cover. Specific objectives were to (1) quantify how establishment techniques, substrates and plant mixes influence the establishment and development of extensive vegetated roofs, (2) investigate the effect of vegetated roofs on stormwater quality and quantify how maintenance and starting fertilisation influence stormwater quality, and (3) investigate the role of veg...

  19. Techniques, Advantages and Problems of Agent Based Modeling for Traffic Simulation

    Directory of Open Access Journals (Sweden)

    Ali Bazghandi

    2012-01-01

    Full Text Available Agent-based modeling (ABM) has become a powerful simulation modeling technique in the last few years. ABM, as an approach to simulating the behavior of a complex system in which agents interact with each other and with their environment using simple local rules, is gaining popularity and widespread use in many areas. Successes of this approach in predicting traffic flow in metropolitan areas, the spread of infectious diseases, and the behavior of economic systems have generated further interest in this powerful technology. In this paper we focus on the agent-based approach to traffic simulation and investigate its benefits, difficulties and (microscopic and macroscopic) techniques.

  20. Key Techniques for the Development of Web-Based PDM System

    Institute of Scientific and Technical Information of China (English)

    WANG Li-juan; ZHANG Xu; NING Ru-xin

    2006-01-01

    Some key techniques for the development of a web-based product data management (PDM) system are introduced. The four-tiered B/S architecture of a PDM system, BITPDM, is introduced first, followed by its design and implementation, including the virtual data vault, a flexible coding system, document management, product structure and configuration management, workflow/process management, and product maturity management. BITPDM can facilitate the activities from the new product introduction phase to manufacturing, and manage product data and their dynamic change history. Based on Microsoft .NET, XML, web service and SOAP techniques, BITPDM realizes the integration and efficient management of product information.

  1. Microcapsule-based techniques for improving the safety of lithium-ion batteries

    Science.gov (United States)

    Baginska, Marta

    Lithium-ion batteries are vital energy storage devices due to their high specific energy density, lack of memory effect, and long cycle life. While they are predominantly used in small consumer electronics, new strategies for improving battery safety and lifetime are critical to the successful implementation of high-capacity, fast-charging materials required for advanced Li-ion battery applications. Currently, the presence of a volatile, combustible electrolyte and an oxidizing agent (lithium oxide cathodes) makes the Li-ion cell susceptible to fire and explosions. Thermal overheating, electrical overcharging, or mechanical damage can trigger thermal runaway, and if left unchecked, combustion of battery materials. To improve battery safety, autonomic, thermally-induced shutdown of Li-ion batteries is demonstrated by depositing thermoresponsive polymer microspheres onto battery anodes. When the internal temperature of the cell reaches a critical value, the microspheres melt and conformally coat the anode and/or separator with an ion-insulating barrier, halting Li-ion transport and shutting down the cell permanently. Charge and discharge capacity is measured for Li-ion coin cells containing microsphere-coated anodes or separators as a function of capsule coverage. Scanning electron microscopy images of electrode surfaces from cells that have undergone autonomic shutdown provide evidence of melting, wetting, and re-solidification of polyethylene (PE) into the anode and polymer film formation at the anode/separator interface. As an extension of this autonomic shutdown approach, a particle-based separator capable of performing autonomic shutdown, but which reduces the shorting hazard posed by current bi- and tri-polymer commercial separators, is presented. This dual-particle separator is composed of hollow glass microspheres acting as a physical spacer between electrodes, and PE microspheres to impart autonomic shutdown functionality. An oil-immersion technique is

  2. Modal extension rule

    Institute of Scientific and Technical Information of China (English)

    WU Xia; SUN Jigui; LIN Hai; FENG Shasha

    2005-01-01

    Modal logics are good candidates for a formal theory of agents. The efficiency of reasoning methods in modal logics is very important, because it determines whether or not a reasoning method can be widely used in agent-based systems. In this paper, we modify the extension rule theorem proving method we presented before, and then apply it to P-logic, which is translated from modal logic by functional transformation. Finally, we give the proof of its soundness and completeness.

  3. The extension of a DNA double helix by an additional Watson-Crick base pair on the same backbone

    DEFF Research Database (Denmark)

    Kumar, P.; Sharma, P. K.; Madsen, Charlotte S.;

    2013-01-01

    Additional base pair: The DNA duplex can be extended with an additional Watson-Crick base pair on the same backbone by the use of double-headed nucleotides. These also work as compressed dinucleotides and form two base pairs with cognate nucleobases on the opposite strand.

  4. An improved visualization-based force-measurement technique for short-duration hypersonic facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurence, Stuart J.; Karl, Sebastian [Institute of Aerodynamics and Flow Technology, Spacecraft Section, German Aerospace Center (DLR), Goettingen (Germany)

    2010-06-15

    This article is concerned with describing and exploring the limitations of an improved version of a recently proposed visualization-based technique for the measurement of forces and moments in short-duration hypersonic wind tunnels. The technique is based on tracking the motion of a free-flying body over a sequence of high-speed visualizations; while this idea is not new in itself, the use of high-speed digital cinematography combined with a highly accurate least-squares tracking algorithm allows improved results over what has previously been possible with such techniques. The technique's precision is estimated through the analysis of artificially constructed and experimental test images, and the resulting error in acceleration measurements is characterized. For wind-tunnel scale models, position measurements to within a few microns are shown to be readily attainable. Image data from two previous experimental studies in the T5 hypervelocity shock tunnel are then reanalyzed with the improved technique: the uncertainty in the mean drag acceleration is shown to be reduced to the order of the flow unsteadiness, 2-3%, and time-resolved acceleration measurements are also shown to be possible. The response time of the technique for the configurations studied is estimated to be ~0.5 ms. Comparisons with computations using the DLR TAU code also yield agreement to within the overall experimental uncertainty. Measurement of the pitching moment for blunt geometries still appears challenging, however. (orig.)
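The heart of the method, recovering acceleration (and hence force) from a tracked position history, reduces to a least-squares fit. Under a constant-acceleration assumption, fitting a quadratic x(t) and doubling its leading coefficient gives the acceleration; the frame count, noise level, and 9.5 m/s² deceleration below are hypothetical, not values from the article.

```python
import numpy as np

def acceleration_from_track(times, positions):
    """Least-squares quadratic fit x(t) = x0 + v*t + 0.5*a*t^2;
    returns the constant acceleration a."""
    coeffs = np.polyfit(times, positions, 2)   # highest power first
    return 2.0 * coeffs[0]

# synthetic drag measurement: body decelerating at 9.5 m/s^2, noisy tracking
rng = np.random.default_rng(3)
t = np.linspace(0.0, 0.02, 50)                 # 50 frames over 20 ms
x = 0.001 + 2.0 * t - 0.5 * 9.5 * t ** 2       # true trajectory (metres)
x_noisy = x + rng.normal(0.0, 2e-6, t.size)    # ~2 micron position noise
a_est = acceleration_from_track(t, x_noisy)
```

With micron-level position noise, the fit averages out the tracking error over all frames, which is what makes the few-percent drag uncertainty quoted above attainable.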

  5. Esthetic Craniofacial Bony and Skull Base Reconstruction Using Flap Wrapping Technique.

    Science.gov (United States)

    Yano, Tomoyuki; Suesada, Nobuko; Usami, Satoshi

    2016-07-01

    For a safe and esthetic skull base reconstruction combined with repair of craniofacial bone defects, the authors introduce the flap wrapping technique in this study. This technique consists of skull base reconstruction using the vastus lateralis muscle of an anterolateral thigh (ALT) free flap, and structural craniofacial bony reconstruction using an autologous calvarial bone graft. The key to this technique is that all of the grafted autologous bone is wrapped with the vascularized fascia of the ALT free flap to protect the grafted bone from infection and exposure. Two anterior skull base tumors combined with craniofacial bony defects were included in this study. The subjects were a man and a woman, aged 18 and 64. Both patients had preoperative proton beam therapy. First, the skull base defect was filled with vastus lateralis muscle, and then structural reconstruction was performed with an autologous bone graft and a fabricated inner layer of calvarial bone, and then the grafted bone was completely wrapped in the vascularized fascia of the ALT free flap. By applying this technique, there was no intracranial infection or grafted bone exposure in these 2 patients postoperatively, even though both patients had preoperative proton beam therapy. Additionally, the vascularized fascia wrapped bone graft could provide a natural contour and prevent collapse of the craniofacial region, and this gives patients a better facial appearance even though they have had skull base surgery. PMID:27300454

  6. A Novel Generic Session Based Bit Level Encryption Technique to Enhance Information Security

    CERN Document Server

    Paul, Manas; Pal, Suvajit; Saha, Ranit

    2009-01-01

    In this paper a session-based symmetric key encryption system is proposed, termed the Permutated Cipher Technique (PCT). The technique is fast, suitable, and secure for larger files. The input file is broken into blocks of various sizes (of order 2^n) and encrypted by shifting the position of each bit by a certain value a certain number of times. A key is generated randomly, and from it the length of each block is determined. Each block length generates a unique value for the number of bits to be skipped; this value determines the new positions of the bits to be shifted within the block. After the shifting and inverting, each block is XORed with the SHA-512 digest of the key. The resultant blocks form the cipher text. The key is generated according to the binary value of the input file size. Decryption reverses the same process, as the technique is symmetric.
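The block-permute-then-XOR structure described above can be sketched as follows. This is a simplified toy illustration, not the authors' implementation: it uses a fixed block size and a single left-rotation per block, whereas PCT derives variable block sizes and shift counts from the random key.

```python
import hashlib

def rotate_bits(block: int, width: int, shift: int) -> int:
    """Rotate the `width`-bit integer `block` left by `shift` bits."""
    shift %= width
    mask = (1 << width) - 1
    return ((block << shift) | (block >> (width - shift))) & mask

def pct_encrypt(data: bytes, key: bytes, block_bytes: int = 8, shift: int = 3) -> bytes:
    """Toy PCT sketch: rotate each block's bits, then XOR with SHA-512(key)."""
    digest = hashlib.sha512(key).digest()          # 64-byte mask reused per block
    out = bytearray()
    for i in range(0, len(data), block_bytes):
        chunk = data[i:i + block_bytes]            # last chunk may be shorter
        width = 8 * len(chunk)
        permuted = rotate_bits(int.from_bytes(chunk, "big"), width, shift)
        for j, b in enumerate(permuted.to_bytes(len(chunk), "big")):
            out.append(b ^ digest[j % len(digest)])
    return bytes(out)

def pct_decrypt(data: bytes, key: bytes, block_bytes: int = 8, shift: int = 3) -> bytes:
    """Inverse: undo the XOR, then rotate each block back."""
    digest = hashlib.sha512(key).digest()
    out = bytearray()
    for i in range(0, len(data), block_bytes):
        chunk = bytes(b ^ digest[j % len(digest)]
                      for j, b in enumerate(data[i:i + block_bytes]))
        width = 8 * len(chunk)
        restored = rotate_bits(int.from_bytes(chunk, "big"),
                               width, width - (shift % width))
        out.extend(restored.to_bytes(len(chunk), "big"))
    return bytes(out)
```

Because both the rotation and the XOR are self-inverting with the same key material, decryption mirrors encryption exactly, matching the symmetric design stated in the abstract.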

  7. Evaluation of paint coating thickness variations based on pulsed Infrared thermography laser technique

    Science.gov (United States)

    Mezghani, S.; Perrin, E.; Vrabie, V.; Bodnar, J. L.; Marthe, J.; Cauwe, B.

    2016-05-01

    In this paper, a pulsed infrared thermography technique using homogeneous heat provided by a laser source is used for the non-destructive evaluation of paint coating thickness variations. Firstly, numerical simulations of the thermal response of a paint-coated sample are performed. By analyzing the thermal responses as a function of the thermal properties and thickness of both coating and substrate layers, optimal excitation parameters of the heating source are determined. Two characteristic parameters were studied with respect to the paint coating layer thickness variations. Results obtained using an experimental test bench based on the pulsed infrared thermography laser technique are compared with those given by a classical eddy current technique for paint coating variations from 5 to 130 μm. These results demonstrate the efficiency of this approach and suggest that the pulsed infrared thermography technique offers good prospects for characterizing the heterogeneity of paint coatings on large-scale samples with other heating sources.

  8. Using a Voltage Domain Programmable Technique for Low-Power Management Cell-Based Design

    Directory of Open Access Journals (Sweden)

    Ching-Hwa Cheng

    2011-09-01

    Full Text Available The multi-voltage technique is an effective way to reduce power consumption. In the proposed cell-based voltage domain programmable (VDP) technique, the high and low voltages applied to logic gates are programmable. The flexible voltage domain reassignment allows the chip performance and power consumption to be dynamically adjusted. In the proposed technique, the power switches can be flexibly programmed after chip manufacturing. This VDP method does not use an external voltage regulator to regulate the supply voltage level from outside the chip, but can be easily integrated within the design. This novel technique is proven by use of a video decoder test chip, which shows 55% and 61% power reductions compared to conventional single-Vdd and low-voltage designs, respectively. This power-aware performance-adjusting mechanism achieves substantial power reduction with a good power-performance management scheme.

  9. A damage identification technique based on embedded sensitivity analysis and optimization processes

    Science.gov (United States)

    Yang, Chulho; Adams, Douglas E.

    2014-07-01

    A vibration-based structural damage identification method, using embedded sensitivity functions and optimization algorithms, is discussed in this work. The embedded sensitivity technique requires only measured or calculated frequency response functions to obtain the sensitivity of system responses to each component parameter; this sensitivity analysis technique can therefore be used effectively in the damage identification process. Optimization techniques are used to minimize the difference between the measured frequency response functions of the damaged structure and those calculated from the baseline system using embedded sensitivity functions. The amount of damage can be quantified directly in engineering units as changes in stiffness, damping, or mass. Various factors in the optimization process and structural dynamics are studied to enhance the performance and robustness of the damage identification process. This study shows that the proposed technique can improve the accuracy of damage identification, with an estimation error of less than 2 percent.
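The minimization described above can be sketched for the simplest case. This is an illustrative sketch under assumed notation, not the authors' code: the damaged FRF is modelled to first order as H_damaged(w) ≈ H_baseline(w) + (dH/dk)(w)·dk, and for a single real stiffness change dk the least-squares minimum over all frequency lines has a closed form.

```python
import numpy as np

def estimate_damage(H_base, dH_dk, H_damaged):
    """Least-squares estimate of a single real parameter change dk
    (in engineering units, e.g. N/m) from complex FRF arrays:
    minimize || (H_damaged - H_base) - dH_dk * dk ||^2 over dk."""
    r = (H_damaged - H_base).ravel()   # measured change in the FRFs
    s = dH_dk.ravel()                  # embedded sensitivity function
    # closed-form normal-equation solution for a real-valued dk
    return np.real(np.vdot(s, r)) / np.real(np.vdot(s, s))
```

As a sanity check, for a single-degree-of-freedom system H(w) = 1/(k - m w^2 + i c w) the sensitivity is dH/dk = -1/(k - m w^2 + i c w)^2, and a small simulated stiffness loss is recovered to within a few percent, the residual being the neglected higher-order terms.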

  10. Interference Mitigation Technique for Coexistence of Pulse-Based UWB and OFDM

    Directory of Open Access Journals (Sweden)

    Tetsushi Ikegami

    2008-04-01

    Full Text Available Ultra-wideband (UWB) is a useful radio technique for sharing frequency bands between radio systems. It uses very short pulses to spread the spectrum. However, there is a potential for interference between systems using the same frequency bands at close range. In some regulatory regimes, interference detection and avoidance (DAA) techniques are required to prevent interference with existing radio systems. In this paper, the effect of interference on orthogonal frequency division multiplexing (OFDM) signals from pulse-based UWB is discussed, and an interference mitigation technique is proposed. This technique focuses on the pulse repetition cycle of UWB. The pulse repetition interval is set to the same as, or to half of, the period of the OFDM symbol excluding the guard interval to mitigate interference. These proposals are also made for direct sequence (DS) UWB. Bit error rate (BER) performance is illustrated through both simulation and theoretical approximations.
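The effect of matching the pulse repetition interval to the OFDM symbol can be illustrated numerically. The parameters below are illustrative, not from the paper: an impulse train whose repetition interval equals half the OFDM FFT window contributes energy only to every second subcarrier, so the interference is confined to known, predictable bins.

```python
import numpy as np

N = 64                       # assumed OFDM FFT size (guard interval ignored)
pri = N // 2                 # pulse repetition interval = half the symbol
x = np.zeros(N)
x[3::pri] = 1.0              # two identical UWB pulses per FFT window
X = np.fft.fft(x)            # subcarrier-domain view of the interference

odd_bins = np.abs(X[1::2])   # ~0: these subcarriers see no interference
even_bins = np.abs(X[0::2])  # the interference folds entirely into these
```

The cancellation is exact in theory: two pulses spaced N/2 apart give X_k proportional to 1 + e^{-jπk}, which vanishes for every odd k, regardless of where the first pulse falls within the window.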

  11. Interference Mitigation Technique for Coexistence of Pulse-Based UWB and OFDM

    Directory of Open Access Journals (Sweden)

    Ohno Kohei

    2008-01-01

    Full Text Available Ultra-wideband (UWB) is a useful radio technique for sharing frequency bands between radio systems. It uses very short pulses to spread the spectrum. However, there is a potential for interference between systems using the same frequency bands at close range. In some regulatory regimes, interference detection and avoidance (DAA) techniques are required to prevent interference with existing radio systems. In this paper, the effect of interference on orthogonal frequency division multiplexing (OFDM) signals from pulse-based UWB is discussed, and an interference mitigation technique is proposed. This technique focuses on the pulse repetition cycle of UWB. The pulse repetition interval is set to the same as, or to half of, the period of the OFDM symbol excluding the guard interval to mitigate interference. These proposals are also made for direct sequence (DS) UWB. Bit error rate (BER) performance is illustrated through both simulation and theoretical approximations.

  12. Vision-based system identification technique for building structures using a motion capture system

    Science.gov (United States)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures using a motion capture system (MCS). The MCS, with its outstanding capabilities for dynamic response measurement, can provide gage-free measurement of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequencies, mode shapes, and damping ratios) of building structures are extracted from the dynamic displacement responses measured by the MCS, after converting the MCS displacements to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI applying the MCS-measured displacements directly to FDD was performed and produced results identical to those of the conventional SI method.
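The FDD step referenced above can be sketched in a few lines. This is a hedged, minimal illustration, not the authors' implementation: the cross-spectral density (CSD) matrix of the multi-channel response is estimated by segment averaging, and at each frequency line the first singular value of the CSD matrix is inspected; its peaks mark the natural frequencies and the corresponding singular vectors approximate the mode shapes. Segment count and windowing are assumed parameters.

```python
import numpy as np

def fdd(acc: np.ndarray, fs: float, nseg: int = 8):
    """Basic frequency domain decomposition.
    acc: (n_samples, n_channels) responses; fs: sampling rate [Hz].
    Returns (freqs, first singular values, first singular vectors)."""
    n, ch = acc.shape
    L = n // nseg                              # samples per segment
    win = np.hanning(L)
    specs = [np.fft.rfft(acc[k * L:(k + 1) * L] * win[:, None], axis=0)
             for k in range(nseg)]
    S = np.stack(specs)                        # (nseg, nfreq, ch)
    # averaged CSD matrix G[f] = mean over segments of X(f) X(f)^H
    G = np.einsum('kfi,kfj->fij', S, S.conj()) / nseg
    freqs = np.fft.rfftfreq(L, d=1.0 / fs)
    U, sv, _ = np.linalg.svd(G)                # batched SVD per frequency
    return freqs, sv[:, 0], U[:, :, 0]
```

In use, the modal frequency is read off as the peak of the first singular value curve, and the singular vector at that peak gives the (unscaled) mode shape, which is how FDD separates closely spaced modes from a single set of output-only measurements.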

  13. Mobility Based Key Management Technique for Multicast Security in Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    B. Madhusudhanan

    2015-01-01

    Full Text Available In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques cope poorly with frequent mobility and disconnections, so there is a need to design a multicast key management technique that overcomes these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated from link availability and mobility. A multicast tree is constructed such that every weak node has a strong parent node. A session key-based encryption technique is used to transmit the multicast data. The rekeying process is performed periodically by the initiator node, and the rekeying interval is fixed depending on the node category, so the technique greatly reduces the rekeying overhead. Simulation results show that our proposed approach reduces the packet drop rate and improves data confidentiality.
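The categorization-then-rekey policy can be sketched as follows. The function names, the stability-index formula, the threshold, and the interval values are all assumptions for illustration; the paper does not publish its exact formulas here.

```python
import hashlib
import os

def stability_index(link_availability: float, mobility: float) -> float:
    """Assumed scoring: higher link availability raises stability,
    higher mobility lowers it (both inputs normalized to [0, 1+))."""
    return link_availability / (1.0 + mobility)

def categorize(index: float, threshold: float = 0.5) -> str:
    """Split nodes into the two categories used to build the multicast tree."""
    return "strong" if index >= threshold else "weak"

# assumed per-category rekeying intervals, in seconds: unstable (weak)
# nodes trigger rekeying more often than stable (strong) ones
REKEY_INTERVAL = {"strong": 60.0, "weak": 15.0}

def next_rekey_time(now: float, category: str) -> float:
    """When the initiator node should next distribute a fresh session key."""
    return now + REKEY_INTERVAL[category]

def new_session_key() -> bytes:
    """Fresh 256-bit group session key for the session key-based encryption."""
    return hashlib.sha256(os.urandom(32)).digest()
```

Fixing the interval per category, rather than rekeying on every membership or topology change, is what bounds the rekeying overhead while still refreshing keys quickly for the least stable part of the tree.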

  14. Mobility Based Key Management Technique for Multicast Security in Mobile Ad Hoc Networks

    Science.gov (United States)

    Madhusudhanan, B.; Chitra, S.; Rajan, C.

    2015-01-01

    In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques cope poorly with frequent mobility and disconnections, so there is a need to design a multicast key management technique that overcomes these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated from link availability and mobility. A multicast tree is constructed such that every weak node has a strong parent node. A session key-based encryption technique is used to transmit the multicast data. The rekeying process is performed periodically by the initiator node, and the rekeying interval is fixed depending on the node category, so the technique greatly reduces the rekeying overhead. Simulation results show that our proposed approach reduces the packet drop rate and improves data confidentiality. PMID:25834838

  15. DCT-Yager FNN: a novel Yager-based fuzzy neural network with the discrete clustering technique.

    Science.gov (United States)

    Singh, A; Quek, C; Cho, S Y

    2008-04-01

    superior performance. Extensive experiments have been conducted to test the effectiveness of these two networks, using various clustering algorithms. It follows that the SDCT and UDCT clustering algorithms are particularly suited to networks based on the Yager inference rule.

  16. HACIA LA EXTENSION DEL MÉTODO GRAY WATCH BASADO EN EL ESTÁNDAR DE CALIDAD ISO/IEC 25010 // TOWARDS THE EXTENSION OF THE GRAY WATCH METHOD BASED ON THE QUALITY STANDARD ISO/IEC 25010

    Directory of Open Access Journals (Sweden)

    Jorge Luis Pérez-Medina

    2012-06-01

    Full Text Available Talking about software quality implies the need to rely on parameters that establish the minimal levels a product of this type must reach in order to be considered of quality. This paper proposes an extension of the GRAY WATCH method, specifically in the technical processes of Analysis and Design, connecting the products obtained to the Implementation process. Our proposal uses the product quality standard ISO/IEC 25010, which establishes criteria for the specification of quality requirements of software products, their metrics, and their evaluation, and includes a quality model composed of characteristics and subcharacteristics. The result adds significant value to the extended method, allowing system analysts and computing professionals to specify the precise activities to be performed to obtain quality requirements. This work is supported by the Software-Quality-based Domain Engineering process named InDoCaS as the methodology for defining activities and products in the Analysis, Design, and Implementation processes of the application.

  17. Segmentation techniques evaluation based on a single compact breast mass classification scheme

    Science.gov (United States)

    Matheus, Bruno R. N.; Marcomini, Karem D.; Schiabel, Homero

    2016-03-01

    In this work, several segmentation techniques are evaluated using a simple centroid-based classification system for breast mass delineation in digital mammography images. The aim is to determine the best one for future CADx developments. Six techniques were tested: Otsu, SOM, EICAMM, Fuzzy C-Means, K-Means, and Level-Set. All of them were applied to segment 317 mammography images from the DDSM database. A single compact set of attributes was extracted and two centroids were defined, one for malignant and one for benign cases. The final classification was based on proximity to a given centroid, and the best results were obtained by the Level-Set technique with an accuracy of 68.1%, which indicates this method as the most promising for breast mass segmentation aiming at more precise interpretation in CADx schemes.
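The centroid-based classification rule described above amounts to a nearest-centroid classifier. The sketch below is a generic illustration with hypothetical feature data, not the paper's attribute set: one centroid is computed per class from training feature vectors, and a mass is labelled by whichever centroid lies closer.

```python
import numpy as np

def fit_centroids(features: np.ndarray, labels: np.ndarray) -> dict:
    """One centroid per class: the mean feature vector of that class."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(x: np.ndarray, centroids: dict):
    """Assign x to the class whose centroid is nearest (Euclidean distance)."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
```

Because the classifier is deliberately simple, differences in its accuracy across the six segmentation methods reflect the quality of the delineated masses rather than the sophistication of the classifier, which is the point of the evaluation.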

  18. Novel stability criteria for fuzzy Hopfield neural networks based on an improved homogeneous matrix polynomials technique

    Institute of Scientific and Technical Information of China (English)

    Feng Yi-Fu; Zhang Qing-Ling; Feng De-Zhi

    2012-01-01

    The global stability problem of Takagi-Sugeno (T-S) fuzzy Hopfield neural networks (FHNNs) with time delays is investigated. Novel LMI-based stability criteria are obtained by using Lyapunov functional theory to guarantee the asymptotic stability of the FHNNs with less conservatism. Firstly, using both Finsler's lemma and an improved homogeneous matrix polynomial technique, and applying an affine parameter-dependent Lyapunov-Krasovskii functional, we obtain the convergent LMI-based stability criteria. Algebraic properties of the fuzzy membership functions in the unit simplex are considered in the process of stability analysis via the homogeneous matrix polynomials technique. Secondly, to further reduce the conservatism, a new right-hand-side slack-variable-introducing technique is also proposed in terms of LMIs, which is suitable for the homogeneous matrix polynomials setting. Finally, two illustrative examples are given to show the efficiency of the proposed approaches.

  19. Novel technique for distributed fibre sensing based on coherent Rayleigh scattering measurements of birefringence

    Science.gov (United States)

    Lu, Xin; Soto, Marcelo A.; Thévenaz, Luc

    2016-05-01

    A novel distributed fibre sensing technique is described and experimentally validated, based on birefringence measurements using coherent Rayleigh scattering. It natively provides distributed measurements of temperature and strain with more than an order of magnitude higher sensitivity than Brillouin sensing, while requiring access to only a single fibre end. Unlike traditional Rayleigh-based coherent optical time-domain reflectometry, this new method provides absolute measurements of the measurand and may lead to a robust discrimination between temperature and strain in combination with another technique. Since birefringence is purposely induced in the fibre by design, large degrees of freedom are available to optimize and scale the sensitivity to a given quantity. The technique has been validated in two radically different types of birefringent fibre, elliptical-core and Panda polarization-maintaining fibres, with good repeatability.

  20. Novel stability criteria for fuzzy Hopfield neural networks based on an improved homogeneous matrix polynomials technique

    International Nuclear Information System (INIS)

    The global stability problem of Takagi-Sugeno (T-S) fuzzy Hopfield neural networks (FHNNs) with time delays is investigated. Novel LMI-based stability criteria are obtained by using Lyapunov functional theory to guarantee the asymptotic stability of the FHNNs with less conservatism. Firstly, using both Finsler's lemma and an improved homogeneous matrix polynomial technique, and applying an affine parameter-dependent Lyapunov-Krasovskii functional, we obtain the convergent LMI-based stability criteria. Algebraic properties of the fuzzy membership functions in the unit simplex are considered in the process of stability analysis via the homogeneous matrix polynomials technique. Secondly, to further reduce the conservatism, a new right-hand-side slack-variable-introducing technique is also proposed in terms of LMIs, which is suitable for the homogeneous matrix polynomials setting. Finally, two illustrative examples are given to show the efficiency of the proposed approaches.