WorldWideScience

Sample records for base extension technique

  1. High-extensible scene graph framework based on component techniques

    Institute of Scientific and Technical Information of China (English)

    LI Qi-cheng; WANG Guo-ping; ZHOU Feng

    2006-01-01

    In this paper, a novel component-based scene graph is proposed, in which all objects in the scene are classified into different entities, and a scene can be represented as a hierarchical graph composed of instances of entities. Each entity contains basic data and its operations, which are encapsulated into the entity component. The entity possesses certain behaviours, which are responses to rules and interactions defined by the high-level application; such behaviours can be described by scripts or behaviour models. The component-based scene graph in this paper is more abstract and higher-level than traditional scene graphs. The contents of a scene can be extended flexibly by adding new entities and new entity components, and behaviour modification can be achieved by modifying the model components or behaviour scripts. Its robustness and efficiency are verified by many examples implemented in the Virtual Scenario developed by Peking University.
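
    The entity-component organization the abstract describes can be sketched in a few lines. This is an illustrative reconstruction; the class and method names are hypothetical, not taken from the Virtual Scenario framework:

```python
# Minimal sketch of a component-based scene graph: entities hold components
# (data + operations) and form a hierarchy; behaviours are scripts attached
# as components, so they can be swapped without touching entity data.

class Component:
    """Encapsulates one piece of entity data plus its operations."""
    def update(self, entity):
        pass

class BehaviourScript(Component):
    """A behaviour defined by a callable 'script', as the abstract describes."""
    def __init__(self, script):
        self.script = script
    def update(self, entity):
        self.script(entity)

class Entity:
    def __init__(self, name):
        self.name = name
        self.components = []
        self.children = []   # hierarchical scene graph
    def add(self, item):
        (self.children if isinstance(item, Entity) else self.components).append(item)
        return self
    def update(self):
        for c in self.components:
            c.update(self)
        for child in self.children:
            child.update()

# Usage: extend the scene by adding a new entity with a scripted behaviour.
log = []
scene = Entity("scene")
door = Entity("door")
door.add(BehaviourScript(lambda e: log.append(f"{e.name} opened")))
scene.add(door)
scene.update()
# log now contains ["door opened"]
```

    Extending the scene means registering a new entity or component; the hierarchy and the other components are untouched, which is the extensibility property the abstract claims.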

  2. Network Lifetime Extension Based On Network Coding Technique In Underwater Acoustic Sensor Networks

    Directory of Open Access Journals (Sweden)

    Padmavathy.T.V

    2012-06-01

    Full Text Available Underwater acoustic sensor networks (UWASNs) are attracting considerable interest for ocean applications such as ocean pollution monitoring, ocean animal surveillance, oceanographic data collection, assisted navigation, and offshore exploration. A UWASN is composed of underwater sensors that employ sound to transmit information collected in the ocean; sound is used because the radio frequency (RF) signals used by terrestrial sensor networks (TWSNs) can only travel a few meters in water. Unfortunately, the efficiency of UWASNs is inferior to that of TWSNs: among the challenges in underwater communication are long propagation delay, high bit error rate, and limited bandwidth. Our aim is to minimize power consumption and improve the reliability of data transmission by finding the optimum number of clusters based on energy consumption.
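
    The closing idea, choosing the cluster count that minimizes energy consumption, can be sketched with a generic first-order radio model. The constants and the energy model below are illustrative assumptions, not the paper's actual parameters:

```python
# Hypothetical sketch: sweep candidate cluster counts and keep the one that
# minimizes total energy per round (members transmit to cluster heads, heads
# aggregate and transmit to the sink).
import math

E_ELEC = 50e-9      # J/bit, electronics energy (assumed)
E_AMP = 100e-12     # J/bit/m^2, amplifier energy (assumed)
BITS = 2000         # packet size in bits (assumed)

def round_energy(k, n_nodes, area_side, d_sink):
    """Energy per round with k clusters in an area_side x area_side field."""
    d_to_head = area_side / math.sqrt(2 * math.pi * k)  # mean intra-cluster distance
    members = n_nodes - k
    e_members = members * BITS * (E_ELEC + E_AMP * d_to_head ** 2)
    e_heads = k * BITS * (E_ELEC * (n_nodes / k) + E_ELEC + E_AMP * d_sink ** 2)
    return e_members + e_heads

def best_cluster_count(n_nodes=100, area_side=100.0, d_sink=75.0):
    return min(range(1, n_nodes),
               key=lambda k: round_energy(k, n_nodes, area_side, d_sink))

k_opt = best_cluster_count()
```

    Too few clusters waste member-to-head transmit energy over long distances; too many waste head-to-sink energy, so an interior optimum exists.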

  3. Two Extension Block Kirschner Wires' Technique for Bony Mallet Thumb

    Science.gov (United States)

    Takase, Fumiaki; Ueda, Yasuhiro; Shinohara, Issei; Kuroda, Ryosuke; Kokubu, Takeshi

    2016-01-01

    Mallet finger, with an avulsion fracture of the distal phalanx or rupture of the terminal tendon of the extensor mechanism, is a common injury, while mallet thumb is very rare. In this paper, the case of a 19-year-old woman with a sprained left thumb sustained while playing basketball is presented. Plain radiographs and computed tomography revealed an avulsion fracture involving more than half of the articular surface at the base of the distal phalanx. Closed reduction and percutaneous fixation were performed using the two extension block Kirschner wires' technique under digital block anesthesia. At 4 months postoperatively, the patient had achieved excellent results according to Crawford's evaluation criteria and had no difficulties in working or playing basketball. Various conservative and operative treatment strategies have been reported for the management of mallet thumb. We chose the two extension block Kirschner wires' technique to minimize invasion of the extensor mechanism and nail bed and to stabilize the large fracture fragment.

  4. Extension of the preceding birth technique.

    Science.gov (United States)

    Aguirre, A

    1994-01-01

    The Brass-inspired Preceding Birth Technique (PBT) is an indirect estimation technique with low administration costs. PBT involves asking women at a time close to delivery about the survival of the preceding birth. The proportion dead is close to the probability of dying between birth and the second birthday, an index of early childhood mortality (II or Q). Brass and Macrae have determined that II is an estimate of mortality between birth and an age lower than the birth interval, or around 4/5 of the birth interval. Hospital and clinic data are likely to include a concentration of women with lower risks because of higher educational levels and socioeconomic status. A simulation of PBT data from the World Fertility Survey for Mexico and Peru found that the proportions of previous children dead were 0.156 in Peru and 0.092 in Mexican home deliveries. Maternity clinic proportions were 0.088 in Peru and 0.066 in Mexico. Use of clinic and hospital data collection underestimated mortality by 32% in Peru and 15% in Mexico. Another alternative was proposed: interviewing women at some time other than delivery. If the interview took place during a child or infant intervention after delivery, the subsample would still be subject to a bias, but this problem could be overcome by computing the weighted average of the actual probability of the older child being dead and the conditional probability of the younger child being dead, or of both younger and older children being dead. Correction factors could be applied using the general standard of the Brass logit life table system. Calculation of a simple average of the ages of the younger children could provide enough information to help decide which tables to use. Five surveys were selected for testing the factors of dependence between probabilities of death of successive siblings: Bangladesh, Lesotho, Kenya, Ghana, and Guyana. Higher mortality was related to lower dependency factors between the probabilities of death.

  6. Study on gene sensor based on primer extension

    Institute of Scientific and Technical Information of China (English)

    陈誉华; 宋今丹; 李大为

    1997-01-01

    Based on the fact that the resonant frequency of a piezoelectric crystal is a function of the mass deposited on its surface, and that a primer extends after it hybridizes with its template, a primer-extension gene sensor technique was developed. The prominent feature of the technique is that fast and sensitive frequency signals are used to monitor gene hybridization and primer strand extension. Results show that this technique may be used for homology analysis of nucleic acids, trace DNA detection, and determining the integration of DNA. It may also be used for isolation of a target gene, gene mutation analysis, and predicting the location of a gene in its genome.
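
    The underlying physics, resonant frequency shifting with surface mass, is commonly modelled by the Sauerbrey relation. The sketch below uses standard AT-cut quartz constants and an assumed crystal geometry for illustration; these figures are not from the paper:

```python
# Sauerbrey relation: added surface mass lowers a quartz crystal's resonant
# frequency in proportion to the mass per unit area.
import math

RHO_Q = 2.648        # g/cm^3, density of quartz
MU_Q = 2.947e11      # g/(cm*s^2), shear modulus of AT-cut quartz

def sauerbrey_shift(f0_hz, delta_mass_g, area_cm2):
    """Frequency shift (Hz) for added mass; negative = frequency decreases."""
    return -2.0 * f0_hz ** 2 * delta_mass_g / (area_cm2 * math.sqrt(RHO_Q * MU_Q))

# Hypothetical example: a 10 MHz crystal binding 10 ng of DNA over 0.2 cm^2.
df = sauerbrey_shift(10e6, 10e-9, 0.2)   # roughly -11 Hz
```

    The linear mass-to-frequency mapping is what makes a fast frequency readout usable as a hybridization and strand-extension monitor.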

  7. Extensions in model-based system analysis

    OpenAIRE

    Graham, Matthew R.

    2007-01-01

    Model-based system analysis techniques provide a means for determining desired system performance prior to actual implementation. In addition to specifying desired performance, model-based analysis techniques require mathematical descriptions that characterize relevant behavior of the system. The developments of this dissertation give extended formulations for control-relevant model estimation as well as model-based analysis conditions for performance requirements specified as frequency do...

  8. Wavelet Based Image Denoising Technique

    Directory of Open Access Journals (Sweden)

    Sachin D Ruikar

    2011-03-01

    Full Text Available This paper proposes different approaches to wavelet-based image denoising. The search for efficient image denoising methods is still a valid challenge at the crossing of functional analysis and statistics. In spite of the sophistication of recently proposed methods, most algorithms have not yet attained a desirable level of applicability. Wavelet algorithms are a useful tool for signal processing tasks such as image compression and denoising. Multiwavelets can be considered an extension of scalar wavelets. The main aim is to modify the wavelet coefficients in the new basis so that the noise can be removed from the data. In this paper, we extend the existing technique and provide a comprehensive evaluation of the proposed method. Results for different noise types, such as Gaussian, Poisson, salt-and-pepper, and speckle noise, are presented. The signal-to-noise ratio was preferred as the measure of denoising quality.
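
    The core transform-shrink-invert idea can be sketched minimally with a one-level scalar Haar transform and soft thresholding; this is an illustration of the general wavelet-denoising scheme, not the paper's multiwavelet method:

```python
# Wavelet denoising sketch: transform the signal, shrink the detail
# coefficients toward zero (where noise concentrates), then invert.
import numpy as np

def haar_forward(x):
    """One-level Haar DWT of an even-length 1-D signal."""
    x = x.reshape(-1, 2)
    approx = (x[:, 0] + x[:, 1]) / np.sqrt(2)
    detail = (x[:, 0] - x[:, 1]) / np.sqrt(2)
    return approx, detail

def haar_inverse(approx, detail):
    out = np.empty(approx.size * 2)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

def denoise(signal, threshold):
    approx, detail = haar_forward(signal)
    # Soft thresholding: shrink small (noise-dominated) detail coefficients.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    return haar_inverse(approx, detail)

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 4.0, 0.0, 4.0], 16)        # piecewise-constant signal
noisy = clean + rng.normal(0, 0.5, clean.size)
restored = denoise(noisy, threshold=1.0)           # lower MSE than noisy
```

    On a piecewise-constant signal the true detail coefficients are near zero, so thresholding removes mostly noise; practical methods use multi-level transforms and data-driven thresholds.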

  9. Flexible use and technique extension of logistics management

    Science.gov (United States)

    Xiong, Furong

    2011-10-01

    As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and has expanded in China; this is the recognized track of the historical development of modern logistics. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port, China's modern logistics industry will attain leap-forward development at a strong pace and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of China's modern logistics management techniques, which has certain practical and guiding significance.

  10. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint.

    Science.gov (United States)

    Du, Hong-Gen; Liao, Sheng-Hui; Jiang, Zhong; Huang, Huan-Ming; Ning, Xi-Tao; Jiang, Neng-Yi; Pei, Jian-Wei; Huang, Qin; Wei, Hui

    2016-05-01

    This study investigates the effect of a new Chinese massage technique named "press-extension" on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of this massage technique. Self-developed biomechanical software was used to establish a normal L1-S1 lumbar 3D finite element (FE) model, which integrated the anatomical structure based on spine CT and MRI data. Graphic techniques were then used to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. According to the actual press-extension experiments, mechanical parameters were collected to set the boundary conditions for FE analysis. The results demonstrated that the press-extension technique has an obvious effect on the annulus fibrosus, drawing the central nucleus pulposus forward and increasing the pressure in the front part. The study concludes that finite element modelling of the lumbar spine is suitable for analyzing the impact of the press-extension technique on lumbar intervertebral disc biomechanics, providing a basis for understanding the mechanism by which the press-extension technique treats intervertebral disc herniation.

  11. Should structure-based virtual screening techniques be used more extensively in modern drug discovery?%基于结构的虚拟筛选技术能更广泛应用于现代药物发现?

    Institute of Scientific and Technical Information of China (English)

    V. Leroux; B. Maigret

    2007-01-01

    The drug discovery processes used by academic and industrial scientists are nowadays being questioned. The approaches of the pharmaceutical industry that were successful 20 years ago are simply not suitable anymore for the increasing complexity of available biological targets and the rising standards for medical safety. While the current scientific context resulting from significant developments in genomics, proteomics, organic synthesis, and biochemistry seems particularly favorable, the efficiency of drug research does not appear to be following the trend. In particular, the in silico approaches, often considered potential enhancements to classic drug discovery, are an interesting case. Techniques such as virtual screening have made significant progress in the past 5-10 years and have proven their usefulness in hit discovery for those who want to avoid carrying out too many expensive experimental tests while exploring a substantial molecular diversity. However, reliability is still disappointing despite constant enhancements, and results are unpredictable. What are the origins of such issues? In this short review, we first summarize the current status of computer-aided drug design, then focus on the structure-based class of virtual screening approaches, of which docking programs constitute the main part. Can such methods give something more than cost savings in the early banks-to-hit phases of the drug discovery process? We try to answer this question by exploring the highlights and pitfalls of the great variety of docking approaches. It will appear that while the structure-based drug design field is not yet ready to fulfill all of its early promises, it should still be investigated extensively and used with caution. Most interestingly, structure-based methods are best used when combined with other complementary drug design approaches such as ligand-based ones. In this regard, they will have an increasing role to play in modern drug discovery.

  12. A Novel Active Network Architecture Based on Extensible Services Router

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Active networks are a new kind of packet-switched network in which packets carry code fragments that are executed on the intermediary nodes (routers). The code can extend or modify the foundation architecture of a network. In this paper, the authors present a novel active network architecture that combines the advantages of the two major active network technologies, based on an extensible services router. The architecture consists of the extensible services router, an active extensible components server, and a key distribution center (KDC). Users can write extensible service components with the programming interface. At present, we have finished the extensible services router prototype system based on the Highly Efficient Router Operating System (HEROS), as well as the active extensible components server and KDC prototype systems based on Linux.

  13. Space-based observation of the extensive airshowers

    Directory of Open Access Journals (Sweden)

    Ebisuzaki T.

    2013-06-01

    Full Text Available Space-based observations of extensive air showers constitute the next experimental challenge for the study of the universe at extreme energies. Space observation will allow a "quantum jump" in the observational area available to detect the UV light tracks produced by particles with energies higher than 10^20 eV. These are thought to reach the Earth almost undeflected by the cosmic magnetic field. This new technique will contribute to establishing the new field of astronomy and astrophysics performed with charged particles and neutrinos at the highest energies. This idea was created by the incredible efforts of three outstanding cosmic ray physicists: John Linsley, Livio Scarsi, and Yoshiyuki Takahashi. This challenging technique has four significant merits in comparison with ground-based observations: (1) a very large observational area; (2) well-constrained distances of the showers; (3) clear and stable atmospheric transmission in the upper half of the troposphere; (4) uniform exposure across both the northern and southern skies. Four proposed and planned missions constitute the roadmap of the community: TUS, JEM-EUSO, KLPVE, and Super-EUSO will contribute step by step to establishing this challenging field of research.

  14. Source extension based on ε-entropy

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian; YU Sheng-sheng; ZHOU Jing-li; ZHENG Xin-wei

    2005-01-01

    It is known from entropy theory that an image is a source with certain probabilistic characteristics. The entropy rate of the source and the ε-entropy (rate-distortion function) are the information measures that identify the characteristics of video images, and hence are essentially related to video image compression. They are fundamental theories of great significance to image compression, though they cannot be directly turned into a compression method. Based on entropy theory and image compression theory, by applying a rate-distortion mathematical model and Lagrange multipliers to some theoretical problems in the H.264 standard, this paper presents a new rate-distortion coding algorithm model. This model was introduced into a complete capability test of the JM61e test model (JVT Test Model). The result shows that the speed of coding increases without significant reduction of the rate-distortion performance of the coder.
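
    The role of the Lagrange multiplier in rate-distortion optimized coding can be illustrated with a toy mode decision. The mode names and numbers below are hypothetical, not taken from the JM reference software:

```python
# Lagrangian rate-distortion decision: among candidate coding modes,
# pick the one minimizing the combined cost J = D + lambda * R.

def best_mode(candidates, lam):
    """candidates: list of (name, distortion, rate_bits); returns winning name."""
    return min(candidates, key=lambda m: m[1] + lam * m[2])[0]

modes = [
    ("intra16x16", 120.0, 40),   # moderate rate, moderate distortion
    ("intra4x4",    60.0, 90),   # high rate, low distortion
    ("skip",       300.0,  1),   # almost free, high distortion
]

# A small lambda favours distortion (quality); a large one favours rate (bits).
low_lambda_choice = best_mode(modes, lam=0.1)    # picks "intra4x4"
high_lambda_choice = best_mode(modes, lam=5.0)   # picks "skip"
```

    Sweeping λ traces out the coder's operational rate-distortion curve, which is how the encoder trades bits for quality at a chosen operating point.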

  15. Research on Customer Value Based on Extension Data Mining

    Science.gov (United States)

    Chun-Yan, Yang; Wei-Hua, Li

    Extenics is a new discipline for dealing with contradiction problems using formalized models. Extension data mining (EDM) is a product of combining Extenics with data mining. It explores how to acquire knowledge based on extension transformations, called extension knowledge (EK), taking advantage of extension methods and data mining technology. EK includes extensible classification knowledge, conductive knowledge, and so on. Extension data mining technology (EDMT) is a new data mining technology that mines EK in databases or data warehouses. Customer value (CV) weighs the importance of a customer relationship for an enterprise, taking the enterprise as the subject that assesses value and the customers as the objects whose value is assessed. CV varies continually. Mining the changing knowledge of CV in databases using EDMT, including quantitative-change knowledge and qualitative-change knowledge, can provide a foundation for an enterprise to decide its customer relationship management (CRM) strategy. It can also provide a new idea for studying CV.

  16. Effects on hamstring muscle extensibility, muscle activity, and balance of different stretching techniques.

    Science.gov (United States)

    Lim, Kyoung-Il; Nam, Hyung-Chun; Jung, Kyoung-Sim

    2014-02-01

    [Purpose] The purpose of this study was to investigate the effects of two different stretching techniques on range of motion (ROM), muscle activation, and balance. [Subjects] For the present study, 48 adults with hamstring muscle tightness were recruited and randomly divided into three groups: a static stretching group (n=16), a PNF stretching group (n=16), and a control group (n=16). [Methods] Each stretching technique was applied to the hamstring once. Active knee extension angle, muscle activation during maximum voluntary isometric contraction (MVC), and static balance were measured before and after the application of each stretching technique. [Results] Both the static stretching and PNF stretching groups showed significant increases in knee extension angle compared to the control group. However, there were no significant differences in muscle activation or balance between the groups. [Conclusion] The static stretching and PNF stretching techniques improved ROM without a decrease in muscle activation, but neither exerted statistically significant effects on balance.

  17. Technique for anisotropic extension of organic crystals: Application to temperature dependence of electrical resistance

    Science.gov (United States)

    Yamamoto, Takashi; Kato, Reizo; Yamamoto, Hiroshi M.; Fukaya, Atsuko; Yamasawa, Kenji; Takahashi, Ichiro; Akutsu, Hiroki; Akutsu-Sato, Akane; Day, Peter

    2007-08-01

    We have developed a technique for the anisotropic extension of fragile molecular crystals. The pressure medium and the instrument, which extends the pressure medium, are both made from epoxy resin. Since the thermal contraction of our instrument is identical to that of the pressure medium, the strain applied to the pressure medium has no temperature dependence down to 2 K. Therefore, the degree of extension applied to the single crystal at low temperatures is uniquely determined from the degree of extension in the pressure medium and thermal contractions of the epoxy resin and the single crystal at ambient pressure. Using this novel instrument, we have measured the temperature dependence of the electrical resistance of metallic, superconducting, and insulating materials. The experimental results are discussed from the viewpoint of the extension (compression) of the lattice constants along the parallel (perpendicular) direction.

  18. Selective laser melted titanium implants: a new technique for the reconstruction of extensive zygomatic complex defects.

    Science.gov (United States)

    Rotaru, Horatiu; Schumacher, Ralf; Kim, Seong-Gon; Dinu, Cristian

    2015-12-01

    The restoration of extensive zygomatic complex defects is a surgical challenge owing to the difficulty of accurately restoring the normal anatomy, symmetry, proper facial projection and facial width. In the present study, an extensive post-traumatic zygomatic bone defect was reconstructed using a custom-made implant that was made with a selective laser melting (SLM) technique. The computer-designed implant had the proper geometry and fit perfectly into the defect without requiring any intraoperative adjustments. A one-year follow-up revealed a stable outcome with no complications.

  19. A Knowledge-based and Extensible Aircraft Conceptual Design Environment

    Institute of Scientific and Technical Information of China (English)

    FENG Haocheng; LUO Mingqiang; LIU Hu; WU Zhe

    2011-01-01

    Design knowledge and experience are the bases for carrying out aircraft conceptual design tasks, owing to the high complexity and integration of the tasks during this phase. When carrying out the same task, different designers may need individual strategies to fulfill their own demands. A knowledge-based and extensible method for building aircraft conceptual design systems is studied in light of the above requirements. Based on this theory, a knowledge-based aircraft conceptual design environment with open architecture, called the knowledge-based and extensible aircraft conceptual design environment (KEACDE), is built to enable designers to wrap add-on extensions and make their own aircraft conceptual design systems. The architecture, characteristics, and other design and development aspects of KEACDE are discussed. A civil airplane conceptual design system (CACDS) is achieved using KEACDE. Finally, a civil airplane design case is presented to demonstrate the usability and effectiveness of this environment.
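
    The open-architecture idea, letting designers wrap add-on extensions without modifying the core environment, is essentially a plugin registry. The sketch below is a hypothetical illustration of that pattern, not KEACDE's actual API:

```python
# Plugin-registry sketch: the core environment exposes a registration point;
# designers plug their own analysis modules in, and the core never
# hard-codes any particular module.

class DesignEnvironment:
    def __init__(self):
        self._modules = {}

    def register(self, name, module):
        """Add-on extensions plug in here."""
        self._modules[name] = module

    def run(self, name, *args):
        return self._modules[name](*args)

# A designer's own add-on: a toy wing-loading estimate (illustrative only).
def wing_loading(weight_n, wing_area_m2):
    return weight_n / wing_area_m2

env = DesignEnvironment()
env.register("wing_loading", wing_loading)
result = env.run("wing_loading", 500000.0, 122.6)   # N / m^2
```

    Because modules are looked up by name at run time, each designer can assemble a personal toolchain from the same core, which is the extensibility claim the abstract makes.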

  20. Center-blocked field technique for treatment of extensive chest wall disease

    Energy Technology Data Exchange (ETDEWEB)

    Podgorsak, E.B.; Pla, M.; Kim, T.H.; Freeman, C.R.

    1981-10-01

    Our treatment technique for patients with extensive chest wall disease is presented. A rotational center-blocked radiation field is used to cover the large tumor volume to a dose uniform within ±10% while sparing the lungs and the spinal cord. The center block is tapered to match both the patient's mediastinal slope in the sagittal plane and the outline of the lungs in the coronal plane. Ten patients treated with this technique to a tumor dose of 50 Gy tolerated the treatment well, despite a high integral dose. The local responses were excellent, particularly in view of the initial extent of the disease.

  1. Anesthetic management of peripartum cardiomyopathy using "epidural volume extension" technique: A case series

    Directory of Open Access Journals (Sweden)

    Akhilesh Kumar Tiwari

    2012-01-01

    Full Text Available Peripartum cardiomyopathy is a rare cause of dilated cardiomyopathy in parturients, occurring in approximately one in 1000 deliveries and manifesting during the last few months of pregnancy or the first 5 months of the postpartum period. It can result in severe ventricular dysfunction during the late puerperium. The major concern while managing these patients is to optimize fluid administration and avoid myocardial depression while maintaining stable intraoperative hemodynamics. We present a case series of five parturients who were scheduled for elective cesarean section and managed successfully with the epidural volume extension technique.

  2. Decomposition Techniques and Effective Algorithms in Reliability-Based Optimization

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1995-01-01

    The common problem of an extensive number of limit state function calculations in the various formulations and applications of reliability-based optimization is treated. It is suggested to use a formulation based on decomposition techniques so the nested two-level optimization problem can be solved...

  3. THE PRIMER EXTENSION TECHNIQUE FOR THE POLYMORPHISM DETECTION AT OVINE PRN-P LOCUS

    Directory of Open Access Journals (Sweden)

    COSIER VIORICA

    2008-01-01

    Full Text Available Scrapie is a prion disease with an endemic character in many parts of the globe, and control measures are difficult to apply because of the long incubation period, the lack of preclinical manifestations, and the limitations of the existing diagnostic tests for living animals. The Prn-p locus is polymorphic, with known variability at codons 136, 154, and 171, which are associated with different susceptibility in experimental and natural spongiform encephalopathies. In general, the possible combinations of the 5 amino acids encoded by the 3 different codons determine the existence of 15 possible genotypes. To reveal these polymorphisms at the ovine Prn-p locus, several methods have been developed, but the most accurate assays are direct sequencing of the gene and the primer extension technique. The purpose of this study was to determine the genotypes at the PrP locus in 123 males of the Tsurcana breed, Hateg ecotype, using the primer extension technique (ABI 3130xl Genetic Analyzer) and to establish the risk groups for susceptibility to scrapie.

  5. Promoting Community-based Extension Agents as an Alternative Approach to Formal Agricultural Extension Service Delivery in Northern Ghana

    Directory of Open Access Journals (Sweden)

    Samuel Z. Bonye

    2012-03-01

    Full Text Available The CBEA concept is an alternative community-based extension intervention aimed at addressing the inadequacy of formal extension service provision to rural poor farmers in the Northern Regions of Ghana. The study sought to find out the extent to which the Community-Based Extension Agent (CBEA) approach has improved access to extension services for rural farmers. The study used qualitative and quantitative methods such as focus group discussions, key informant and in-depth interviews, and household and institutional questionnaires to collect and analyse data. The findings are that: there are vibrant Community-Based Extension Agents providing extension services on crop, livestock, and environmental issues in the study district; farmer groups are linked to external agents and other stakeholders for access to credit facilities; the CBEAs were found to be the main link between the community and external agents; the most dominant extension services delivered by the CBEAs across the study district were in crop production, livestock production, and bushfire management; there are well-established criteria for selecting Community-Based Extension Agents; and Community-Based Extension Agents were the least motivated. The study recommends, among others, that motivation packages such as bicycles would facilitate the movement of CBEAs to reach the majority of farmers. There is also a need to link CBEAs to relevant institutions and organizations for support, and to establish mechanisms to generate funds to support their activities. Finally, stakeholders and organizations need to intensify community sensitization and awareness creation about the activities of CBEAs.

  7. Support vector machines optimization based theory, algorithms, and extensions

    CERN Document Server

    Deng, Naiyang; Zhang, Chunhua

    2013-01-01

    Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs): classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which SVMs are built. The authors share insight on many of their research achievements. They give a precise interpretation of statistical learning theory for C-support vector classification. They also discuss regularized twi

  8. The Search for Extension: 7 Steps to Help People Find Research-Based Information on the Internet

    Science.gov (United States)

    Hill, Paul; Rader, Heidi B.; Hino, Jeff

    2012-01-01

    For Extension's unbiased, research-based content to be found by people searching the Internet, it needs to be organized in a way conducive to the ranking criteria of a search engine. With proper web design and search engine optimization techniques, Extension's content can be found, recognized, and properly indexed by search engines and…

  9. Project Milestone. Analysis of Range Extension Techniques for Battery Electric Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, Jeremy [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Pesaran, Ahmad [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2013-07-01

    This report documents completion of the July 2013 milestone as part of NREL’s Vehicle Technologies Annual Operating Plan with the U.S. Department of Energy. The objective was to perform analysis on range extension techniques for battery electric vehicles (BEVs). This work represents a significant advancement over previous thru-life BEV analyses using NREL’s Battery Ownership Model, FastSim,* and DRIVE.* Herein, the ability of different charging infrastructure to increase achievable travel of BEVs in response to real-world, year-long travel histories is assessed. Effects of battery and cabin thermal response to local climate, battery degradation, and vehicle auxiliary loads are captured. The results reveal the conditions under which different public infrastructure options are most effective, and encourage continued study of fast charging and electric roadway scenarios.

  10. An Extension of the Fuzzy Possibilistic Clustering Algorithm Using Type-2 Fuzzy Logic Techniques

    Directory of Open Access Journals (Sweden)

    Elid Rubio

    2017-01-01

    Full Text Available In this work an extension of the Fuzzy Possibilistic C-Means (FPCM) algorithm using Type-2 Fuzzy Logic Techniques is presented, in order to improve the efficiency of the FPCM algorithm. With the purpose of observing the performance of the proposal against the Interval Type-2 Fuzzy C-Means algorithm, several experiments were carried out using both algorithms with well-known datasets, such as Wine, WDBC, Iris Flower, Ionosphere, Abalone, and Cover type. In addition, some experiments were performed using another set of test images to observe the behavior of both of the above-mentioned algorithms in image preprocessing. Comparisons are made between the proposed algorithm and the Interval Type-2 Fuzzy C-Means (IT2FCM) algorithm to determine whether the proposed approach performs better.
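The FPCM updates underlying this abstract can be sketched in a few lines. The following is a minimal type-1 FPCM (not the type-2 extension the paper proposes); the function name, the deterministic initialization, and the parameter defaults are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fpcm(X, c=2, m=2.0, eta=2.0, iters=50):
    """Minimal type-1 Fuzzy Possibilistic C-Means sketch: memberships U sum
    to 1 over clusters, typicalities T sum to 1 over data points."""
    X = np.asarray(X, dtype=float)
    n = len(X)
    # spread initial centers along the first feature (simple deterministic init)
    V = X[np.argsort(X[:, 0])[np.linspace(0, n - 1, c).astype(int)]].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12  # (c, n)
        U = d ** (-2.0 / (m - 1.0))
        U /= U.sum(axis=0, keepdims=True)            # fuzzy memberships
        T = d ** (-2.0 / (eta - 1.0))
        T /= T.sum(axis=1, keepdims=True)            # possibilistic typicalities
        w = U ** m + T ** eta
        V = (w @ X) / w.sum(axis=1, keepdims=True)   # center update
    return V, U, T
```

On two well-separated blobs the recovered centers land on the blob means, which is the behavior the comparisons in the abstract quantify on real datasets.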

  11. Reach Extension and Capacity Enhancement of VCSEL-Based Transmission Over Single-Lane MMF Links

    DEFF Research Database (Denmark)

    Tatarczak, Anna; Motaghiannezam, S. M. Reza; Kocot, Chris

    2017-01-01

    This paper reviews and examines several techniques for expanding the carrying capacity of multimode fiber (MMF) using vertical cavity surface emitting lasers (VCSELs). The first approach utilizes short wavelength division multiplexing in combination with MMF optimized for operation between 850...... of a standard OM3 MMF to more than 2.1 GHz·km for standard MMF is presented. A statistical model is used to predict the bandwidth enhancement of installed MMF and indicates that significant link extension can be achieved using selective modal launch techniques. These results demonstrate the continued...... effectiveness of VCSEL-based MMF links in current and future data center environments....

  12. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike;

    2012-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\\mathbb{F}_{q}$ “in the exponent” of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R...... obtain, in fact, an infinite hierarchy of progressively weaker assumptions whose complexities lie “between” DDH and CDH. This leads to a large number of new schemes because virtually all known DDH-based constructions can very easily be upgraded to be based on d-DDH. We use the same construction...... and security proof but get better security and moreover, the amortized complexity (e.g, computation per encrypted bit) is the same as when using DDH. We also show that d-DDH, just like DDH, is easy in bilinear groups. We therefore suggest a different type of assumption, the d-vector DDH problems (d...

  13. Nontraditional manufacturing technique-Nano machining technique based on SPM

    Institute of Scientific and Technical Information of China (English)

    DONG; Shen; YAN; Yongda; SUN; Tao; LIANG; Yingchun; CHENG

    2004-01-01

    Nano machining based on SPM is a novel, nontraditional advanced manufacturing technique. There are three main machining methods based on SPM, i.e. single-atom manipulation, surface modification using physical or chemical actions, and mechanical scratching. The current development of this technique is summarized. Based on an analysis of the mechanical scratching mechanism, a 5 μm micro inflation hole is fabricated on the surface of an inertial confinement fusion (ICF) target, and the processing technique is optimized. The machining properties of a brittle material, single-crystal Ge, are investigated. A micro machining system combining SPM and a high-accuracy stage is developed, and some 2D and 3D microstructures are fabricated using the system. This method has broad applications in the field of nano machining.

  14. Comparison of Three Techniques to Monitor Bathymetric Evolution in a Spatially Extensive, Rapidly Changing Environment

    Science.gov (United States)

    Rutten, J.; Ruessink, G.

    2014-12-01

    The wide variety in spatial and temporal scales inherent to nearshore morphodynamics, together with site-specific environmental characteristics, complicate our current understanding and predictive capability of large (~ km)-scale, long-term (seasons-years) sediment transport patterns and morphologic evolution. The monitoring of this evolution at all relevant scales demands a smart combination of multiple techniques. Here, we compare depth estimates derived from operational optical (Argus video) and microwave (X-band radar) remote sensing with those from jet-ski echo-sounding in an approximately 2.5 km2 large region at the Sand Engine, a 20 Mm3 mega-nourishment at the Dutch coast. Using depth inversion techniques based on linear wave theory, frequent (hourly-daily) bathymetric maps were derived from instantaneous Argus video and X-band radar imagery. Jet-ski surveys were available every 2 to 3 months. Depth inversion on Argus imagery overestimates surveyed depths by up to 0.5 m in shallow water and deviates by up to 1 m in deeper water (> 5 m). Averaged over the entire subtidal study area, the errors canceled in volumetric budget computations. Additionally, estimates of shoreline and subtidal sandbar positions were derived from Argus imagery and jet-ski surveys. Sandbar crest positions extracted from daily low-tide time-exposure Argus images reveal a persistent onshore offset of some 20 m, but do show the smaller temporal variability not visible from jet-ski surveys. Potential improvements to the applied depth-inversion technique will be discussed.
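Depth inversion from linear wave theory, as used for the Argus and X-band estimates above, amounts to inverting the dispersion relation ω² = g k tanh(k h) for the depth h given an observed wavelength and wave period. A minimal sketch (the function name and inputs are illustrative, not from the paper):

```python
import math

def invert_depth(wavelength_m, period_s, g=9.81):
    """Estimate local water depth h by inverting the linear dispersion
    relation omega^2 = g * k * tanh(k * h)."""
    k = 2.0 * math.pi / wavelength_m          # wavenumber
    omega = 2.0 * math.pi / period_s          # angular frequency
    ratio = omega ** 2 / (g * k)              # equals tanh(k * h)
    if ratio >= 1.0:
        return math.inf                       # deep-water limit: depth unresolvable
    return math.atanh(ratio) / k
```

In practice the wavelength and period are extracted from time-stacks of video or radar imagery, so the accuracy of the inverted depth inherits the errors of that wave-phase estimation, which is one source of the biases reported above.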

  15. Extensibility in Model-Based Business Process Engines

    Science.gov (United States)

    Sánchez, Mario; Jiménez, Camilo; Villalobos, Jorge; Deridder, Dirk

    An organization’s ability to embrace change greatly depends on the systems that support its operation. Specifically, process engines might facilitate or hinder changes, depending on their flexibility, their extensibility and the changes required: current workflow engine characteristics create difficulties in organizations that need to incorporate some types of modifications. In this paper we present Cumbia, an extensible MDE platform to support the development of flexible and extensible process engines. In a Cumbia process, models represent participating concerns (control, resources, etc.), which are described with concern-specific languages. Cumbia models are executed in a coordinated way, using extensible engines specialized for each concern.

  16. Wavelet-based embedded zerotree extension to color coding

    Science.gov (United States)

    Franques, Victoria T.

    1998-03-01

    Recently, a new image compression algorithm was developed which employs the wavelet transform and a simple binary linear quantization scheme with an embedded coding technique to perform data compaction. This new family of coders, Embedded Zerotree Wavelet (EZW), provides better compression performance than the current JPEG coding standard at low bit rates. Since the EZW coding algorithm emerged, all of the published coding results related to this technique have been for monochrome images. In this paper the author has enhanced the original coding algorithm to yield a better compression ratio, and has extended the wavelet-based zerotree coding to color images. Color imagery is often represented by several components, such as RGB, in which each component is generally processed separately. With color coding, each component could be compressed individually in the same manner as a monochrome image, therefore requiring a threefold increase in processing time. Most image coding standards instead employ de-correlated components, such as YIQ or Y, Cb, Cr, and subsampling of the 'chroma' components; such a coding technique is employed here. Results of the coding, including reconstructed images and coding performance, will be presented.
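The de-correlated colour representation mentioned above (Y, Cb, Cr plus chroma subsampling) can be sketched as follows. The BT.601 coefficients are the standard ones, but their use here is an assumption, since the abstract does not specify the exact transform employed:

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Full-range BT.601 RGB -> (Y, Cb, Cr) as float arrays."""
    r = img[..., 0].astype(float)
    g = img[..., 1].astype(float)
    b = img[..., 2].astype(float)
    y  =          0.299    * r + 0.587    * g + 0.114    * b
    cb = 128.0 -  0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128.0 +  0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def subsample(chan):
    """4:2:0-style chroma subsampling: average each 2x2 block."""
    h, w = chan.shape
    return chan[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
```

Each resulting plane (full-resolution Y, quarter-resolution Cb and Cr) would then be fed to the zerotree coder as if it were a monochrome image.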

  17. Service-based extensions to the JDL fusion model

    Science.gov (United States)

    Antony, Richard T.; Karakowski, Joseph A.

    2008-04-01

    Extensions to a previously developed service-based fusion process model are presented. The model accommodates (1) traditional sensor data and human-generated input, (2) streaming and non-streaming data feeds, and (3) the fusion of both physical and non-physical entities. More than a dozen base-level fusion services are identified. These services provide the foundational functional decomposition of levels 0-2 of the JDL fusion model. Concepts such as clustering, link analysis and database mining, which have traditionally been only loosely associated with the fusion process, are shown to play key roles within this fusion framework. Additionally, the proposed formulation extends the concepts of tracking and cross-entity association to non-physical entities, as well as supports effective exploitation of a priori and derived context knowledge. Finally, the proposed framework is shown to support set theoretic properties, such as equivalence and transitivity, as well as the development of a pedigree summary metric that characterizes the informational distance between individual fused products and source data.

  18. [Endoscopic surgery and reconstruction for extensive osteoradionecrosis of skull base after radiotherapy for nasopharyngeal carcinoma].

    Science.gov (United States)

    Chen, Z; Qiu, Q H; Zhan, J B; Zhu, Z C; Peng, Y; Liu, H

    2016-12-07

    Objective: To investigate the clinical efficacy of endoscopic surgery for extensive osteoradionecrosis (ORN) of the skull base in patients with nasopharyngeal carcinoma (NPC) after radiotherapy. Methods: Seventeen patients diagnosed with ORN of the skull base after radiotherapy for NPC who underwent endoscopic surgery were retrospectively studied with their clinical data. Results: Based on CT and endoscopic examination, all patients had large skull base defects, with bone defects averaging 7.02 cm² (range, 3.60-14.19 cm²). In addition to curetting the sequestra, endoscopic surgery was also used in 12 patients to repair the wound or to protect the internal carotid artery with a flap. No bone reconstruction was conducted in any patient with bone defects of the skull base. CT examinations were taken after endoscopic surgery when required. The postoperative follow-up ranged from 8 months to 6 years (average, 14 months). Aside from 1 patient with delayed cerebrospinal fluid (CSF) leakage, the others had no related complications. Conclusions: Patients with extensive ORN can be treated with endoscopic surgery to curette the necrotic bone of the skull base, and endoscopic reconstruction provides an alternative technique. It may not be necessary to reconstruct the bone defects of the skull base; however, exposed important structures of the skull base, such as the internal carotid artery, need to be repaired with soft tissue such as a flap.

  19. Combined surgical and catheter-based treatment of extensive thoracic aortic aneurysm and aortic valve stenosis

    DEFF Research Database (Denmark)

    De Backer, Ole; Lönn, Lars; Søndergaard, Lars

    2015-01-01

    An extensive thoracic aortic aneurysm (TAA) is a potentially life-threatening condition and remains a technical challenge to surgeons. Over the past decade, repair of aortic arch aneurysms has been accomplished using both hybrid (open and endovascular) and totally endovascular techniques. Thoracic endovascular aneurysm repair (TEVAR) has changed and extended management options in thoracic aorta disease, including in those patients deemed unfit or unsuitable for open surgery. Accordingly, transcatheter aortic valve replacement (TAVR) is increasingly used to treat patients with symptomatic severe aortic valve stenosis (AS) who are considered at high risk for surgical aortic valve replacement. In this report, we describe the combined surgical and catheter-based treatment of an extensive TAA and AS. To our knowledge, this is the first report of hybrid TAA repair combined with TAVR.

  20. Using the Delphi Technique to Assess Educational Needs Related to Extension's 4-H Beef Program.

    Science.gov (United States)

    Shih, Ching-Chun; Gamon, Julia A.

    1997-01-01

    Delphi panels completing questionnaires included 32 parents of 4-H students, 16 extension beef specialists, 21 4-H field specialists, and 21 industry representatives. They identified 31 subject-matter and 30 life-skill topics useful for 4-H manuals. Emerging topics included consumer and environmental concerns. (SK)

  1. A Conformal Extension Theorem based on Null Conformal Geodesics

    CERN Document Server

    Lübbe, Christian

    2008-01-01

    In this article we describe the formulation of null geodesics as null conformal geodesics and their description in the tractor formalism. A conformal extension theorem through an isotropic singularity is proven by requiring the boundedness of the tractor curvature and its derivatives to sufficient order along a congruence of null conformal geodesics. This article extends earlier work by Tod and Lübbe.

  2. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

Tag cloud is one of the navigation aids for exploring documents. Tag clouds also link documents through the user defined terms. We explore various graph based techniques to improve the tag cloud generation. Moreover, we introduce relevance measures based on underlying data such as ratings or citations...
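One simple graph based scoring of tags, in the spirit of this abstract (the exact graph techniques and relevance measures used by the authors are not given here), is weighted degree in a tag co-occurrence graph:

```python
from collections import defaultdict
from itertools import combinations

def top_tags(tagged_docs, k=3):
    """Build a tag co-occurrence graph from documents' tag sets, score each
    tag by its weighted degree, and return the k highest-scoring tags."""
    weight = defaultdict(int)                      # edge weights
    for tags in tagged_docs:
        for a, b in combinations(sorted(set(tags)), 2):
            weight[(a, b)] += 1                    # tags co-occur in a document
    degree = defaultdict(int)
    for (a, b), w in weight.items():               # weighted degree per tag
        degree[a] += w
        degree[b] += w
    ranked = sorted(degree.items(), key=lambda kv: (-kv[1], kv[0]))
    return [t for t, _ in ranked[:k]]
```

More elaborate variants would replace weighted degree with centrality measures or fold in the underlying ratings and citations as edge weights.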

  3. Extensive Characterisation of Copper-clad Plates, Bonded by the Explosive Technique, for ITER Electrical Joints

    CERN Document Server

    Langeslag, S A E; Libeyre, P; Gung, C Y

    2015-01-01

    Cable-in-conduit conductors will be extensively implemented in the large superconducting magnet coils foreseen to confine the plasma in the ITER experiment. The design of the various magnet systems imposes the use of electrical joints to connect unit lengths of superconducting coils by inter-pancake coupling. These twin-box lap type joints, produced by compacting each cable end into a copper - stainless steel bimetallic box, are required to be highly performing in terms of electrical and mechanical properties. To ascertain the suitability of the first copper-clad plates, recently produced, the performance of several plates is studied. Validation of the bonded interface is carried out by determining microstructural, tensile and shear characteristics. These measurements confirm the suitability of explosion bonded copper-clad plates for an overall joint application. Additionally, an extensive study is conducted on the suitability of certain copper purity grades for the various joint types.

  4. Extensive Generalization of Statistical Mechanics Based on Incomplete Information Theory

    Directory of Open Access Journals (Sweden)

    Qiuping A. Wang

    2003-06-01

    Full Text Available Statistical mechanics is generalized on the basis of an additive information theory for incomplete probability distributions. The incomplete normalization is used to obtain a generalized entropy. The concomitant incomplete statistical mechanics is applied to some physical systems in order to show the effect of the incompleteness of information. It is shown that this extensive generalized statistics can be useful for correlated electron systems in the weak coupling regime.

  5. Rapid high-throughput genotyping of HBV DNA using a modified hybridization-extension technique.

    Science.gov (United States)

    Bao, Han; Zhao, Wenliang; Ruan, Banjun; Wang, Qing; Zhao, Jinrong; Lei, Xiaoying; Wang, Weihua; Liu, Yonglan; Sun, Jianbing; Xiang, An; Guo, Yanhai; Yan, Zhen

    2013-11-07

    China has the highest incidence of hepatitis B virus (HBV) infection worldwide. HBV genotypes have variable impacts on disease pathogenesis and drug tolerance. We have developed a technically simple and accurate method for HBV genotyping that will be applicable to pre-treatment diagnosis and individualized treatment. Multiple sequence alignments of HBV genomes from GenBank were used to design primers and probes for genotyping of HBV A through H. The hybridization was carried out on nitrocellulose (NC) membranes with probes fixed in an array format, which was followed by hybrid amplification by an extension step with DNA polymerase to reinforce the double-stranded DNA hybrids on the NC membrane and subsequent visualization using an avidin-biotin system. Genotyping results were confirmed by DNA sequencing and bioinformatics analysis using the National Center for Biotechnology Information genotyping database, and compared with results from the line probe assay. The data show that multiple sequence alignment defined a 630 bp region in the HBV PreS and S regions that was suitable for genotyping. All genotyping significant single nucleotides in the region were defined. Two-hundred-and-ninety-one HBV-positive serum samples from Northwest Chinese patients were genotyped, and the genotyping rate from the new modified hybridization-extension method was 100% compared with direct sequencing. Compared with line probe assay, the newly developed method is superior, featuring reduced reaction time, lower risk of contamination, and increased accuracy for detecting single nucleotide mutation. In conclusion, a novel hybridization-extension method for HBV genotyping was established, which represents a new tool for accurate and rapid SNP detection that will benefit clinical testing.
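A small sketch of the alignment step described above: selecting the columns where the genotype reference sequences disagree yields candidate positions for genotyping probes. This is illustrative only (function name and data are hypothetical); the paper's actual primer and probe design pipeline is considerably more involved:

```python
def informative_sites(aligned):
    """Given genotype -> aligned reference sequence (equal lengths),
    return the 0-based columns where at least two genotypes disagree;
    such columns are candidate probe positions for genotyping."""
    seqs = list(aligned.values())
    length = len(seqs[0])
    assert all(len(s) == length for s in seqs), "sequences must be aligned"
    return [i for i in range(length) if len({s[i] for s in seqs}) > 1]
```

Real probe design would additionally filter these sites by melting temperature, secondary structure, and uniqueness within the 630 bp PreS/S region identified in the study.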

  6. Some Novel Solidification Processing Techniques Being Investigated at MSFC: Their Extension for Study Aboard the ISS

    Science.gov (United States)

    Grugel, R. N.; Anilkumar, A. V.; Fedoseyev, A. I.; Mazuruk, K.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The float-zone and Bridgman techniques are two classical directional solidification processing methods used to improve materials properties. Unfortunately, buoyancy effects and gravity-driven convection due to unstable temperature and/or composition gradients still produce solidified products that exhibit segregation and, consequently, degraded properties. This presentation will briefly introduce how some novel processing applications can minimize detrimental gravitational effects and enhance microstructural uniformity. Discussion follows on how fully understanding and modeling these procedures requires utilizing, in conjunction with a novel mixing technique, the facilities and quiescent microgravity environment available on the ISS.

  7. Flow Modeling Based Wall Element Technique

    Directory of Open Access Journals (Sweden)

    Sabah Tamimi

    2012-08-01

    Full Text Available Two types of flow were examined: pressure-driven flow and a combination of pressure-driven and Couette flow of confined turbulent flow, with a one-equation model used to depict the turbulent viscosity of confined flow in a smooth straight channel. A finite element technique based on a zone close to a solid wall was adopted for predicting the distribution of the pertinent variables in this zone, and was examined even in the case when the near-wall zone was extended away from the wall. The validity of the imposed technique has been tested and compares well with other techniques.

  8. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seyong [ORNL; Vetter, Jeffrey S [ORNL

    2014-01-01

    Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution for programming emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity of SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-source OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while serving as a high-level research framework.

  9. A Shape Based Image Search Technique

    Directory of Open Access Journals (Sweden)

    Aratrika Sarkar

    2014-08-01

    Full Text Available This paper describes an interactive application we have developed based on a shape-based image retrieval technique. The key concepts described in the project are: (i) matching of images based on contour matching; (ii) matching of images based on edge matching; (iii) matching of images based on pixel matching of colours. Further, the application facilitates matching of images invariant to transformations such as (i) translation; (ii) rotation; (iii) scaling. The key feature of the system is that it graphically shows the percentage of the uploaded image that is unmatched with respect to the images already existing in the database, whereas the integrity of the system lies in the unique matching techniques used for optimum results. This increases the accuracy of the system. For example, when a user uploads an image, say an image of a mango leaf, the application shows all mango leaves present in the database as well as other leaves matching the colour and shape of the uploaded mango leaf.
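The pixel-matching idea and the "percentage unmatched" score can be illustrated minimally as below; the application's real matching is also contour- and edge-based and transformation-invariant, so this sketch only covers the simplest of the three matching modes:

```python
def percent_unmatched(img_a, img_b):
    """Pixel-matching score: percentage of positions whose values differ,
    for two equal-size images given as 2-D lists of pixel values."""
    total = diff = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            diff += (pa != pb)      # bool counts as 0/1
    return 100.0 * diff / total
```

A retrieval front end would compute this score between the uploaded image and every database image, then rank results by ascending percentage unmatched.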

  10. Extensive Test of an SU(3)-based Partial Dynamical Symmetry

    Science.gov (United States)

    Casten, R. F.

    2014-09-01

    The concept of symmetries pervades much of our understanding of nature. In nuclear structure, the IBA embodies a framework with three dynamical symmetries U(5), O(6) and SU(3). Of course, most nuclei break these symmetries. Leviatan has discussed a concept of Partial Dynamical Symmetry (PDS) in which the states of the ground and gamma bands, only, are exactly described by SU(3) while all others are not. With an E2 operator which is not a generator of SU(3), this PDS gives a parameter-free description of γ to ground band relative B(E2) values in 168Er that is virtually identical to the best collective model (IBA) calculations with 2-3 parameters. We have carried out the first extensive study of this PDS, in 47 rare earth nuclei. Overall, the PDS works very well, and the deviations from the data are usually understandable in terms of specific kinds of mixing.

  11. Development of CDMS-II Surface Event Rejection Techniques and Their Extensions to Lower Energy Thresholds

    Energy Technology Data Exchange (ETDEWEB)

    Hofer, Thomas James [Univ. of Minnesota, Minneapolis, MN (United States)

    2014-12-01

    The CDMS-II phase of the Cryogenic Dark Matter Search, a dark matter direct-detection experiment, was operated at the Soudan Underground Laboratory from 2003 to 2008. The full payload consisted of 30 ZIP detectors, totaling approximately 1.1 kg of Si and 4.8 kg of Ge, operated at temperatures of 50 mK. The ZIP detectors read out both ionization and phonon pulses from scatters within the crystals; channel segmentation and analysis of pulse timing parameters allowed effective fiducialization of the crystal volumes and background rejection sufficient to set world-leading limits at the times of their publications. A full re-analysis of the CDMS-II data was motivated by an improvement in the event reconstruction algorithms which improved the resolution of ionization energy and timing information. The Ge data were re-analyzed using three distinct background-rejection techniques; the Si data from runs 125 - 128 were analyzed for the first time using the most successful of the techniques from the Ge re-analysis. The results of these analyses prompted a novel "mid-threshold" analysis, wherein energy thresholds were lowered but background rejection using phonon timing information was still maintained. This technique proved to have significant discrimination power, maintaining adequate signal acceptance and minimizing background leakage. The primary background for CDMS-II analyses comes from surface events, whose poor ionization collection makes them difficult to distinguish from true nuclear recoil events. The novel detector technology of SuperCDMS, the successor to CDMS-II, uses interleaved electrodes to achieve full ionization collection for events occurring at the top and bottom detector surfaces. This, along with dual-sided ionization and phonon instrumentation, allows for excellent fiducialization and relegates the surface-event rejection techniques of CDMS-II to a secondary level of background discrimination. Current and future SuperCDMS results hold great promise for mid- to low

  12. Extension and application of a scaling technique for duplication of in-flight aerodynamic heat flux in ground test facilities

    NARCIS (Netherlands)

    Veraar, R.G.

    2009-01-01

    To enable direct experimental duplication of the in-flight heat flux distribution on supersonic and hypersonic vehicles, an aerodynamic heating scaling technique has been developed. The scaling technique is based on the analytical equations for convective heat transfer for laminar and turbulent bound

  13. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is then used. A DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge of the possible region segmentation for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
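The initial K-means/minimum-distance step mentioned above can be sketched on gray levels alone. This is a generic 1-D K-means, not the authors' implementation, with function name and defaults chosen for illustration:

```python
import numpy as np

def kmeans_gray(pixels, k=2, iters=20, seed=0):
    """1-D K-means on gray levels: returns cluster centers and a
    minimum-distance label per pixel (the initial segmentation step)."""
    x = np.asarray(pixels, dtype=float).ravel()
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)   # init from data values
    for _ in range(iters):
        # assign each pixel to its nearest center (minimum distance)
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):                           # recompute centers
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return centers, labels
```

The resulting label image would then be refined by the MRF region process and the watershed step described in the abstract.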

  14. Rapid Tooling Technique Based on Stereolithograph Prototype

    Institute of Scientific and Technical Information of China (English)

    丁浩; 狄平; 顾伟生; 朱世根

    2001-01-01

    Rapid tooling technique based on the sterelithograph prototype is investigated. The epoxy tooling technological process was elucidated. It is analyzed in detail that the epoxy resin formula is easy to cast, curing process, and release agents. The transitional plaster model is also proposed. The mold to encrust mutual.inductors with epoxy and mold to inject plastic soapboxes was made with the technique The tooling needs very little time and cost, for the process is only to achieve the nice replica of the prototype. It is benefit for the trial and small batch of production.

  15. Language Based Techniques for Systems Biology

    DEFF Research Database (Denmark)

    Pilegaard, Henrik

    calculi have similarly been used for the study of bio-chemical reactive systems. In this dissertation it is argued that techniques rooted in the theory and practice of programming languages, language based techniques if you will, constitute a strong basis for the investigation of models of biological systems, e.g., the effects of receptor defects or drug delivery mechanisms. The property of sequential realisability, which is closely related to the function of biochemical pathways, is addressed by a variant of traditional Data Flow Analysis (DFA). This so-called ‘Pathway Analysis’ computes safe approximations to the set

  16. Extensive Taguchi's Quality Loss Function Based On Asymmetric tolerances

    Institute of Scientific and Technical Information of China (English)

    ZHU Wei; LI Yuan-sheng; LIU Feng

    2004-01-01

    If the specification interval is asymmetric, the basic specification is the target value of the quality characteristic. In this paper, Taguchi's quality loss function is applied to describe quality loss based on asymmetric tolerances. A measure of the quality loss caused by the deviation of the quality characteristic from the basic specification is further presented.
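
    The asymmetric loss described above is commonly written as a piecewise quadratic with a separate coefficient on each side of the target; the sketch below uses the standard convention of calibrating each coefficient so that the loss equals a repair cost A at the tolerance limit (the numbers are illustrative, not taken from the paper).

```python
def asymmetric_loss(y, target, delta_lower, delta_upper, a_lower, a_upper):
    """Taguchi-style quadratic quality loss with asymmetric tolerances.

    Each coefficient is chosen so the loss equals the repair cost A at
    its tolerance limit: k = A / delta**2 (an illustrative convention).
    """
    k1 = a_lower / delta_lower ** 2
    k2 = a_upper / delta_upper ** 2
    dev = y - target
    return k1 * dev ** 2 if dev < 0 else k2 * dev ** 2

# target 10.0 with tolerances -0.2/+0.5 and repair cost 4 at each limit
print(asymmetric_loss(9.8, 10.0, 0.2, 0.5, 4.0, 4.0))   # ~4.0 at lower limit
print(asymmetric_loss(10.5, 10.0, 0.2, 0.5, 4.0, 4.0))  # ~4.0 at upper limit
```

Note that the two branches rise at different rates: the tighter lower tolerance gives a steeper loss curve below the target than above it.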

  17. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    Science.gov (United States)

    Gaydos, Leonard

    1978-01-01

    Computer-aided techniques for interpreting multispectral data acquired by Landsat offer economies in the mapping of land cover. Even so, the actual establishment of the statistical classes, or "signatures," is one of the relatively more costly operations involved. Analysts have therefore been seeking cost-saving signature extension techniques that would accept training data acquired for one time or place and apply them to another. Opportunities to extend signatures occur in preprocessing steps and in the classification steps that follow. In the present example, land cover classes were derived by the simplest and most direct form of signature extension: Classes statistically derived from a Landsat scene for the Puget Sound area, Wash., were applied to the Portland area, Oreg., using data for the next Landsat scene acquired less than 25 seconds down orbit. Many features can be recognized on the reduced-scale version of the Portland land cover map shown in this report, although no statistical assessment of its accuracy is available.

  18. Interactive early warning technique based on SVDD

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    After reviewing current research on early warning, it is found that "bad" data of some systems are not easy to obtain, which makes the methods proposed in that research unsuitable for the monitored systems. An interactive early warning technique based on SVDD (support vector data description) is proposed that adopts "good" data as samples to overcome the difficulty of obtaining "bad" data. The process consists of two parts: (1) a hypersphere is fitted on the "good" data using SVDD, and any data object falling outside the hypersphere is taken as "suspicious"; (2) a group of experts decides whether the suspicious data are "bad" or "good", and early warning messages are issued according to these decisions. The detailed implementation process is also given. Finally, an experiment based on data from a macroeconomic system is conducted to verify the proposed technique.
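
    The two-stage flow above (fit a boundary on "good" data, flag points outside it as "suspicious") can be illustrated with a drastically simplified stand-in for SVDD: a centroid-plus-radius description rather than the kernelized hypersphere optimization. The data and the quantile threshold are assumptions for illustration.

```python
import numpy as np

class SimpleDataDescription:
    """Crude stand-in for SVDD: centre = mean of the 'good' data,
    radius = a high quantile of the training distances."""
    def fit(self, X, quantile=0.95):
        self.centre = X.mean(axis=0)
        d = np.linalg.norm(X - self.centre, axis=1)
        self.radius = np.quantile(d, quantile)
        return self

    def suspicious(self, X):
        """True where a point falls outside the fitted hypersphere."""
        d = np.linalg.norm(X - self.centre, axis=1)
        return d > self.radius

rng = np.random.default_rng(1)
good = rng.normal(0.0, 1.0, size=(500, 2))   # "good" samples only
model = SimpleDataDescription().fit(good)
probe = np.array([[0.1, -0.2], [8.0, 8.0]])  # a normal point, an outlier
print(model.suspicious(probe))               # [False  True]
```

In the full scheme, the points flagged here would then be passed to the expert group for the "bad"/"good" decision.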

  19. MATRIX BASED INDEXING TECHNIQUE FOR VIDEO DATA

    Directory of Open Access Journals (Sweden)

    Devarj Saravanan

    2013-01-01

    Full Text Available With the increasing use of media, video plays a central role because it supports various applications. Video is a particular medium that contains a complex collection of objects such as audio, motion, text, color and pictures. Due to the rapid growth of this information, a video indexing process is mandatory for fast and effective retrieval. Many current indexing techniques fail to extract the needed image from the stored data set based on the user's query, so urgent attention in the field of video indexing and image retrieval is the need of the hour. Here a new matrix-based indexing technique for image retrieval has been proposed. Experimental results prove that the proposed method provides better results.

  20. MATRIX BASED INDEXING TECHNIQUE FOR VIDEO DATA

    OpenAIRE

    2013-01-01

    With the increasing use of media, video plays a central role because it supports various applications. Video is a particular medium that contains a complex collection of objects such as audio, motion, text, color and pictures. Due to the rapid growth of this information, a video indexing process is mandatory for fast and effective retrieval. Many current indexing techniques fail to extract the needed image from the stored data set based on the user's query. Urgent attention in the fi...

  1. Multiview video codec based on KTA techniques

    Science.gov (United States)

    Seo, Jungdong; Kim, Donghyun; Ryu, Seungchul; Sohn, Kwanghoon

    2011-03-01

    Multi-view video coding (MVC) is a video coding standard developed by MPEG and VCEG for multi-view video. It showed an average PSNR gain of 1.5 dB compared with view-independent coding by H.264/AVC. However, because the resolutions of multi-view video are getting higher for a more realistic 3D effect, a high-performance video codec is needed. MVC adopted the hierarchical B-picture structure and inter-view prediction as core techniques. The hierarchical B-picture structure removes temporal redundancy, and inter-view prediction reduces inter-view redundancy by compensated prediction from the reconstructed neighboring views. Nevertheless, MVC has an inherent limitation in coding efficiency, because it is based on H.264/AVC. To overcome this limit, an enhanced video codec for multi-view video based on the Key Technology Area (KTA) is proposed. KTA is a high-efficiency video codec by the Video Coding Experts Group (VCEG), developed for coding efficiency beyond H.264/AVC. The KTA software showed better coding gain than H.264/AVC by using additional coding techniques. These techniques and inter-view prediction are implemented in the proposed codec, which showed high coding gain compared with the view-independent coding result of KTA. The results show that inter-view prediction can achieve higher efficiency in a multi-view video codec based on a high-performance video codec such as HEVC.

  2. An extensible standards-based control system on a budget

    Science.gov (United States)

    Ford, John M.; Langston, Glen; Shelton, John; Weadon, Tim

    2006-06-01

    The National Radio Astronomy Observatory (NRAO) in Green Bank was charged with replacing and enhancing the original control system on the NRAO 43-Meter (43m) telescope, for a minimum amount of labor, time and materials. The original 1960's vintage design required continuous operator presence for monitoring and control of the telescope. A fully automated, unattended operation was desired, along with better tracking performance at high speeds and reduced maintenance costs. We responded with a design based on proven industrial control technology, RTAI/Linux computers, and hardware and software adapted from the GBT and other NRAO telescopes. Commercial off-the-shelf software packages were also used in the system. We describe the overall design of the system and the decision process that led to the adoption of the various pieces of hardware and software, including the tradeoffs made between buying and building systems, and allocation of telescope functions between subsystems.

  3. Extension of silo discharge model based on discrete element method

    Energy Technology Data Exchange (ETDEWEB)

    Oldal, Istvan; Safranyil, Ferenc [Szent Istvan University, Goedoelloe (Hungary)

    2015-09-15

    Silos are containers used by almost all fields of industry for storing granular materials and are generally classified into two types: mass flow and funnel flow. One of the most important design parameters of this equipment is the discharge rate, which depends on the flow mode. There is a large number of analytical and empirical models used to determine this parameter; however, none of them is suitable for both flow modes, and the accuracy of the mass flow models is not acceptable. Recently a few numerical discharge models have been built for certain geometries, but their applicability to different flow modes has not been examined. The aim of our work is to create an experimentally validated numerical discharge model based on earlier work and to examine it for different flow modes. We show that our modified model is suitable for determining silo discharge rates independently of the flow mode.
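
    For context, one of the classical empirical mass-flow models the abstract alludes to is the Beverloo correlation; a minimal sketch, with commonly quoted but illustrative constants and material values, is:

```python
import math

def beverloo_rate(bulk_density, orifice_d, particle_d, C=0.58, k=1.4, g=9.81):
    """Beverloo correlation for the mass discharge rate of a flat-bottomed
    silo, W = C * rho_b * sqrt(g) * (D - k*d)**2.5 in kg/s.
    C and k are empirical fitting constants."""
    effective = orifice_d - k * particle_d   # effective orifice diameter
    if effective <= 0:
        raise ValueError("orifice too small for this particle size")
    return C * bulk_density * math.sqrt(g) * effective ** 2.5

# wheat-like material: bulk density 800 kg/m3, 0.15 m orifice, 5 mm grains
print(round(beverloo_rate(800.0, 0.15, 0.005), 2))  # discharge rate in kg/s
```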

  4. Performance Evaluation of Extension Education Centers in Universities Based on the Balanced Scorecard

    Science.gov (United States)

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-01-01

    This study aims at developing a set of appropriate performance evaluation indices mainly based on balanced scorecard (BSC) for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts who have real practical experiences in extension education, adequate performance…

  5. Resection of giant ethmoid osteoma with orbital and skull base extension followed by duraplasty

    Directory of Open Access Journals (Sweden)

    Ferekidou Eliza

    2008-10-01

    Full Text Available Abstract Background Osteomas of the ethmoid sinus are rare, especially when they involve the anterior skull base and orbit and lead to ophthalmologic and neurological symptoms. Case presentation The present case describes a giant ethmoid osteoma. The patient's symptoms and signs were exophthalmos and proptosis of the left eye, with progressive visual acuity impairment and visual field defects. CT/MRI scanning demonstrated a huge osseous lesion of the left ethmoid sinus (6.5 cm × 5 cm × 2.2 cm), extending laterally into the orbit and cranially up to the anterior skull base. Bilateral extensive polyposis was also found. Endoscopic and external techniques were combined to remove the lesion. Bilateral endoscopic polypectomy, anterior and posterior ethmoidectomy and middle meatus antrostomy were performed. Finally, the remaining part of the tumor was reached and dissected from the surrounding tissue via a minimally invasive Lynch incision around the left middle canthus. During surgery, CSF rhinorrhea was observed and the leak was grafted with fascia lata and sealed with bio-glue. Postoperatively, the symptoms disappeared. Eighteen months after surgery, the patient is still free of symptoms. Conclusion Before management of ethmoid osteomas with intraorbital and skull base extension, a thorough neurological, ophthalmological and imaging evaluation is required, in order to define the borders of the tumor, carefully assess the severity of symptoms and signs, and precisely plan the optimal treatment. The endoscopic procedure can constitute an important part of surgery undertaken for giant ethmoidal osteomas. In addition, surgeons always have to take into account a possible CSF leak and must be prepared to resolve it.

  6. Design of extensible meteorological data acquisition system based on FPGA

    Science.gov (United States)

    Zhang, Wen; Liu, Yin-hua; Zhang, Hui-jun; Li, Xiao-hui

    2015-02-01

    In order to compensate for the tropospheric refraction error generated in satellite navigation and positioning, temperature, humidity and air pressure have to be fed into the relevant models to calculate the value of this error. The FPGA XC6SLX16 is used as the core processor, and the integrated silicon pressure sensor MPX4115A and the digital temperature-humidity sensor SHT75 are used as the basic meteorological parameter detection devices. The core processor controls the real-time sampling of the ADC AD7608 and acquires the serial output data of the SHT75. The data are stored in the BRAM of the XC6SLX16 and used to generate standard meteorological parameters in NMEA format. The whole design is based on the Altium hardware platform and the ISE software platform. The system is described in VHDL and as a schematic diagram to realize the correct detection of temperature, humidity and air pressure. The 8-channel synchronous sampling of the AD7608 and the programmable external resources of the FPGA lay the foundation for adding further analog or digital meteorological signals. The designed meteorological data acquisition system features low cost, high performance and easy expansion.
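
    Assuming the intended output format is NMEA 0183 (the abstract's "NEMA" is read here as a typo for NMEA), a sentence is a comma-separated body between "$" and "*", followed by a two-hex-digit checksum computed as the XOR of the body characters. A sketch, using a hypothetical XDR-style transducer sentence for the three meteorological parameters:

```python
def nmea_sentence(talker_type, fields):
    """Assemble an NMEA 0183 sentence: the checksum is the XOR of all
    characters between '$' and '*', appended as two hex digits."""
    body = ",".join([talker_type] + [str(f) for f in fields])
    checksum = 0
    for ch in body:
        checksum ^= ord(ch)
    return "${}*{:02X}".format(body, checksum)

# hypothetical transducer sentence carrying temperature (C), pressure (bar)
# and relative humidity, in the style of the XDR sentence type
print(nmea_sentence("WIXDR", ["C", 21.5, "C", "P", 1.013, "B", "H", 45.0, "P"]))
```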

  7. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike;

    2011-01-01

    generalized to use instead d-DDH, and we show in the generic group model that d-DDH is harder than DDH. This means that virtually any application of DDH can now be realized with the same (amortized) efficiency, but under a potentially weaker assumption. On the negative side, we also show that d-DDH, just like...... DDH, is easy in bilinear groups. This motivates our suggestion of a different type of assumption, the d-vector DDH problems (VDDH), which are based on f(X)= X^d, but with a twist to avoid the problems with reducible polynomials. We show in the generic group model that VDDH is hard in bilinear groups...... and that in fact the problems become harder with increasing d and hence form an infinite hierarchy. We show that hardness of VDDH implies CCA-secure encryption, efficient Naor-Reingold style pseudorandom functions, and auxiliary input secure encryption, a strong form of leakage resilience. This can be seen...

  8. XSemantic: An Extension of LCA Based XML Semantic Search

    Science.gov (United States)

    Supasitthimethee, Umaporn; Shimizu, Toshiyuki; Yoshikawa, Masatoshi; Porkaew, Kriengkrai

    One of the most convenient ways to query XML data is keyword search, because it does not require any knowledge of the XML structure or learning a new user interface. However, keyword search is ambiguous: users may use different terms to search for the same information, and it is difficult for a system to decide which node should be chosen as a return node and how much information should be included in the result. To address these challenges, we propose a keyword-based XML semantic search called XSemantic. On the one hand, we give three definitions to complete the semantics. First, with semantic term expansion, our system is robust to ambiguous keywords through the use of a domain ontology. Second, to return semantically meaningful answers, we automatically infer the return information from the user queries and exploit the shortest path to return meaningful connections between keywords. Third, we present a semantic ranking that reflects the degree of similarity as well as the semantic relationship, so that search results with higher relevance are presented to the users first. On the other hand, as in the LCA and proximity search approaches, we investigated the problem of how much information should be included in the search results. We therefore introduce the notion of the Lowest Common Element Ancestor (LCEA) and define a simple rule without any requirement for schema information such as a DTD or XML Schema. The first experiment indicated that XSemantic not only properly infers the return information but also generates compact, meaningful results. The benefits of our proposed semantics are further demonstrated by the second experiment.
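
    The lowest-common-ancestor computation underlying LCA-style XML search can be sketched directly on an element tree; the tiny document and queries below are illustrative.

```python
import xml.etree.ElementTree as ET

def lowest_common_ancestor(root, a, b):
    """Return the lowest common ancestor element of nodes a and b.
    ElementTree has no parent links, so a parent map is built first."""
    parent = {child: p for p in root.iter() for child in p}
    ancestors = set()
    node = a
    while node is not None:      # collect a and all its ancestors
        ancestors.add(node)
        node = parent.get(node)
    node = b
    while node not in ancestors:  # walk up from b until the sets meet
        node = parent[node]
    return node

doc = ET.fromstring(
    "<bib><book><title>XML</title><author>Ann</author></book>"
    "<book><title>DB</title></book></bib>")
t = doc.find("./book/title")
au = doc.find("./book/author")
print(lowest_common_ancestor(doc, t, au).tag)  # book
```

Keyword matches inside one book yield that book as the answer root, while matches spread across books escalate to the bibliography element.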

  9. New Power Quality Analysis Method Based on Chaos Synchronization and Extension Neural Network

    Directory of Open Access Journals (Sweden)

    Meng-Hui Wang

    2014-10-01

    Full Text Available A hybrid method comprising a chaos synchronization (CS-based detection scheme and an Extension Neural Network (ENN classification algorithm is proposed for power quality monitoring and analysis. The new method can detect minor changes in signals of the power systems. Likewise, prominent characteristics of system signal disturbance can be extracted by this technique. In the proposed approach, the CS-based detection method is used to extract three fundamental characteristics of the power system signal and an ENN-based clustering scheme is then applied to detect the state of the signal, i.e., normal, voltage sag, voltage swell, interruption or harmonics. The validity of the proposed method is demonstrated by means of simulations given the use of three different chaotic systems, namely Lorenz, New Lorenz and Sprott. The simulation results show that the proposed method achieves a high detection accuracy irrespective of the chaotic system used or the presence of noise. The proposed method not only achieves higher detection accuracy than existing methods, but also has low computational cost, an improved robustness toward noise, and improved scalability. As a result, it provides an ideal solution for the future development of hand-held power quality analyzers and real-time detection devices.

  10. Diagrammatic Monte Carlo approach for diagrammatic extensions of dynamical mean-field theory -- convergence analysis of the dual fermion technique

    CERN Document Server

    Gukelberger, Jan; Hafermann, Hartmut

    2016-01-01

    The dual-fermion approach provides a formally exact prescription for calculating properties of a correlated electron system in terms of a diagrammatic expansion around dynamical mean-field theory (DMFT). It can address the full range of interactions, the lowest order theory is asymptotically exact in both the weak- and strong-coupling limits, and the technique naturally incorporates long-range correlations beyond the reach of current cluster extensions to DMFT. Most practical implementations, however, neglect higher-order interaction vertices beyond two-particle scattering in the dual effective action and further truncate the diagrammatic expansion in the two-particle scattering vertex to a leading-order or ladder-type approximation. In this work we compute the dual-fermion expansion for the Hubbard model including all diagram topologies with two-particle interactions to high orders by means of a stochastic diagrammatic Monte Carlo algorithm. We use benchmarking against numerically exact Diagrammatic Determin...

  11. Knowledge-based techniques in software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Jairam, B.N.; Agarwal, A.; Emrich, M.L.

    1988-05-04

    Recent trends in software engineering research focus on the incorporation of AI techniques. The feasibility of an overlap between AI and software engineering is examined. The benefits of merging the two fields are highlighted. The long-term goal is to automate the software development process. Some projects being undertaken towards the attainment of this goal are presented as examples. Finally, research on the Oak Ridge Reservation aimed at developing a knowledge-based software project management aid is presented. 25 refs., 1 tab.

  12. Artificial Intelligence based technique for BTS placement

    Science.gov (United States)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase in base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique takes neighbour and regulatory considerations into account objectively while determining cell sites. Its application leads to a quantitatively unbiased, evaluated decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm; the results show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a GA with the neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.

  13. Web-Based Geographic Information Systems: Experience and Perspectives of Planners and the Implications for Extension

    Science.gov (United States)

    Göçmen, Z. Asligül

    2016-01-01

    Web-based geographic information system (GIS) technology, or web-based GIS, offers many opportunities for public planners and Extension educators who have limited GIS backgrounds or resources. However, investigation of its use in planning has been limited. The study described here examined the use of web-based GIS by public planning agencies. A…

  14. Performance Based Novel Techniques for Semantic Web Mining

    Directory of Open Access Journals (Sweden)

    Mahendra Thakur

    2012-01-01

    Full Text Available The explosive growth in the size and use of the World Wide Web continuously creates great new challenges and needs. The need to predict users' preferences in order to expedite and improve browsing through a site can be met by personalizing websites. Most research efforts in web personalization correspond to the evolution of extensive research in web usage mining, i.e. the exploitation of the navigational patterns of a web site's visitors. When a personalization system relies solely on usage-based results, however, valuable information conceptually related to what is finally recommended may be missed. Moreover, the structural properties of the web site are often disregarded. In this paper, we propose novel techniques that use content semantics and the structural properties of a web site in order to improve the effectiveness of web personalization. In the first part of our work we present a Semantic Web Personalization system that integrates usage data with content semantics, expressed in ontology terms, in order to compute semantically enhanced navigational patterns and effectively generate useful recommendations. To the best of our knowledge, it is the only semantic web personalization system that may be used by non-semantic web sites. In the second part of our work, we present a novel approach for enhancing the quality of recommendations based on the underlying structure of a web site. We introduce UPR (Usage-based PageRank), a PageRank-style algorithm that relies on the recorded usage data and link analysis techniques. Overall, we demonstrate that our proposed hybrid personalization framework results in more objective and representative predictions than existing techniques.
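
    The flavour of a usage-weighted PageRank such as UPR can be sketched as power iteration over a transition matrix whose link weights come from recorded usage counts; the weighting scheme and the toy graph below are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def usage_pagerank(adj, usage, d=0.85, iters=100):
    """PageRank-style scores where each out-link is weighted by its
    recorded usage count instead of being split uniformly."""
    n = adj.shape[0]
    w = adj * usage                        # usage-weighted link matrix
    row_sums = w.sum(axis=1, keepdims=True)
    # dangling or unused pages fall back to uniform transitions
    P = np.where(row_sums > 0, w / np.maximum(row_sums, 1e-12), 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(iters):                 # power iteration with teleport
        r = (1 - d) / n + d * (P.T @ r)
    return r

adj = np.array([[0, 1, 1], [0, 0, 1], [1, 0, 0]], dtype=float)
usage = np.array([[0, 9, 1], [0, 0, 5], [4, 0, 0]], dtype=float)  # click counts
r = usage_pagerank(adj, usage)
print(np.round(r, 3))
```

Heavily clicked links thus pass more rank than rarely used ones, even when the static link structure is identical.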

  15. Search Using N-gram Technique Based Statistical Analysis for Knowledge Extraction in Case Based Reasoning Systems

    OpenAIRE

    Karthik, M. N.; Davis, Moshe

    2004-01-01

    Searching techniques for Case Based Reasoning systems involve extensive methods of elimination. In this paper, we look at a new method of arriving at the right solution by performing a series of transformations upon the data. These involve N-gram based comparison and deduction of the input data with the case data, using Morphemes and Phonemes as the deciding parameters. A similar technique for eliminating possible errors using a noise removal function is performed. The error tracking and elim...
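
    An N-gram comparison of a noisy query against stored case descriptions can be sketched with a Dice coefficient over character bigrams (the morpheme and phoneme parameters of the paper are not modelled here):

```python
def ngrams(text, n=2):
    """Character n-grams of a lower-cased string."""
    text = text.lower()
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def dice_similarity(a, b, n=2):
    """Dice coefficient over character n-gram sets: a simple way to
    match a noisy query against stored case descriptions."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga and not gb:
        return 1.0
    return 2 * len(ga & gb) / (len(ga) + len(gb))

cases = ["pump bearing failure", "valve seal leakage", "motor overheating"]
query = "pump baering failure"          # misspelled input
best = max(cases, key=lambda c: dice_similarity(query, c))
print(best)  # pump bearing failure
```

Because most bigrams survive a transposed or dropped letter, the right case still scores highest, which is the error-tolerance the abstract aims at.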

  16. Record extension for short-gauged water quality parameters using a newly proposed robust version of the line of organic correlation technique

    Directory of Open Access Journals (Sweden)

    B. Khalil

    2012-04-01

    Full Text Available In many situations the extension of hydrological or water quality time series at short-gauged stations is required. Ordinary least squares regression (OLS of any hydrological or water quality variable is a traditional and commonly used record extension technique. However, OLS tends to underestimate the variance in the extended records, which leads to underestimation of high percentiles and overestimation of low percentiles, given that the data is normally distributed. The development of the line of organic correlation (LOC technique is aimed at correcting this bias. On the other hand, the Kendall-Theil robust line (KTRL method has been proposed as an analogue of OLS with the advantage of being robust in the presence of outliers. Given that water quality data are characterised by the presence of outliers, positive skewness and non-normal distribution of data, a robust record extension technique is more appropriate. In this paper, four record-extension techniques are described, and their properties are explored. These techniques are OLS, LOC, KTRL and a new technique proposed in this paper, the robust line of organic correlation technique (RLOC. RLOC includes the advantage of the LOC in reducing the bias in estimating the variance, but at the same time it is also robust to the presence of outliers. A Monte Carlo study and empirical experiment were conducted to examine the four techniques for the accuracy and precision of the estimate of statistical moments and over the full range of percentiles. Results of the Monte Carlo study showed that the OLS and KTRL techniques have serious deficiencies as record-extension techniques, while the LOC and RLOC techniques are nearly similar. However, RLOC outperforms OLS, KTRL and LOC when using real water quality records.
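
    The key difference between OLS and LOC can be shown in a few lines: the LOC slope is sign(r) times the ratio of standard deviations, so values extended along the LOC line reproduce the standard deviation of the short record, whereas OLS shrinks it by the factor |r|. The synthetic records below are illustrative.

```python
import numpy as np

def fit_ols(x, y):
    """Ordinary least squares slope and intercept."""
    slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    return slope, y.mean() - slope * x.mean()

def fit_loc(x, y):
    """Line of organic correlation: slope = sign(r) * sy/sx, so the
    extended record reproduces the variance of the short record."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std() / x.std()
    return slope, y.mean() - slope * x.mean()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 300)              # long (index-station) record
y = 2.0 * x + rng.normal(0.0, 1.5, 300)    # short record, noisy relation
m_ols, b_ols = fit_ols(x, y)
m_loc, b_loc = fit_loc(x, y)
print("OLS std:", round(np.std(m_ols * x + b_ols), 2),
      "LOC std:", round(np.std(m_loc * x + b_loc), 2),
      "observed std:", round(y.std(), 2))  # OLS shrinks the spread, LOC keeps it
```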

  17. Record extension for short-gauged water quality parameters using a newly proposed robust version of the Line of Organic Correlation technique

    Directory of Open Access Journals (Sweden)

    B. Khalil

    2012-07-01

    Full Text Available In many situations the extension of hydrological or water quality time series at short-gauged stations is required. Ordinary least squares regression (OLS of any hydrological or water quality variable is a traditional and commonly used record extension technique. However, OLS tends to underestimate the variance in the extended records, which leads to underestimation of high percentiles and overestimation of low percentiles, given that the data are normally distributed. The development of the line of organic correlation (LOC technique is aimed at correcting this bias. On the other hand, the Kendall-Theil robust line (KTRL method has been proposed as an analogue of OLS with the advantage of being robust in the presence of outliers. Given that water quality data are characterised by the presence of outliers, positive skewness and non-normal distribution of data, a robust record extension technique is more appropriate. In this paper, four record-extension techniques are described, and their properties are explored. These techniques are OLS, LOC, KTRL and a new technique proposed in this paper, the robust line of organic correlation technique (RLOC. RLOC includes the advantage of the LOC in reducing the bias in estimating the variance, but at the same time it is also robust in the presence of outliers. A Monte Carlo study and empirical experiment were conducted to examine the four techniques for the accuracy and precision of the estimate of statistical moments and over the full range of percentiles. Results of the Monte Carlo study showed that the OLS and KTRL techniques have serious deficiencies as record-extension techniques, while the LOC and RLOC techniques are nearly similar. However, RLOC outperforms OLS, KTRL and LOC when using real water quality records.

  18. Agent-based simulation for weekend-extension strategies to mitigate influenza outbreaks

    Directory of Open Access Journals (Sweden)

    Mao Liang

    2011-06-01

    Full Text Available Abstract Background Non-pharmaceutical strategies are vital in curtailing impacts of influenza and have been intensively studied in public health. However, few strategies have explicitly utilized the weekend effect, which has been widely reported to be capable of reducing influenza infections. This study aims to explore six weekend-extension strategies against seasonal and pandemic flu outbreaks. Methods The weekend-extension strategies were designed to extend regular two-day weekend by one, two and three days, respectively, and in combination with either a continuous or discontinuous pattern. Their effectiveness was evaluated using an established agent-based spatially explicit simulation model in the urbanized area of Buffalo, NY, US. Results If the extensions last more than two days, the weekend-extension strategies can remarkably reduce the overall disease attack rate of seasonal flu. Particularly, a three-day continuous extension is sufficient to suppress the epidemic and confine the spread of disease. For the pandemic flu, the weekend-extension strategies only produce a few mitigation effects until the extensions exceed three days. Sensitivity analysis indicated that a compliance level above 75% is necessary for the weekend-extension strategies to take effects. Conclusion This research is the first attempt to incorporate the weekend effect into influenza mitigation strategies. The results suggest that appropriate extensions of the regular two-day weekend can be a potential measure to fight against influenza outbreaks, while minimizing interruptions on normal rhythms of socio-economy. The concept of weekend extension would be particularly useful if there were a lack of vaccine stockpiles, e.g., in countries with limited health resources, or in the case of unknown emerging infectious diseases.

  19. Type extension trees

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We introduce type extension trees as a formal representation language for complex combinatorial features of relational data. Based on a very simple syntax this language provides a unified framework for expressing features as diverse as embedded subgraphs on the one hand, and marginal counts...... of attribute values on the other. We show by various examples how many existing relational data mining techniques can be expressed as the problem of constructing a type extension tree and a discriminant function....

  20. Novel ring-based architecture for TWDM-PON with high reliability and flexible extensibility

    Science.gov (United States)

    Xiong, Yu; Sun, Peng; Li, Zhiqiang

    2017-02-01

    Time and wavelength division multiplexed passive optical network (TWDM-PON) was determined as a primary solution to NG-PON2 by the full service access network (FSAN) in 2012. Since then, TWDM-PON has been applied to a wider set of applications, including those that are outage sensitive and expansion flexible. So the protection techniques with reliability and flexibility should be studied to address the above needs. In this paper, we propose a novel ring-based architecture for TWDM-PON. The architecture can provide reliable ring protection scheme against a fiber fault occurring on main ring (MR), sub-ring (SR) or last mile ring (LMR). In addition, we exploit the extended node (EN) to realize the network expansion conveniently and smoothly for the flexible extensibility. Thus, more remote nodes(RNs) and optical network units (ONUs) could access this architecture through EN. Moreover, in order to further improve reliability of the network, we design the 1:1 protection scheme against the connected fiber fault between RN and EN. The results show that the proposed architecture has a recovery time of 17 ms under protection mode and the reliability of the network is also illustrated to be greatly improved compared to the network without protection. As the number of ONUs increases, the average cost of each ONU could be gradually reduced. Finally, the simulations verify the feasibility of the architecture.

  1. [Development of single base extension-tags microarray for the detection of food-borne pathogens].

    Science.gov (United States)

    Lu, Changyong; Shi, Chunlei; Zhang, Chunxiu; Chen, Jing; Shi, Xianming

    2009-04-01

    We developed a single base extension-tags (SBE-tags) microarray to detect eight common food-borne pathogens: Staphylococcus aureus, Vibrio parahaemolyticus, Listeria monocytogenes, Salmonella, Enterobacter sakazakii, Shigella, Escherichia coli O157:H7 and Campylobacter jejuni. With specific PCR primers identified and integrated for the eight food-borne pathogens, target sequences were amplified and purified as template DNA for the single base extension-tags reaction. The products were hybridized to microarrays and scanned for fluorescence intensity. The experiments showed specific and simultaneous detection of the eight food-borne pathogens. The detection limit is 0.1 pg for genomic DNA and 5×10² CFU/mL for Salmonella typhimurium cultures. The single base extension-tags assay can be used to detect food-borne pathogens rapidly and accurately with high sensitivity, and provides an efficient way to diagnose and control diseases caused by food-borne pathogens.

  2. Variable tension control for discontinuous tape winding of composites based on constant extension ratio

    Science.gov (United States)

    Shi, Yaoyao; Yan, Long; He, Xiaodong

    2012-09-01

    Discontinuous tape winding, which has obvious advantages for large-extension-ratio winding, is widely used in the molding of composites, so its technological parameters have become a research focus for many scholars. However, variable tension control is usually not fully considered, and consequently a constant extension ratio and a smooth winding process cannot be ensured. Aiming at the problem of tension control, this paper first studies in depth the control method and the interaction mechanism of tension, extension ratio, automatic lap and automatic rectification. Then, according to the features of the winding process, the mechanical device and the mathematical model of the tension control system are established. Considering the characteristics of PID controllers and fuzzy controllers, a fuzzy self-tuning PID controller is designed. As a result, variable tension control is realized during the winding and lapping process, and a constant extension ratio is guaranteed. Finally, a sample application is presented for demonstration. The variable tension control techniques for discontinuous tape winding achieve a constant tape extension ratio and improve the continuity and degree of automation of the winding process, thus guaranteeing the quality of the wound products.
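A fuzzy self-tuning PID controller of the kind described above can be sketched as follows. This is a minimal illustration, not the authors' controller: the membership functions, rule base, gain factors and the first-order tension plant are all assumptions made for the example.

```python
def tri_memberships(x):
    # Triangular membership degrees for "small", "medium", "large" on [0, 1];
    # the three degrees always sum to 1.
    x = min(max(x, 0.0), 1.0)
    small = max(0.0, 1.0 - 2.0 * x)
    large = max(0.0, 2.0 * x - 1.0)
    return small, 1.0 - small - large, large

class FuzzyPID:
    """PID controller whose gains are rescaled each step by fuzzy rules
    on the normalized error magnitude (illustrative rule base)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp0, self.ki0, self.kd0, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, error_scale=1.0):
        de = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        s, m, l = tri_memberships(abs(error) / error_scale)
        # Rules: large error boosts P and cuts I/D for a fast rise;
        # small error boosts I/D to remove offset and damp oscillation.
        kp = self.kp0 * (0.8 * s + 1.0 * m + 1.3 * l)
        ki = self.ki0 * (1.3 * s + 1.0 * m + 0.6 * l)
        kd = self.kd0 * (1.2 * s + 1.0 * m + 0.8 * l)
        self.integral += error * self.dt
        return kp * error + ki * self.integral + kd * de

def simulate(setpoint=100.0, steps=600, dt=0.01, tau=0.2):
    # Toy first-order tension plant dT/dt = (u - T) / tau, hypothetical numbers.
    pid = FuzzyPID(kp=2.0, ki=8.0, kd=0.02, dt=dt)
    tension = 0.0
    for _ in range(steps):
        u = pid.step(setpoint - tension, error_scale=setpoint)
        tension += dt * (u - tension) / tau
    return tension
```

With these toy gains the simulated tension settles close to the 100 N setpoint within the 6 s run; a real winding controller would of course be tuned against the actual plant model.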

  3. Towards semantically sensitive text clustering: a feature space modeling technology based on dimension extension.

    Science.gov (United States)

    Liu, Yuanchao; Liu, Ming; Wang, Xin

    2015-01-01

    The objective of text clustering is to divide document collections into clusters based on the similarity between documents. In this paper, an extension-based feature modeling approach towards semantically sensitive text clustering is proposed, along with the corresponding feature space construction and similarity computation method. By combining the similarity in the traditional feature space with that in the extension space, the adverse effects of the complexity and diversity of natural language can be addressed and the semantic sensitivity of clustering improved correspondingly. The generated clusters can be organized using different granularities. Experimental evaluations on well-known clustering algorithms and datasets have verified the effectiveness of our approach.
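The combined-similarity idea can be sketched as follows: documents are compared both in the raw term space and in an "extension space" where each term also activates related terms. The hand-made relatedness map, the down-weighting scheme and the mixing parameter alpha are assumptions for illustration, not the paper's exact construction.

```python
import math
from collections import Counter

def cosine(a, b):
    # Cosine similarity between two sparse term-weight dictionaries.
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def extend(vec, related, weight=0.5):
    # Dimension extension: terms already in the document also activate
    # related terms at a reduced weight.
    out = Counter(vec)
    for term, count in vec.items():
        for r in related.get(term, []):
            out[r] += weight * count
    return out

def combined_similarity(doc1, doc2, related, alpha=0.6):
    # Mix similarity in the traditional space with similarity in the
    # extension space (alpha is an assumed mixing parameter).
    v1, v2 = Counter(doc1.split()), Counter(doc2.split())
    base = cosine(v1, v2)
    ext = cosine(extend(v1, related), extend(v2, related))
    return alpha * base + (1.0 - alpha) * ext
```

With `related = {"car": ["automobile"], "automobile": ["car"]}`, the pair "car engine" / "automobile engine" scores higher than under the plain cosine, because the synonym pair now overlaps in the extension space.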

  4. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim

    2015-05-21

    Base oils, blended for finished lubricant formulations, are classified by the American Petroleum Institute into five groups, viz., groups I-V. Groups I-III consist of petroleum based hydrocarbons whereas groups IV and V are made of synthetic polymers. In the present study, five base oil samples belonging to groups I and III were extensively characterized using high performance liquid chromatography (HPLC), comprehensive two-dimensional gas chromatography (GC×GC), and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated, and then the information obtained was combined to reveal compositional details of the base oil samples studied. HPLC showed the overwhelming presence of saturated over aromatic compounds in all five base oils. A similar trend was further corroborated using GC×GC, which yielded semiquantitative information on the compound classes present in the samples and provided further details on the carbon number distributions within these classes. In addition to the chromatography methods, FT-ICR MS supplemented the compositional information on the base oil samples by resolving the aromatic compounds into alkyl- and naphtheno-substituted families. APCI proved more effective than APPI for the ionization of the highly saturated base oil components. Beyond detailed information on hydrocarbon molecules, FT-ICR MS also revealed the presence of saturated and aromatic sulfur species in all base oil samples. The results presented herein offer a unique perspective into the detailed molecular structure of base oils typically used to formulate lubricants. © 2015 American Chemical Society.

  5. A New Extension Theory-based Production Operation Method in Industrial Process

    Institute of Scientific and Technical Information of China (English)

    XU Yuan; ZHU Qunxiong

    2013-01-01

    To explore the problems of dynamic change in production demand and operating contradictions in the production process, a new extension theory-based production operation method is proposed. Its core is demand requisition, contradiction resolution and operation classification. For demand requisition, deep and comprehensive demand elements are collected by conjugating analysis. For contradiction resolution, conflicts between demand and operating elements are resolved by extension reasoning, extension transformation and consistency judgment. For operation classification, the operating importance among the operating elements is calculated by extension clustering so as to guide the production operation and ensure production safety. Through an actual application in the cascade reaction process for high-density polyethylene (HDPE) at a chemical plant, case studies and comparison show that the proposed extension theory-based production operation method is significantly better than the traditional experience-based operation method in the actual production process, opening a new way for research on production operating methods for industrial processes.

  6. Opportunities and Challenges after the Reconstruction of the Basic-Level Agricultural Technique Extension System: Agricultural Technique Extension in Counties and Towns of Xinjiang as an Example

    Institute of Scientific and Technical Information of China (English)

    王馨; 郑戈; 林祥明; 刘艳

    2015-01-01

    Overcoming the common problems in basic-level agricultural technique extension is key to rapid rural economic development. Taking county- and town-level agricultural technique extension in the Xinjiang autonomous region as the research object, this paper analyzes the opportunities currently facing agricultural technique extension and examines the weak points in the popularization process, such as imperfect team construction, mismatched production project subsidies, an unsound social service system, and bottlenecks in diversified development. Finally, suggestions are put forward based on the analysis, i.e., building a platform for agricultural management entities, optimizing the allocation of resources, and enhancing the level of scientific research and informatization.

  7. Grady Highway Extension (Ship Creek Crossing) Elmendorf Air Force Base and Fort Richardson, Alaska

    Science.gov (United States)

    2005-06-01

    Acronyms and abbreviations: DPW, Directorate of Public Works; DRMO, Defense Reutilization and Marketing Office; E.O., Executive Order ... the recreation management designation of the proposed road and bridge site is Limited Recreation (open to hiking, skiing, berry picking, birdwatching, and ...) ... Environmental Assessment, Grady Highway Extension (Ship Creek Crossing) ... by the Defense Reutilization and Marketing Office (DRMO) located on the Base

  8. Raising Awareness of Assistive Technology in Older Adults through a Community-Based, Cooperative Extension Program

    Science.gov (United States)

    Sellers, Debra M.; Markham, Melinda Stafford

    2012-01-01

    The Fashion an Easier Lifestyle with Assistive Technology (FELAT) curriculum was developed as a needs-based, community educational program provided through a state Cooperative Extension Service. The overall goal for participants was to raise awareness of assistive technology. Program evaluation included a postassessment and subsequent interview to…

  9. 75 FR 15693 - Extension of Web-Based TRICARE Assistance Program Demonstration Project

    Science.gov (United States)

    2010-03-30

    ... related services, including non-medical counseling and advice services to Active Duty Service members ... -based technology. DATES: This extension will be effective April 1, 2010. The demonstration project will ... and accessible counseling to Service members and their families who live in locations that are ...

  10. Dynamics-based Nondestructive Structural Monitoring Techniques

    Science.gov (United States)

    2012-06-21

    in the practice of non-destructive evaluation (NDE) and structural health monitoring (SHM). Guided wave techniques have several advantages over conventional bulk wave ultrasonic NDE/SHM techniques. Some of these advantages are outlined in Table I. However, in addition to the advantages of ... PVDF transducers for SHM applications with controlled guided wave modes and frequencies [7]. Wilcox used EMATs with circular coils in a guided wave ...

  11. DCT-based cyber defense techniques

    Science.gov (United States)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect multimedia content from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content in order to attack the end user. Most attack algorithms are robust to basic image processing techniques such as filtering, compression and noise addition. Hence, two novel real-time defense techniques are proposed in this article: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
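The smart-threshold idea can be illustrated in the DCT domain as follows. This is a toy reconstruction under stated assumptions, not the authors' algorithm: the frequency band (u + v >= 4) and the threshold value are invented. The intuition is that zeroing low-magnitude high-frequency coefficients destroys a hidden payload while barely changing the visible image.

```python
import math

N = 8  # block size, as in JPEG / H.264 I-frame coding

def _alpha(k):
    # Normalization factor for the orthonormal DCT-II.
    return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)

def dct2(block):
    # Orthonormal 2-D DCT-II of an NxN block (direct, unoptimized form).
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = _alpha(u) * _alpha(v) * s
    return out

def idct2(coeffs):
    # Exact inverse of dct2 (orthonormal form).
    out = [[0.0] * N for _ in range(N)]
    for x in range(N):
        for y in range(N):
            s = 0.0
            for u in range(N):
                for v in range(N):
                    s += (_alpha(u) * _alpha(v) * coeffs[u][v]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[x][y] = s
    return out

def smart_threshold(coeffs, thresh=4.0):
    # Zero low-magnitude coefficients in an assumed high-frequency band
    # (u + v >= 4), where an embedded payload would typically hide;
    # the DC and strong structural coefficients are left untouched.
    return [[0.0 if (u + v >= 4 and abs(coeffs[u][v]) < thresh) else coeffs[u][v]
             for v in range(N)] for u in range(N)]
```

A defended block would be `idct2(smart_threshold(dct2(block)))`; the perceptual cost of the chosen band and threshold is what PSNR/SSIM would then quantify.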

  12. Chaotic Extension Neural Network-Based Fault Diagnosis Method for Solar Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Kuo-Nan Yu

    2014-01-01

    Full Text Available At present, solar photovoltaic systems are extensively used; however, once a fault occurs, they are inspected manually, which is not economical. The chaos synchronization based intelligent fault diagnosis for photovoltaic systems proposed by Hsieh et al. cannot diagnose faults at arbitrary irradiance and temperature; to overcome this problem, this study proposed a chaotic extension fault diagnosis method combined with an error back-propagation neural network. The Neural Network Toolbox of MATLAB 2010 was used for simulation and comparison; the current irradiance and temperature were measured, and maximum power point tracking (MPPT) was used for chaotic extraction of the eigenvalue. The range of the extension field was determined by the neural network. Finally, the voltage eigenvalue obtained from the current temperature and irradiance was used for fault diagnosis. Comparing diagnostic rates with the results of Hsieh et al., this scheme obtains better diagnostic rates when the irradiance or temperature changes.

  13. Risk-Based Allowed Outage Time and Surveillance Test Interval Extensions for Angra 1

    Directory of Open Access Journals (Sweden)

    Sonia M. Orlando Gibelli

    2012-01-01

    Full Text Available In this work, Probabilistic Safety Assessment (PSA) is used to evaluate Allowed Outage Time (AOT) and Surveillance Test Interval (STI) extensions for three Angra 1 nuclear power plant safety systems. The interest in such an analysis lies in the fact that PSA comprises a risk-based tool for safety evaluation and has been increasingly applied to support both regulatory and operational decision-making processes. Regarding Angra 1, among other applications, PSA is meant to be an additional method the utility can use to justify Technical Specification relaxation to the Brazilian regulatory body. The risk measure used in this work is the core damage frequency, obtained from the Angra 1 Level 1 PSA study. AOT and STI extensions are evaluated for the Safety Injection, Service Water and Auxiliary Feedwater Systems using the SAPHIRE code. In order to compensate for the risk increase caused by the extensions, compensatory measures such as (1) testing the redundant train prior to entering maintenance and (2) a staggered test strategy are proposed. Results have shown that the proposed AOT extensions are acceptable for two of the systems with the implementation of compensatory measures, whereas STI extensions are acceptable for all three systems.
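The kind of calculation behind an AOT evaluation can be sketched with the incremental conditional core damage probability (ICCDP) commonly used for risk-informed Technical Specification changes. The CDF numbers below are hypothetical, not Angra 1 PSA results.

```python
def iccdp(base_cdf, conditional_cdf, aot_hours):
    # Incremental conditional core damage probability for a single outage:
    # (CDF with the train out of service - baseline CDF) x time at risk,
    # with CDF expressed per reactor-year (8760 h).
    return (conditional_cdf - base_cdf) * (aot_hours / 8760.0)

# Hypothetical values for one safety train (not from the Angra 1 study):
base = 2.0e-5         # baseline core damage frequency, per year
conditional = 8.0e-5  # CDF with the train assumed unavailable
risk_72h = iccdp(base, conditional, 72.0)  # for a 72-hour AOT
```

The result would then be compared with an acceptance guideline; a commonly cited one is an ICCDP below 5×10⁻⁷ per outage (e.g., in US NRC Regulatory Guide 1.177), and compensatory measures such as staggered testing reduce the conditional CDF entering this calculation.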

  14. A Comparative Study of Three Vibration Based Damage Assessment Techniques

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Rytter, A.

    Three different vibration-based damage assessment techniques have been compared. One of the techniques uses the ratios between changes in experimentally and theoretically estimated natural frequencies, respectively, to locate damage. The second technique relies on updating of an FEM based

  15. Appraisal and Analysis on Diversified Web Service Selection Techniques based on QoS Factors

    Directory of Open Access Journals (Sweden)

    N.Balaji

    Full Text Available Numerous monumental changes have been made to existing web service selection to provide quality of service, yet quality of service remains a major bottleneck in recent development. Hitherto, various QoS-based web service selection techniques exist, but they are lacking in functional and non-functional attributes. This paper undertakes the following tasks: it segregates various QoS-based web service selection techniques with their respective merits and demerits, and presents an extensive comparative study of different QoS-aware service selection techniques with respect to user requirements and multiple QoS properties and preferences. It also evaluates the performance of the discussed techniques, based on the strength of their QoS-aware web service selection functionalities, using a set of evaluation metrics.
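One widely used baseline among QoS-aware selection techniques is simple additive weighting (SAW): normalize each QoS attribute, then rank services by a weighted sum. The sketch below uses invented attribute values and weights for illustration.

```python
def normalize(values, benefit=True):
    # Min-max normalize one QoS attribute across all candidate services;
    # for cost attributes (e.g., response time) smaller raw values score higher.
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    if benefit:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def rank_services(services, weights, benefit_flags):
    # services: {name: [attr1, attr2, ...]}; rank by weighted sum of
    # normalized attributes (simple additive weighting, SAW).
    names = list(services)
    columns = list(zip(*(services[n] for n in names)))
    norm = [normalize(list(col), flag) for col, flag in zip(columns, benefit_flags)]
    scores = {name: sum(w * norm[j][i] for j, w in enumerate(weights))
              for i, name in enumerate(names)}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

For example, with attributes [response time in ms (cost), availability (benefit)] and equal weights, a service with 120 ms / 0.990 availability outranks one with 300 ms / 0.999 and one with 80 ms / 0.950: it is never the worst on either axis.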

  16. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    Science.gov (United States)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
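The lexical-approximation step can be sketched with a plain edit-distance search over the training vocabulary. This is a simplification of the paper's method, which also handles inflectional variants and the phrase-table extension; the vocabulary and distance cutoff below are assumptions.

```python
def edit_distance(a, b):
    # Standard Levenshtein distance via dynamic programming.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def approximate_oov(sentence, vocabulary, max_dist=2):
    # Replace each out-of-vocabulary token with its closest in-vocabulary
    # spelling variant, but only if one is near enough to be plausible.
    out = []
    for tok in sentence.split():
        if tok in vocabulary:
            out.append(tok)
            continue
        best = min(vocabulary, key=lambda w: edit_distance(tok, w))
        out.append(best if edit_distance(tok, best) <= max_dist else tok)
    return " ".join(out)
```

Tokens with no close variant (distance above the cutoff) are deliberately left untouched, mirroring the paper's setting where truly unknown words remain OOV and must be handled by the phrase-table extension instead.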

  17. Flood alert system based on bayesian techniques

    Science.gov (United States)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in southern Spain, the Guadalhorce River Basin is a medium-sized watershed (3856 km²) where recurrent yearly floods occur, mainly in autumn and spring, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects on the city of Malaga (600,000 inhabitants). From this fact arises the need for specific alert tools that can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision-making process. The joint use of artificial neural networks (ANN) and BN has allowed us to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. A recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R² values of 0.86. The BN results slightly improve this fit, but contribute uncertainty estimates to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also

  18. Downward Price-Based Brand Line Extensions Effects on Luxury Brands

    Directory of Open Access Journals (Sweden)

    Marcelo Royo-Vela

    2015-07-01

    Full Text Available This study examines brand concept consistency, self-concept congruence and the resulting loyalty status of consumers in order to evaluate whether a downward price-based line extension in the luxury goods market has any negative or positive effect on them. Focus groups and in-depth interviews were conducted to determine how the brand concepts of luxury brands are perceived before and after a line extension. Results revealed that a crucial aspect in the evaluation of downward price-based line extensions is the exclusivity variable. Additionally, the research showed different modifications to brand concept consistency after an extension, depending on whether the brand is bought for purely hedonic or emotional reasons or for functional reasons. As a practical implication, brands appealing to hedonic/emotional motivations need to be crucially differentiated from brands appealing to functional/rational motivations. In the case of a mixed concept, an in-depth segmentation of the target markets is needed in order to successfully reach consumers' needs.

  19. Conserved DNA methylation patterns in healthy blood cells and extensive changes in leukemia measured by a new quantitative technique.

    Science.gov (United States)

    Jelinek, Jaroslav; Liang, Shoudan; Lu, Yue; He, Rong; Ramagli, Louis S; Shpall, Elizabeth J; Estecio, Marcos R H; Issa, Jean-Pierre J

    2012-12-01

    Genome-wide analysis of DNA methylation provides important information in a variety of diseases, including cancer. Here, we describe a simple method, Digital Restriction Enzyme Analysis of Methylation (DREAM), based on next-generation sequencing analysis of methylation-specific signatures created by sequential digestion of genomic DNA with SmaI and XmaI enzymes. DREAM provides information on 150,000 unique CpG sites, of which 39,000 are in CpG islands and 30,000 are at transcription start sites of 13,000 RefSeq genes. We analyzed DNA methylation in healthy white blood cells and found methylation patterns to be remarkably uniform. Interindividual differences >30% were observed at only 227 of 28,331 (0.8%) autosomal CpG sites. Similarly, >30% differences were observed at only 59 sites when comparing cord and adult blood. These conserved methylation patterns contrasted with extensive changes affecting 18-40% of CpG sites in a patient with acute myeloid leukemia and in two leukemia cell lines. The method is cost-effective, quantitative (r² = 0.93 when compared with bisulfite pyrosequencing) and reproducible (r² = 0.997). Using 100-fold coverage, DREAM can detect differences in methylation greater than 10% or 30% with a false positive rate below 0.05 or 0.001, respectively. DREAM can be useful in quantifying epigenetic effects of environment and nutrition, correlating developmental epigenetic variation with phenotypes, understanding the epigenetics of cancer and chronic diseases, measuring the effects of drugs on DNA methylation, and deriving new biological insights into mammalian genomes.
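The coverage claim at the end can be sanity-checked with a simple single-site binomial model. This is an assumption for illustration, not the DREAM pipeline itself: at 100-fold coverage, how often would sampling noise alone shift the observed methylation fraction by a given amount?

```python
import math

def fpr(coverage, delta, p=0.5):
    # Two-sided probability that sampling noise alone moves the observed
    # methylation fraction at least `delta` away from the true value `p`,
    # using the normal approximation to the binomial; p = 0.5 is the
    # worst case (largest sampling variance).
    se = math.sqrt(p * (1.0 - p) / coverage)
    z = delta / se
    # Two-sided tail of the standard normal via the error function.
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))
```

With the worst-case true methylation of 0.5 and 100 reads, a 10% shift corresponds to z = 2 (two-sided p ≈ 0.046) and a 30% shift to z = 6 (p ≈ 2×10⁻⁹), consistent in order of magnitude with the quoted false positive rates of 0.05 and 0.001.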

  20. Physics based modeling of a series parallel battery pack for asymmetry analysis, predictive control and life extension

    Science.gov (United States)

    Ganesan, Nandhini; Basu, Suman; Hariharan, Krishnan S.; Kolake, Subramanya Mayya; Song, Taewon; Yeo, Taejung; Sohn, Dong Kee; Doo, Seokgwang

    2016-08-01

    Lithium-ion batteries used for electric vehicle applications are subject to large currents and varied operating conditions, making battery pack design and life extension a challenging problem. With the increase in complexity, modeling and simulation can lead to insights that ensure optimal performance and life extension. In this manuscript, an electrochemical-thermal (ECT) coupled model for a 6-series × 5-parallel pack is developed for Li-ion cells with NCA/C electrodes and validated against experimental data. The contribution of the cathode to overall degradation at various operating conditions is assessed. Pack asymmetry is analyzed from a design and an operational perspective. Design-based asymmetry leads to a new approach for obtaining the individual cell responses of the pack from an average ECT output. Operational asymmetry is demonstrated in terms of the effects of thermal gradients on cycle life, and an efficient model predictive control technique is developed. The concept of a reconfigurable battery pack is studied using detailed simulations that can be used for effective monitoring and extension of battery pack life.
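Design-based asymmetry within one parallel group can be illustrated with a toy ohmic cell model (open-circuit voltage plus internal resistance), far simpler than the paper's electrochemical-thermal model: cells in parallel share one terminal voltage, so a resistance mismatch skews the current split. All numbers below are invented.

```python
def parallel_currents(total_current, ocvs, resistances):
    # Cells in one parallel group share a terminal voltage V:
    #     I_k = (OCV_k - V) / R_k,   sum(I_k) = total_current
    # Solving the constraint for V gives each branch current directly.
    g = [1.0 / r for r in resistances]                 # branch conductances
    v = (sum(o * gi for o, gi in zip(ocvs, g)) - total_current) / sum(g)
    return [(o - v) / r for o, r in zip(ocvs, resistances)]

# Five nominally identical 3.7 V cells, one with 20% higher resistance,
# discharged at 50 A total (hypothetical values):
currents = parallel_currents(50.0, [3.7] * 5, [0.010] * 4 + [0.012])
```

The high-resistance cell carries noticeably less current, so its healthier neighbors are overworked; iterated over a drive cycle, this is one mechanism by which design asymmetry feeds into uneven aging.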

  1. Service-Based Extensions to an OAIS Archive for Science Data Management

    Science.gov (United States)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, the architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS) released the first version of its Reference Model for an Open Archival Information System (OAIS) in 2002. The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: the purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  2. PIE: A Dynamic Failure-Based Technique

    Science.gov (United States)

    Voas, Jeffrey M.

    1990-01-01

    This paper presents a dynamic technique for statistically estimating three program characteristics that affect a program's computational behavior: (1) the probability that a particular section of a program is executed, (2) the probability that the particular section affects the data state, and (3) the probability that a data state produced by that section has an effect on program output. These three characteristics can be used to predict whether faults are likely to be uncovered by software testing. Index terms: software testing, data state, fault, failure, testability.
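The first of the three characteristics, the execution probability of a program section, can be estimated dynamically by instrumenting the section and sampling inputs, as sketched below for a toy program. The input distribution is an assumption; PIE's infection and propagation estimates additionally require perturbing data states, which is not shown.

```python
import random

hits = {"branch": False}  # instrumentation flag for the section of interest

def program(x):
    # Toy program under analysis; the "section" is the branch body below.
    hits["branch"] = False
    if x % 7 == 0:  # instrumented section: taken only for multiples of 7
        hits["branch"] = True
        return x // 7
    return x

def estimate_execution_prob(trials=10000, seed=1):
    # PIE-style dynamic estimate: the fraction of inputs, sampled from an
    # assumed operational distribution, that execute the instrumented section.
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        program(rng.randrange(1000))
        if hits["branch"]:
            count += 1
    return count / trials
```

For inputs uniform on [0, 1000) the true execution probability is 143/1000 = 0.143; a section with a very low estimate is one where testing is unlikely to reveal a fault, which is exactly the testability signal the paper is after.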

  3. Path Based Mapping Technique for Robots

    Directory of Open Access Journals (Sweden)

    Amiraj Dhawan

    2013-05-01

    Full Text Available The purpose of this paper is to explore a new way of autonomous mapping. Current systems using perception techniques like LASER or SONAR use probabilistic methods and have the drawback of allowing considerable uncertainty in the mapping process. Our approach is to break down the environment, specifically indoor environments, into reachable areas and objects separated by boundaries, and to identify their shapes in order to render various navigable paths around them. This is a novel method that does away with uncertainties, as far as possible, at the cost of temporal efficiency. The system also demands only minimal and cheap hardware, as it relies on infrared sensors alone to do the job.

  4. Comparison of Vibration-Based Damage Assessment Techniques

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Rytter, A.

    1995-01-01

    Three different vibration-based damage assessment techniques have been compared. One of the techniques uses the ratios between changes in experimentally and theoretically estimated natural frequencies, respectively, to locate a damage. The second technique relies on updating of a finite element m...
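The first technique, locating damage from ratios of natural-frequency changes, can be sketched as follows. Because damage severity roughly scales all predicted frequency shifts for a given site by a common factor, normalized shift patterns can be compared directly. The candidate patterns and measurements below are invented numbers, not results from the study.

```python
def locate_damage(measured_shifts, predicted_patterns):
    # Frequency-shift ratio method (sketch): each candidate damage site has a
    # theoretically predicted pattern of natural-frequency shifts. Severity
    # scales a site's whole pattern by a common factor, so we normalize both
    # measured and predicted shifts and pick the best-matching site.
    def normalized(v):
        s = sum(abs(x) for x in v)  # assumes at least one nonzero shift
        return [x / s for x in v]

    m = normalized(measured_shifts)
    best, best_err = None, float("inf")
    for site, pattern in predicted_patterns.items():
        p = normalized(pattern)
        err = sum((a - b) ** 2 for a, b in zip(m, p))
        if err < best_err:
            best, best_err = site, err
    return best
```

For example, with predicted shift patterns for a "midspan" and a "support" damage site over three modes, a measured shift vector that is roughly half the midspan pattern (i.e., the same ratios at lower severity) is attributed to the midspan site.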

  5. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    Science.gov (United States)

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
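For the classic linear interaction model y = b0 + b1*x + b2*z + b3*x*z, the J-N boundaries are the moderator values z where the simple slope b1 + b3*z sits exactly at the chosen t threshold, which reduces to a quadratic in z. A minimal sketch follows; the coefficient and covariance values in the test are invented for illustration.

```python
import math

def jn_boundaries(b1, b3, v11, v33, v13, t_crit):
    # Johnson-Neyman boundaries: solve for z where
    #   (b1 + b3*z)^2 = t_crit^2 * (v11 + 2*z*v13 + z^2*v33),
    # with v11 = Var(b1), v33 = Var(b3), v13 = Cov(b1, b3).
    a = b3 ** 2 - t_crit ** 2 * v33
    b = 2.0 * (b1 * b3 - t_crit ** 2 * v13)
    c = b1 ** 2 - t_crit ** 2 * v11
    disc = b * b - 4.0 * a * c
    if a == 0.0 or disc < 0.0:
        return None  # no real boundary: slope significant nowhere or everywhere
    r = math.sqrt(disc)
    return sorted(((-b - r) / (2.0 * a), (-b + r) / (2.0 * a)))
```

At a returned boundary z*, the ratio |b1 + b3*z*| / sqrt(v11 + 2*z*·v13 + z*²·v33) equals t_crit by construction, which gives a direct numerical check; whether the slope is significant inside or outside the interval is then decided by evaluating the t-ratio at any interior point.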

  6. An Extensible Dialogue Script for a Robot Based on Unification of State-Transition Models

    Directory of Open Access Journals (Sweden)

    Yosuke Matsusaka

    2010-01-01

    development of the communication function of the robot. Compared to the extension-by-connection method used in previous behavior-based communication robot developments, the extension-by-unification method has the ability to decompose a script into components. The decomposed components can be recomposed to build a new application easily. In this paper, we first explain a reformulation we have applied to the conventional state-transition model. Second, we explain a set of algorithms to decompose and recompose components and to detect conflicts between them. Third, we explain a dialogue engine and a script management server we have developed. The script management server has a function to propose reusable components to the developer in real time by implementing the conflict detection algorithm. The dialogue engine SEAT (Speech Event-Action Translator) has a flexible adapter mechanism to enable quick integration into robotic systems. Through application to three robots, we have confirmed that development efficiency improved by 30%.
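The decompose/recompose/conflict-detect cycle can be illustrated with state-transition tables: a script component maps (state, input) pairs to next states, and unification merges components while flagging pairs that map to different next states. The dialogue states and inputs below are invented, and this is only a sketch of the idea, not SEAT's actual script format.

```python
def unify(script_a, script_b):
    # Extension-by-unification (sketch): merge two state-transition scripts,
    # each of the form {(state, input): next_state}, and report transitions
    # where the two components disagree on the next state.
    merged, conflicts = dict(script_a), []
    for key, nxt in script_b.items():
        if key in merged and merged[key] != nxt:
            conflicts.append((key, merged[key], nxt))  # keep A's transition
        else:
            merged[key] = nxt
    return merged, conflicts

# Two hypothetical components sharing the "idle" state:
greeting = {("idle", "hello"): "greet", ("greet", "done"): "idle"}
weather = {("idle", "weather"): "report", ("idle", "hello"): "smalltalk"}
merged, conflicts = unify(greeting, weather)
```

Here the components disagree on what "hello" should do in the "idle" state; surfacing that conflict to the developer at recomposition time is exactly the role the script management server plays in the paper.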

  7. Extension Activity Support System (EASY): A Web-Based Prototype for Facilitating Farm Management

    Directory of Open Access Journals (Sweden)

    Christopher Pettit

    2012-01-01

    Full Text Available In response to disparate advances in delivering spatial information to support agricultural extension activities, the Extension Activity Support System (EASY) project was established to develop a vision statement and conceptual design for such a system based on a national needs assessment. Personnel from across Australia were consulted, and a review of existing farm information/management software was undertaken to ensure that any system eventually produced from the EASY vision builds on the strengths of existing efforts. This paper reports on the collaborative consultative process undertaken to create the EASY vision, as well as the conceptual technical design and business models that could support a fully functional, spatially enabled online system.

  8. Enhanced mechanical performance of biocompatible hemicelluloses-based hydrogel via chain extension.

    Science.gov (United States)

    Qi, Xian-Ming; Chen, Ge-Gu; Gong, Xiao-Dong; Fu, Gen-Que; Niu, Ya-Shuai; Bian, Jing; Peng, Feng; Sun, Run-Cang

    2016-09-16

    Hemicelluloses are widely used to prepare gel materials because of their renewability, biodegradability, and biocompatibility. Here, molecular chain extension of hemicelluloses was obtained in a two-step process. Composite hydrogels were prepared via free radical graft copolymerization of crosslinked quaternized hemicelluloses (CQH) and acrylic acid (AA) in the presence of crosslinking agent N,N'-methylenebisacrylamide (MBA). This chain extension strategy significantly improved the mechanical performance of the resulting hydrogels. The crosslinking density, compression modulus, and swelling capacities of hydrogels were tuned by changing the AA/CQH and MBA/CQH contents. Moreover, the biocompatibility test suggests that the hemicelluloses-based hydrogels exhibited no toxicity to cells and allowed cell growth. Taken together, these properties demonstrated that the composite hydrogels have potential applications in the fields of water absorbents, cell culture, and other functional biomaterials.

  9. Enhanced mechanical performance of biocompatible hemicelluloses-based hydrogel via chain extension

    Science.gov (United States)

    Qi, Xian-Ming; Chen, Ge-Gu; Gong, Xiao-Dong; Fu, Gen-Que; Niu, Ya-Shuai; Bian, Jing; Peng, Feng; Sun, Run-Cang

    2016-01-01

    Hemicelluloses are widely used to prepare gel materials because of their renewability, biodegradability, and biocompatibility. Here, molecular chain extension of hemicelluloses was obtained in a two-step process. Composite hydrogels were prepared via free radical graft copolymerization of crosslinked quaternized hemicelluloses (CQH) and acrylic acid (AA) in the presence of crosslinking agent N,N’-methylenebisacrylamide (MBA). This chain extension strategy significantly improved the mechanical performance of the resulting hydrogels. The crosslinking density, compression modulus, and swelling capacities of hydrogels were tuned by changing the AA/CQH and MBA/CQH contents. Moreover, the biocompatibility test suggests that the hemicelluloses-based hydrogels exhibited no toxicity to cells and allowed cell growth. Taken together, these properties demonstrated that the composite hydrogels have potential applications in the fields of water absorbents, cell culture, and other functional biomaterials. PMID:27634095

  10. The base line problem in DLTS technique

    OpenAIRE

    G. Couturier; Thabti, A.; Barrière, A.S.

    1989-01-01

    This paper describes a solution to suppress the base line problem in DLTS spectroscopy using a lock-in amplifier. The method has been used to characterize deep levels in a GaAs Schottky diode. Comparison with the classical method based on the use of a capacitance meter in the differential mode is established. The electric field dependence of the DLTS signal in a weakly doped semiconductor is also reported and proves the efficiency of the method. Finally, the data process is discussed.

  11. Cu/Cu2O/CuO nanoparticles: Novel synthesis by exploding wire technique and extensive characterization

    Science.gov (United States)

    Sahai, Anshuman; Goswami, Navendu; Kaushik, S. D.; Tripathi, Shilpa

    2016-12-01

In this article, we explore the potential of the Exploding Wire Technique (EWT) to synthesize copper nanoparticles using copper metal in a plate-and-wire geometry. Rietveld refinement of the X-ray diffraction (XRD) pattern of the prepared material indicates the presence of mixed phases of copper (Cu) and copper oxide (Cu2O). Agglomerates of copper and copper oxide comprising nanoparticles of ∼20 nm average size were observed through high resolution transmission electron microscopy (HRTEM) and energy dispersive x-ray (EDX) spectroscopy. Micro-Raman (μR) and Fourier transform infrared (FTIR) spectroscopies of the prepared nanoparticles reveal the existence of an additional minority CuO phase, not detected earlier through the XRD and TEM analysis. The μR investigations clearly distinguish the cubic Cu2O and monoclinic CuO phases based on the difference of their space group symmetries. In good agreement with the μR analysis, FTIR stretching modes corresponding to Cu2-O and Cu-O were also distinguished. The μR and FTIR vibrational-mode investigations concur and affirm the presence of the CuO phase besides the predominant Cu and Cu2O phases. Quantum confinement effects, along with an increase of the band gaps for direct and indirect optical transitions of the Cu/Cu2O/CuO nanoparticles, are reflected in ultraviolet-visible (UV-vis) spectroscopy. Photoluminescence (PL) spectroscopy identifies the electronic levels of each phase and the optical transition processes occurring therein. Iterative fitting of the X-ray photoelectron spectroscopy (XPS) core level spectra of Cu (2p3/2) and O (1s) reveals the presence of Cu2+ and Cu+ in the lattice, with interesting evidence of O deficiency in the lattice structure and surface adsorption. Magnetic analysis illustrates that the prepared nanomaterial demonstrates ferromagnetic behaviour at room temperature.

  12. Field of view extension and truncation correction for MR-based human attenuation correction in simultaneous MR/PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Blumhagen, Jan O., E-mail: janole.blumhagen@siemens.com; Ladebeck, Ralf; Fenchel, Matthias [Magnetic Resonance, Siemens AG Healthcare Sector, Erlangen 91052 (Germany); Braun, Harald; Quick, Harald H. [Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nürnberg, Erlangen 91052 (Germany); Faul, David [Siemens Medical Solutions, New York, New York 10015 (United States); Scheffler, Klaus [MRC Department, Max Planck Institute for Biological Cybernetics, Tübingen 72076, Germany and Department of Biomedical Magnetic Resonance, University Hospital Tübingen, Tübingen 72076 (Germany)

    2014-02-15

Purpose: In quantitative PET imaging, it is critical to accurately measure and compensate for the attenuation of the photons absorbed in the tissue. While in PET/CT the linear attenuation coefficients can be easily determined from a low-dose CT-based transmission scan, in whole-body MR/PET the computation of the linear attenuation coefficients is based on the MR data. However, a constraint of the MR-based attenuation correction (AC) is the MR-inherent field-of-view (FoV) limitation due to static magnetic field (B{sub 0}) inhomogeneities and gradient nonlinearities. Therefore, the MR-based human AC map may be truncated or geometrically distorted toward the edges of the FoV and, consequently, the PET reconstruction with MR-based AC may be biased. This is especially impactful laterally, where the patient's arms rest beside the body and are not fully considered. Methods: A method is proposed to extend the MR FoV by determining an optimal readout gradient field which locally compensates B{sub 0} inhomogeneities and gradient nonlinearities. This technique was used to reduce truncation in AC maps of 12 patients, and the impact on the PET quantification was analyzed and compared to truncated data without applying the FoV extension and additionally to an established approach of PET-based FoV extension. Results: The truncation artifacts in the MR-based AC maps were successfully reduced in all patients, and the mean body volume was thereby increased by 5.4%. In some cases large patient-dependent changes in SUV of up to 30% were observed in individual lesions when compared to the standard truncated attenuation map. Conclusions: The proposed technique successfully extends the MR FoV in MR-based attenuation correction and shows an improvement of PET quantification in whole-body MR/PET hybrid imaging. In comparison to the PET-based completion of the truncated body contour, the proposed method is also applicable to specialized PET tracers with little uptake in the arms and might

  13. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark differs from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are treated as classes, and we use a classification algorithm to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.
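The fragile-watermark principle the classification rests on — any modification of the marked image perturbs the extracted watermark, and the damage pattern hints at what changed — can be sketched with a plain LSB scheme. This is an illustrative stand-in, not the authors' embedding or classification algorithm; the pixel values and watermark bits are made up:

```python
def embed_watermark(pixels, watermark_bits):
    """Embed one watermark bit into the least significant bit of each pixel."""
    return [(p & ~1) | b for p, b in zip(pixels, watermark_bits)]

def extract_watermark(pixels):
    """Read back the LSB plane as the extracted watermark."""
    return [p & 1 for p in pixels]

original = [52, 55, 61, 66, 70, 61, 64, 73]   # toy 8-pixel "image"
watermark = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_watermark(original, watermark)

# Tampering with a pixel flips the extracted bit at that location, so the
# mismatch pattern localizes (and could be classified by) the modification.
tampered = marked.copy()
tampered[3] += 5
damage = [a != b for a, b in zip(extract_watermark(tampered), watermark)]
```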

  14. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    Science.gov (United States)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

In the present work, we propose a frequency-domain interferometric imaging (FII) technique for better knowledge of the vertical distribution of the atmospheric scatterers detected by MST radars. This is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies. Its objective is to reduce the ambiguity, inherent in the FDI technique, that results from the use of only two adjacent frequencies. Different methods, commonly used in antenna array processing, are first described within the context of application to the FII technique. These methods are Fourier-based imaging, Capon's method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems to be very promising.
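Capon's method, one of the high-resolution estimators mentioned, can be sketched for the frequency-domain range-imaging setting. This is illustrative only: the carrier frequencies, scatterer range, and idealized rank-one covariance below are assumptions, not MU radar data or the authors' processing chain:

```python
import numpy as np

C = 3e8  # speed of light, m/s

def steering(freqs, r):
    """Two-way phase across the carrier frequencies for a scatterer at range r."""
    return np.exp(-1j * 4 * np.pi * freqs * r / C)

def capon_power(R, a):
    """Capon (minimum-variance) power estimate for steering vector a."""
    return 1.0 / np.real(np.conj(a) @ np.linalg.inv(R) @ a)

# Five closely spaced carriers (assumed values, loosely MU-radar-like).
freqs = 46.5e6 + np.arange(5) * 1e5
r_true = 500.0
a0 = steering(freqs, r_true)

# Idealized covariance: one scatterer plus a small noise floor.
R = np.outer(a0, np.conj(a0)) + 0.01 * np.eye(len(freqs))

ranges = np.arange(400.0, 601.0, 10.0)
powers = [capon_power(R, steering(freqs, r)) for r in ranges]
best = float(ranges[int(np.argmax(powers))])  # the spectrum peaks at r_true
```

Because the Capon weight vector adaptively nulls power from other ranges, the peak is much sharper than a Fourier-based (matched-filter) scan over the same five frequencies.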

  15. Multiple Kernel Sparse Representation based Orthogonal Discriminative Projection and Its Cost-Sensitive Extension.

    Science.gov (United States)

    Zhang, Guoqing; Sun, Huaijiang; Xia, Guiyu; Sun, Quansen

    2016-07-07

    Sparse representation based classification (SRC) has been developed and shown great potential for real-world application. Based on SRC, Yang et al. [10] devised a SRC steered discriminative projection (SRC-DP) method. However, as a linear algorithm, SRC-DP cannot handle the data with highly nonlinear distribution. Kernel sparse representation-based classifier (KSRC) is a non-linear extension of SRC and can remedy the drawback of SRC. KSRC requires the use of a predetermined kernel function and selection of the kernel function and its parameters is difficult. Recently, multiple kernel learning for SRC (MKL-SRC) [22] has been proposed to learn a kernel from a set of base kernels. However, MKL-SRC only considers the within-class reconstruction residual while ignoring the between-class relationship, when learning the kernel weights. In this paper, we propose a novel multiple kernel sparse representation-based classifier (MKSRC), and then we use it as a criterion to design a multiple kernel sparse representation based orthogonal discriminative projection method (MK-SR-ODP). The proposed algorithm aims at learning a projection matrix and a corresponding kernel from the given base kernels such that in the low dimension subspace the between-class reconstruction residual is maximized and the within-class reconstruction residual is minimized. Furthermore, to achieve a minimum overall loss by performing recognition in the learned low-dimensional subspace, we introduce cost information into the dimensionality reduction method. The solutions for the proposed method can be efficiently found based on trace ratio optimization method [33]. Extensive experimental results demonstrate the superiority of the proposed algorithm when compared with the state-of-the-art methods.

  16. A new force-extension formula for stretched macromolecules and polymers based on the Ising model

    Science.gov (United States)

    Chan, Yue; Haverkamp, Richard G.

    2016-12-01

In this paper, we derive a new force-extension formula for stretched macromolecules and homogeneous polymer matrices. The Ising model arising from paramagnetism is employed, where the magnetic force is replaced by the external force, and a resistance energy is introduced in this model in place of the usual persistence length arising from the freely jointed chain and worm-like chain models. While the force-extension formula reveals the distinctive stretching features of stretched polymers, the resistance energy is found to increase almost linearly with the external force for our two polysaccharide stretching examples, with and without ring conformational changes. In particular, a jump in the resistance energy caused by a conformational transition is investigated, and the size of the jump determines the energy barrier between the two conformational configurations. Our theoretical model matches experimental results well for stretching with no conformational transition and with a single one, and a Monte Carlo simulation has also been performed to verify the correctness of the resistance energy. This technique might also be employed to determine the binding energy from other causes during molecular stretching and provide vital information for further theoretical investigations.

  17. A Hill Cipher Modification Based on Eigenvalues Extension with Dynamic Key Size HCM-EXDKS

    Directory of Open Access Journals (Sweden)

    Ahmed Y. Mahmoud

    2014-04-01

Full Text Available All previously proposed Hill cipher modifications have been restricted to the use of dynamic keys only. In this paper, we propose an extension of the Hill cipher modification based on eigenvalues (HCM-EE), called HCM-EXDKS. The proposed extension generates the dynamic encryption key matrix by exponentiation, carried out efficiently with the help of eigenvalues; HCM-EXDKS thus introduces a new class of dynamic keys together with a dynamically changing key size. Security of HCM-EXDKS is provided by the use of a large number of dynamic keys of variable size. The proposed extension is more effective in the encryption quality of RGB images than HCM-EE and known Hill cipher modifications in the case of images with large single-colour areas, and slightly more effective otherwise. HCM-EXDKS has almost the same encryption time as HCM-EE and HCM-HMAC, and is two times faster than HCM-H, which has the best encryption quality among the Hill cipher modifications compared against HCM-EXDKS.
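The classical Hill cipher core that HCM-EXDKS extends can be sketched as follows. The key here is static and chosen for illustration; the dynamic eigenvalue-based key generation and variable key size of HCM-EXDKS are not reproduced:

```python
def mat_vec_mod(m, v, mod=26):
    """Multiply matrix m by vector v, element-wise modulo `mod`."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) % mod for i in range(len(m))]

def hill_process(text, key):
    """Encrypt (or, with the inverse key, decrypt) A-Z text block by block."""
    n = len(key)
    nums = [ord(ch) - 65 for ch in text.upper()]
    while len(nums) % n:
        nums.append(23)  # pad with 'X' to a multiple of the block size
    out = []
    for i in range(0, len(nums), n):
        out.extend(mat_vec_mod(key, nums[i:i + n]))
    return "".join(chr(65 + x) for x in out)

key = [[3, 3], [2, 5]]         # invertible mod 26: det = 9, gcd(9, 26) = 1
key_inv = [[15, 17], [20, 9]]  # inverse of `key` modulo 26
cipher = hill_process("HELP", key)
plain = hill_process(cipher, key_inv)
```

A dynamic-key scheme like HCM-EXDKS would replace the fixed `key` with a fresh matrix (and possibly a different block size) per message, which is what defeats the known-plaintext attack on the classical cipher.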

  18. Adaptive Thresholding Technique for Retinal Vessel Segmentation Based on GLCM-Energy Information

    OpenAIRE

    Temitope Mapayi; Serestina Viriri; Jules-Raymond Tapamo

    2015-01-01

    Although retinal vessel segmentation has been extensively researched, a robust and time efficient segmentation method is highly needed. This paper presents a local adaptive thresholding technique based on gray level cooccurrence matrix- (GLCM-) energy information for retinal vessel segmentation. Different thresholds were computed using GLCM-energy information. An experimental evaluation on DRIVE database using the grayscale intensity and Green Channel of the retinal image demo...

  19. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit the required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format of a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of the decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all overhead incurred). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications, and we have applied each technique to two major embedded processor architectures.
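The Huffman coding step that the compression scheme builds on can be sketched generically. This is a textbook software implementation, not the article's hardware-supported variant with its decoding-table reduction:

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a prefix-free code table {symbol: bitstring} for `data`."""
    freq = Counter(data)
    # Heap entries: [weight, tiebreak, list of mutable [symbol, code] pairs].
    heap = [[w, i, [[sym, ""]]] for i, (sym, w) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # lightest subtree -> '0' branch
        hi = heapq.heappop(heap)   # next lightest   -> '1' branch
        for pair in lo[2]:
            pair[1] = "0" + pair[1]
        for pair in hi[2]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], tiebreak, lo[2] + hi[2]])
        tiebreak += 1
    return {sym: code for sym, code in heap[0][2]}

data = "abracadabra"
codes = huffman_codes(data)
encoded = "".join(codes[c] for c in data)
# 11 symbols * 8 bits = 88 bits uncompressed; the Huffman output is 23 bits.
```

Splitting instructions into smaller patterns before this step (as the article's ISA-independent technique does) raises pattern frequencies, which is what lets Huffman coding assign shorter codes and improve the overall ratio.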

  20. FDI and Accommodation Using NN Based Techniques

    Science.gov (United States)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics is mapped by means of a trained backpropagation NN, which is then applied to residual generation. Process supervision is then applied to discriminate faults in process sensors and process plant parameters. A rule-based expert system implements the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system which could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA when applied to a heat exchanger.

  1. Segmentation of Color Images Based on Different Segmentation Techniques

    Directory of Open Access Journals (Sweden)

    Purnashti Bhosale

    2013-03-01

Full Text Available In this paper, we propose a color image segmentation algorithm based on different segmentation techniques. We recognize background objects such as the sky, ground, and trees based on color and texture information, using various methods of segmentation. Segmentation techniques using different threshold methods, both global and local, are studied and compared with one another so as to choose the best technique for threshold segmentation. Further segmentation is performed using a clustering method and a graph-cut method to improve the segmentation results.
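A global threshold of the kind compared above can be illustrated with Otsu's method, which picks the gray level maximizing the between-class variance. This is a standard technique used as an example; the paper's specific threshold choices are not reproduced:

```python
def otsu_threshold(pixels):
    """Return the Otsu global threshold for 8-bit grayscale pixel values."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(level * count for level, count in enumerate(hist))
    sum_b = 0.0                 # running sum of background gray levels
    w_b = 0                     # background pixel count
    best_var, best_t = 0.0, 0
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                 # background mean
        m_f = (sum_all - sum_b) / w_f     # foreground mean
        between = w_b * w_f * (m_b - m_f) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

# A bimodal "image": dark background at level 40, bright objects at 200.
pixels = [40] * 50 + [200] * 50
t = otsu_threshold(pixels)
foreground = [p for p in pixels if p > t]
```

A local variant would run the same computation per window instead of once over the whole image, trading speed for robustness to uneven illumination.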

  2. An Extensible Component-Based Multi-Objective Evolutionary Algorithm Framework

    DEFF Research Database (Denmark)

    Sørensen, Jan Corfixen; Jørgensen, Bo Nørregaard

    2017-01-01

    The ability to easily modify the problem definition is currently missing in Multi-Objective Evolutionary Algorithms (MOEA). Existing MOEA frameworks do not support dynamic addition and extension of the problem formulation. The existing frameworks require a re-specification of the problem definition...... with different compositions of objectives from the horticulture domain are formulated based on a state of the art micro-climate simulator, electricity prices and weather forecasts. The experimental results demonstrate that the Controleum framework support dynamic reconfiguration of the problem formulation...

  3. A novel protein complex identification algorithm based on Connected Affinity Clique Extension (CACE).

    Science.gov (United States)

    Li, Peng; He, Tingting; Hu, Xiaohua; Zhao, Junmin; Shen, Xianjun; Zhang, Ming; Wang, Yan

    2014-06-01

A novel algorithm based on Connected Affinity Clique Extension (CACE) for mining overlapping functional modules in protein interaction networks is proposed in this paper. In this approach, the value of protein connected affinity, which is inferred from protein complexes, is interpreted as the reliability and possibility of interaction. The protein interaction network is constructed as a weighted graph, with the weight dependent on the connected affinity coefficient. Experimental results for CACE on two test data sets show that it can detect functional modules much more effectively and accurately than the state-of-the-art algorithms CPM and IPC-MCE.

  4. Non-Destructive Techniques Based on Eddy Current Testing

    Directory of Open Access Journals (Sweden)

    Ernesto Vázquez-Sánchez

    2011-02-01

    Full Text Available Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future.

  5. Non-destructive techniques based on eddy current testing.

    Science.gov (United States)

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future.

  6. Non-Destructive Techniques Based on Eddy Current Testing

    Science.gov (United States)

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future. PMID:22163754

  7. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques with special emphasis and a brief on other techniques developed world over for mitigating earthquake forces on the structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  8. Array-based techniques for fingerprinting medicinal herbs

    Directory of Open Access Journals (Sweden)

    Xue Charlie

    2011-05-01

Full Text Available Abstract Poor quality control of medicinal herbs has led to instances of toxicity, poisoning and even deaths. The fundamental step in quality control of herbal medicine is accurate identification of herbs. Array-based techniques have recently been adapted to authenticate or identify herbal plants. This article reviews the current array-based techniques, e.g. oligonucleotide microarrays, gene-based probe microarrays, Suppression Subtractive Hybridization (SSH)-based arrays, Diversity Array Technology (DArT) and Subtracted Diversity Array (SDA). We further compare these techniques according to important parameters such as markers, polymorphism rates, restriction enzymes and sample type. The applicability of the array-based methods for fingerprinting depends on the availability of genomic and genetic information for the species to be fingerprinted. For species with little genome sequence information but high polymorphism rates, SDA techniques are particularly recommended because they require less labour and lower material cost.

  9. Inverter-based circuit design techniques for low supply voltages

    CERN Document Server

    Palani, Rakesh Kumar

    2017-01-01

This book describes intuitive analog design approaches using digital inverters, providing filter architectures and circuit techniques enabling high performance analog circuit design. The authors provide process, supply voltage and temperature (PVT) variation-tolerant design techniques for inverter based circuits. They also discuss various analog design techniques for lower technology nodes and lower power supply, which can be used for designing high performance systems-on-chip.

  10. Low Complexity for Scalable Video Coding Extension of H.264 based on the Complexity of Video

    Directory of Open Access Journals (Sweden)

    Mayada Khairy

    2016-12-01

Full Text Available Scalable Video Coding (SVC/H.264) is a type of video compression technique that provides efficient video coding based on H.264/AVC and ensures higher performance through a high compression ratio. SVC/H.264 is a complex technique, as it takes considerable time to compute the best macroblock mode and the motion estimation using exhaustive search techniques. This work reduces the processing time by matching the complexity of the video to the method of macroblock selection and motion estimation. The goal of this approach is to reduce the encoding time and improve the quality of the video stream; the efficiency of the proposed approach makes it suitable for many applications, such as video conferencing and security applications.

  11. Evidence-Based Programming within Cooperative Extension: How Can We Maintain Program Fidelity While Adapting to Meet Local Needs?

    Science.gov (United States)

    Olson, Jonathan R.; Welsh, Janet A.; Perkins, Daniel F.

    2015-01-01

    In this article, we describe how the recent movement towards evidence-based programming has impacted Extension. We review how the emphasis on implementing such programs with strict fidelity to an underlying program model may be at odds with Extension's strong history of adapting programming to meet the unique needs of children, youth, families,…

  12. Classification of acute pancreatitis based on retroperitoneal extension: Application of the concept of interfascial planes

    Energy Technology Data Exchange (ETDEWEB)

    Ishikawa, Kazuo [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: ishikawa@sccmc.izumisano.osaka.jp; Idoguchi, Koji [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: idoguchi@sccmc.izumisano.osaka.jp; Tanaka, Hiroshi [Department of Traumatology and Acute Critical Care Medicine, Osaka University Hospital, 2-15 Yamada-Oka, Suita-shi, Osaka 565-0871 (Japan)]. E-mail: tanaka@hp-emerg.med.osaka-u.ac.jp; Tohma, Yoshiki [Osaka Prefectural Nakakawachi Medical Center of Acute Medicine, 3-4-13 Nishi-Iwata, Higashiosaka-shi, Osaka 578-0947 (Japan)]. E-mail: tohma@nmcam.jp; Ukai, Isao [Department of Traumatology and Acute Critical Care Medicine, Osaka University Hospital, 2-15 Yamada-Oka, Suita-shi, Osaka 565-0871 (Japan)]. E-mail: isaoukai@nifty.com; Watanabe, Hiroaki [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: hiwatana@sccmc.izumisano.osaka.jp; Matsuoka, Tetsuya [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: matsuoka@sccmc.izumisano.osaka.jp; Yokota, Jyunichiro [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: jyokota@sccmc.izumisano.osaka.jp; Sugimoto, Tsuyoshi [Ryokufukai Hospital, 1-16-13 Setoguchi, Hirano-ku, Osaka-shi, Osaka 547-0034 (Japan)]. E-mail: ts-sugi@ryokufukai.or.jp

    2006-12-15

    Objective: This study aimed to provide a classification system for acute pancreatitis by applying the principle that the disease spreads along the retroperitoneal interfascial planes. Materials and methods: Medical records and computed tomography (CT) images of 58 patients with acute pancreatitis treated between 2000 and 2005 were reviewed. The retroperitoneum was subdivided into 10 components according to the concept of interfascial planes. Severity of acute pancreatitis was graded according to retroperitoneal extension into these components. Clinical courses and outcomes were compared with the grades. The prognostic value of our classification system was compared with that of Balthazar's CT severity index (CTSI). Results: Retroperitoneal extension of acute fluid collection was classified into five grades: Grade I, fluid confined to the anterior pararenal space or retromesenteric plane (8 patients); Grade II, fluid spreading into the lateroconal or retrorenal plane (16 patients); Grade III, fluid spreading into the combined interfascial plane (8 patients); Grade IV, fluid spreading into the subfascial plane beyond the interfascial planes (15 patients); and Grade V, fluid intruding into the posterior pararenal space (11 patients). Morbidity and mortality were 92.3% and 38.5% in the 26 patients with Grade IV or V disease, and 21.9% and 0% in the 32 patients with Grade I, II, or III disease. Morbidity and mortality were 86.7% and 33.3% in patients with disease classified 'severe' according to the CTSI, and 37.5% and 9.4% in patients with disease classified 'mild' or 'moderate'. Conclusion: Classification of acute pancreatitis based on CT-determined retroperitoneal extension is a useful indicator of the disease severity and prognosis without the need for contrast-medium enhanced CT.
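The five-grade scheme can be encoded as a simple decision rule. This is an illustrative mapping from CT-observed fluid locations to grades, with component names paraphrased from the grade definitions above; it is not a clinical tool:

```python
def pancreatitis_grade(components):
    """Map the set of retroperitoneal components showing acute fluid
    collection to the five-grade retroperitoneal-extension scheme."""
    if "posterior pararenal space" in components:
        return "V"    # fluid intruding into the posterior pararenal space
    if "subfascial plane" in components:
        return "IV"   # spread into the subfascial plane beyond the interfascial planes
    if "combined interfascial plane" in components:
        return "III"
    if components & {"lateroconal plane", "retrorenal plane"}:
        return "II"
    if components & {"anterior pararenal space", "retromesenteric plane"}:
        return "I"
    return "no retroperitoneal extension"
```

Ordering the checks from deepest to shallowest spread makes the most advanced involvement dominate, mirroring how the worst finding drives the severity grade.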

  13. Adaptive thresholding technique for retinal vessel segmentation based on GLCM-energy information.

    Science.gov (United States)

    Mapayi, Temitope; Viriri, Serestina; Tapamo, Jules-Raymond

    2015-01-01

Although retinal vessel segmentation has been extensively researched, a robust and time-efficient segmentation method is highly needed. This paper presents a local adaptive thresholding technique based on gray level cooccurrence matrix- (GLCM-) energy information for retinal vessel segmentation. Different thresholds were computed using GLCM-energy information. An experimental evaluation on the DRIVE database using the grayscale intensity and Green Channel of the retinal image demonstrates the high performance of the proposed local adaptive thresholding technique. Maximum average accuracy rates of 0.9511 and 0.9510, with maximum average sensitivity rates of 0.7650 and 0.7641, were achieved on the DRIVE and STARE databases, respectively. Compared with techniques widely used previously on these databases, the proposed adaptive thresholding technique is time efficient, with higher average sensitivity and accuracy rates and specificity in the same very good range.
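The GLCM-energy feature driving the adaptive thresholds can be sketched as follows. This is a minimal implementation for a single displacement vector; the paper's window handling and threshold-selection rule are not reproduced:

```python
def glcm_energy(img, dx=1, dy=0):
    """Energy (angular second moment) of the gray level co-occurrence matrix
    for displacement (dx, dy), computed from a 2-D list of gray levels."""
    rows, cols = len(img), len(img[0])
    counts = {}
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                pair = (img[r][c], img[r2][c2])   # co-occurring gray-level pair
                counts[pair] = counts.get(pair, 0) + 1
                total += 1
    # Energy is the sum of squared normalized co-occurrence probabilities.
    return sum((n / total) ** 2 for n in counts.values())

# A uniform patch has maximal energy (1.0); a checkerboard patch has less,
# which is the texture cue a local threshold can adapt to.
uniform = [[0, 0], [0, 0]]
checker = [[0, 1], [1, 0]]
```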

  14. The detection of bulk explosives using nuclear-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  15. A Review On Segmentation Based Image Compression Techniques

    Directory of Open Access Journals (Sweden)

    S.Thayammal

    2013-11-01

Full Text Available Abstract - The storage and transmission of imagery have become a more challenging task in the current scenario of multimedia applications. Hence, an efficient compression scheme is highly essential for imagery, as it reduces the requirements for storage media and transmission bandwidth. Compression techniques must not only improve performance but also converge quickly in order to be applied to real-time applications. Various algorithms have been developed for image compression, but each has its own pros and cons. Here, an extensive analysis of existing methods is performed, and their use in developing novel techniques that face the challenging task of image storage and transmission in multimedia applications is highlighted.

  16. An Extension for Combination of Duty Constraints in Role-Based Access Control

    CERN Document Server

    Hosseini, Ali

    2010-01-01

    Among access control models, Role Based Access Control (RBAC) is very useful and is used in many computer systems. Static Combination of Duty (SCD) and Dynamic Combination of Duty (DCD) constraints have been introduced recently for this model to handle dependent roles. These roles must be used together and can be considered as a contrary point of conflicting roles. In this paper, we propose several new types of SCD and DCD constraints. Also, we introduce strong dependent roles and define new groups of SCD constraints for these types of roles as SCD with common items and SCD with union items. In addition, we present an extension for SCD constraints in the presence of hierarchy.
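The combination-of-duty idea above is the mirror image of separation of duty: roles in a dependent set must be granted together. A minimal sketch of an SCD check, with invented role names and dependent sets (the `SCD_SETS` data is purely illustrative, not from the paper):

```python
# Toy Static Combination of Duty (SCD) check: every role set listed in
# SCD_SETS is "dependent" -- a user holding any role from the set must
# hold all of them. (Role names here are invented examples.)
SCD_SETS = [
    {"auditor", "audit_log_reader"},
    {"deploy", "rollback"},
]

def violates_scd(user_roles):
    # A partial overlap with a dependent set is a violation.
    return any(0 < len(user_roles & s) < len(s) for s in SCD_SETS)
```

For example, `violates_scd({"auditor"})` is `True` because the paired role is missing, while granting both roles of a dependent set together passes the check.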

  17. Local Community Detection in Complex Networks Based on Maximum Cliques Extension

    Directory of Open Access Journals (Sweden)

    Meng Fanrong

    2014-01-01

    Full Text Available Detecting local community structure in complex networks is an appealing problem that has attracted increasing attention in various domains. However, most current local community detection algorithms, on one hand, are influenced by the state of the source node and, on the other hand, cannot effectively identify the multiple communities linked by overlapping nodes. We propose a novel local community detection algorithm based on maximum clique extension, called LCD-MC. The proposed method first finds the set of all the maximum cliques containing the source node and initializes them as the starting local communities; then, it extends each unclassified local community by greedy optimization until a certain objective is satisfied; finally, the expected local communities are obtained once all maximum cliques have been assigned to a community. An empirical evaluation using both synthetic and real datasets demonstrates that our algorithm has a superior performance to some of the state-of-the-art approaches.
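The seed-then-extend procedure can be sketched as follows. This is a loose illustration of the idea, not the authors' exact algorithm: the fitness function (share of community edges that stay internal) and all names are assumptions.

```python
def maximal_cliques(adj):
    # Bron-Kerbosch enumeration of maximal cliques (no pivoting; fine for toy graphs).
    found = []
    def bk(r, p, x):
        if not p and not x:
            found.append(r)
            return
        for v in list(p):
            bk(r | {v}, p & adj[v], x & adj[v])
            p.remove(v)
            x.add(v)
    bk(set(), set(adj), set())
    return found

def fitness(adj, comm):
    # Share of community members' edge endpoints that stay inside the community.
    e_in = e_out = 0
    for u in comm:
        for v in adj[u]:
            if v in comm:
                e_in += 1
            else:
                e_out += 1
    return e_in / (e_in + e_out) if (e_in + e_out) else 0.0

def lcd_mc_sketch(adj, source):
    communities = []
    # Stage 1: seed with every maximal clique containing the source node.
    for comm in (set(c) for c in maximal_cliques(adj) if source in c):
        # Stage 2: greedily add the neighbour that most improves fitness.
        while True:
            frontier = {v for u in comm for v in adj[u]} - comm
            best = max(frontier, key=lambda v: fitness(adj, comm | {v}), default=None)
            if best is None or fitness(adj, comm | {best}) <= fitness(adj, comm):
                break
            comm.add(best)
        communities.append(comm)
    return communities
```

On a graph made of two 4-cliques joined by a single bridge edge, seeding from a node in one clique recovers that clique as the local community, since extending across the bridge lowers the fitness.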

  18. An Improved Particle Swarm Optimization Algorithm Based on Ensemble Technique

    Institute of Scientific and Technical Information of China (English)

    SHI Yan; HUANG Cong-ming

    2006-01-01

    An improved particle swarm optimization (PSO) algorithm based on ensemble technique is presented. The algorithm combines some previous best positions (pbest) of the particles to get an ensemble position (Epbest), which is used to replace the global best position (gbest). It is compared with the standard PSO algorithm invented by Kennedy and Eberhart and some improved PSO algorithms based on three different benchmark functions. The simulation results show that the improved PSO based on ensemble technique can get better solutions than the standard PSO and some other improved algorithms under all test cases.
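A sketch of the idea in NumPy: a standard PSO velocity update, but with the global-best term replaced by an ensemble position (Epbest) formed from the k best personal bests. Parameter values and the benchmark are typical defaults, not the paper's settings.

```python
import numpy as np

def sphere(x):
    # Benchmark objective: global minimum 0 at the origin.
    return float(np.sum(x ** 2))

def ensemble_pso(f, dim=5, n_particles=20, iters=300, k=5,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in pbest])
    for _ in range(iters):
        # Ensemble position (Epbest): mean of the k best personal bests,
        # used where standard PSO would use the single global best.
        top = np.argsort(pbest_f)[:k]
        epbest = pbest[top].mean(axis=0)
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (epbest - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better] = x[better]
        pbest_f[better] = fx[better]
    return float(pbest_f.min())
```

Averaging several personal bests dampens premature attraction to a single early leader, which is the intuition behind replacing gbest with Epbest.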

  19. Efficient Plant Supervision Strategy Using NN Based Techniques

    Science.gov (United States)

    Garcia, Ramon Ferreiro; Rolle, Jose Luis Calvo; Castelo, Francisco Javier Perez

    Most non-linear type-one and type-two control systems suffer from a lack of detectability when model-based techniques are applied to FDI (fault detection and isolation) tasks. In general, all types of processes also lack detectability because of the ambiguity in discriminating among the process, sensors, and actuators when isolating a given fault. This work presents a strategy to detect and isolate faults that combines massive neural-network-based functional approximation procedures with recursive rule-based techniques applied to a parity-space approach.

  20. Power system stabilizers based on modern control techniques

    Energy Technology Data Exchange (ETDEWEB)

    Malik, O.P.; Chen, G.P.; Zhang, Y.; El-Metwally, K. [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering

    1994-12-31

    Developments in digital technology have made it feasible to develop and implement improved controllers based on sophisticated control techniques. Power system stabilizers based on adaptive control, fuzzy logic and artificial networks are being developed. Each of these control techniques possesses unique features and strengths. In this paper, the relative performance of power systems stabilizers based on adaptive control, fuzzy logic and neural network, both in simulation studies and real time tests on a physical model of a power system, is presented and compared to that of a fixed parameter conventional power system stabilizer. (author) 16 refs., 45 figs., 3 tabs.

  1. Data Mining and Neural Network Techniques in Case Based System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper first puts forward a case-based system framework based on data mining techniques. It then examines the possibility of using neural networks as the retrieval method in such a case-based system. Within this system, we propose data mining algorithms to discover case knowledge, along with other supporting algorithms.

  2. An Agent Communication Framework Based on XML and SOAP Technique

    Institute of Scientific and Technical Information of China (English)

    李晓瑜

    2009-01-01

    This thesis introduces XML and SOAP technology, presents an agent communication framework based on XML and SOAP techniques, and analyzes its principle, architecture, function, and benefits. At the end, the framework is discussed on the basis of the KQML communication primitive languages.

  3. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  4. Protein-protein interactions prediction based on iterative clique extension with gene ontology filtering.

    Science.gov (United States)

    Yang, Lei; Tang, Xianglong

    2014-01-01

    Cliques (maximal complete subnets) in protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPI complement the data defection from biological experiments. However, clique-based predicting methods only depend on the topology of network. The false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based method of prediction and gene ontology (GO) annotations to overcome the shortcoming and improve the accuracy of predictions. According to different GO correcting rules, we generate two predicted interaction sets which guarantee the quality and quantity of predicted protein interactions. The proposed method is applied to the PPI network from the Database of Interacting Proteins (DIP) and most of the predicted interactions are verified by another biological database, BioGRID. The predicted protein interactions are appended to the original protein network, which leads to clique extension and shows the significance of biological meaning.
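A hypothetical sketch of the combined idea: propose a missing edge (u, v) when both proteins are adjacent to every member of a set of at least two common neighbours that form a clique (so adding u-v completes a larger clique), and keep the prediction only if u and v share a GO annotation. The single shared-annotation rule is a simplified stand-in for the paper's GO correcting rules, and all identifiers are invented.

```python
# adj: protein -> set of interacting proteins; go: protein -> set of GO terms.
def predict_interactions(adj, go):
    predicted = set()
    for u in adj:
        for v in adj:
            if u < v and v not in adj[u]:
                common = adj[u] & adj[v]
                # Common neighbours must be pairwise connected, so that the
                # missing u-v edge is the only gap in a clique ("extension").
                is_clique = all(x in adj[y] for x in common for y in common if x != y)
                if len(common) >= 2 and is_clique and go.get(u, set()) & go.get(v, set()):
                    predicted.add((u, v))
    return predicted
```

The topological test alone would also propose pairs with no functional relationship; the GO intersection is what filters those out.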

  5. Protein-Protein Interactions Prediction Based on Iterative Clique Extension with Gene Ontology Filtering

    Directory of Open Access Journals (Sweden)

    Lei Yang

    2014-01-01

    Full Text Available Cliques (maximal complete subnets) in protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPI complement the data defection from biological experiments. However, clique-based predicting methods only depend on the topology of network. The false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based method of prediction and gene ontology (GO) annotations to overcome the shortcoming and improve the accuracy of predictions. According to different GO correcting rules, we generate two predicted interaction sets which guarantee the quality and quantity of predicted protein interactions. The proposed method is applied to the PPI network from the Database of Interacting Proteins (DIP), and most of the predicted interactions are verified by another biological database, BioGRID. The predicted protein interactions are appended to the original protein network, which leads to clique extension and shows the significance of biological meaning.

  6. A Hough Transform based Technique for Text Segmentation

    CERN Document Server

    Saha, Satadal; Nasipuri, Mita; Basu, Dipak Kr

    2010-01-01

    Text segmentation is an inherent part of any OCR system, irrespective of its domain of application. The OCR system contains a segmentation module where the text lines, words and ultimately the characters must be segmented properly for successful recognition. The present work implements a Hough transform based technique for line and word segmentation from digitized images. The proposed technique is applied not only to a document image dataset but also to datasets for a business card reader system and a license plate recognition system. To standardize the performance evaluation, the technique is also applied to the public domain dataset published on the website of CMATER, Jadavpur University. The document images consist of multi-script printed and handwritten text lines, with variety in script and line spacing within a single document image. The technique performs quite satisfactorily when applied to mobile camera captured business card images with low resolution. The usefulness of the technique is verifie...
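A minimal NumPy illustration of the underlying idea: vote in (rho, theta) space over the ink pixels of a binarized image, then read roughly horizontal text lines off the theta = 90 degree column of the accumulator. This is a generic Hough sketch, not the paper's full segmentation pipeline; the theta range and vote threshold are assumptions.

```python
import numpy as np

def hough_accumulator(binary, thetas_deg=(80, 85, 90, 95, 100)):
    # Vote in (rho, theta) space over every foreground ("ink") pixel,
    # using rho = x*cos(theta) + y*sin(theta).
    ys, xs = np.nonzero(binary)
    h, w = binary.shape
    diag = int(np.ceil(np.hypot(h, w)))  # offset so rho indices are >= 0
    thetas = np.deg2rad(thetas_deg)
    acc = np.zeros((2 * diag + 1, len(thetas)), dtype=int)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(len(thetas))] += 1
    return acc, diag, thetas

def text_line_rows(binary, min_votes):
    # For horizontal text, peaks at theta = 90 degrees give rho = y,
    # i.e. the image rows occupied by text lines.
    acc, diag, thetas = hough_accumulator(binary)
    col = int(np.argmin(np.abs(thetas - np.pi / 2)))
    return sorted((np.nonzero(acc[:, col] >= min_votes)[0] - diag).tolist())
```

Skewed documents show up as peaks at neighbouring theta values instead, which is why the accumulator keeps a small range of angles around horizontal.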

  7. Extensions of the lost letter technique to divisive issues of creationism, darwinism, sex education, and gay and lesbian affiliations.

    Science.gov (United States)

    Bridges, F Stephen; Anzalone, Debra A; Ryan, Stuart W; Anzalone, Fanancy L

    2002-04-01

    Two field studies using 1,004 "lost letters" were designed to test the hypotheses that returned responses would be greater in small towns than in a city and that addressees' affiliation with a group either (1) opposed to physical education in schools, (2) supporting gay and lesbian teachers, or (3) advocating Creationism or Darwinism would reduce the return rate. Of 504 letters "lost" in Study A, 163 (32.3%) were returned in the mail from residents of southeast Louisiana and indicated, across 3 addressees and 2 sizes of community, that addressees' affiliations were not associated with returned responses. Community size and addressees' affiliations were associated with significantly different rates of return in the city. Return rates from sites within a city were lower when letters were addressed to an organization which opposed (teaching) health education in the schools than to one supporting daily health education. Of 500 letters "lost" in Study B, 95 (19.0%) were returned from residents of northwest Florida and indicated, across 5 addressees and 2 sizes of community, that addressees' affiliations were significantly associated with returned responses overall (5 addressees) and in small towns (control, Creationism, Darwinism addressees), but not with community size. Community size and addressees' affiliations were associated with significantly different rates of return in small towns, with returns greater than or equal to those in the city (except for the addressee advocating teaching Darwinism in public schools). The present findings appear to show that applications of the lost letter technique to other divisive social issues are useful in assessing public opinion.

  8. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    Directory of Open Access Journals (Sweden)

    Aitman T

    2008-11-01

    Full Text Available Abstract Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data

  9. Memory Based Machine Intelligence Techniques in VLSI hardware

    OpenAIRE

    James, Alex Pappachen

    2012-01-01

    We briefly introduce the memory based approaches to emulate machine intelligence in VLSI hardware, describing the challenges and advantages. Implementation of artificial intelligence techniques in VLSI hardware is a practical and difficult problem. Deep architectures, hierarchical temporal memories and memory networks are some of the contemporary approaches in this area of research. The techniques attempt to emulate low level intelligence tasks and aim at providing scalable solutions to high ...

  10. Memory Based Machine Intelligence Techniques in VLSI hardware

    CERN Document Server

    James, Alex Pappachen

    2012-01-01

    We briefly introduce the memory based approaches to emulate machine intelligence in VLSI hardware, describing the challenges and advantages. Implementation of artificial intelligence techniques in VLSI hardware is a practical and difficult problem. Deep architectures, hierarchical temporal memories and memory networks are some of the contemporary approaches in this area of research. The techniques attempt to emulate low level intelligence tasks and aim at providing scalable solutions to high level intelligence problems such as sparse coding and contextual processing.

  11. Image analysis techniques associated with automatic data base generation.

    Science.gov (United States)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  12. MPPT Technique Based on Current and Temperature Measurements

    Directory of Open Access Journals (Sweden)

    Eduardo Moreira Vicente

    2015-01-01

    Full Text Available This paper presents a new maximum power point tracking (MPPT) method based on the measurement of temperature and short-circuit current, in a simple and efficient approach. These measurements, which can precisely define the maximum power point (MPP), have not been used together in other existing techniques. The temperature is measured with a low cost sensor and the solar irradiance is estimated through the relationship of the measured short-circuit current and its reference. Fast tracking speed and stable steady-state operation are advantages of this technique, which presents higher performance when compared to other well-known techniques.
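The two measurements map onto the MPP through two well-known photovoltaic relationships, sketched below: short-circuit current is nearly proportional to irradiance, and MPP voltage falls roughly linearly with cell temperature. The datasheet constants are invented example values, not parameters from the paper.

```python
# Invented example datasheet values for a hypothetical PV module.
V_MPP_STC = 30.0    # MPP voltage at standard test conditions (STC) [V]
K_V = -0.12         # MPP voltage temperature coefficient [V/degC]
I_SC_STC = 8.5      # short-circuit current at STC [A]
G_STC = 1000.0      # irradiance at STC [W/m^2]
T_STC = 25.0        # cell temperature at STC [degC]

def estimate_irradiance(i_sc):
    # Short-circuit current is nearly proportional to irradiance.
    return G_STC * i_sc / I_SC_STC

def estimate_vmpp(temp_c):
    # MPP voltage falls roughly linearly with cell temperature.
    return V_MPP_STC + K_V * (temp_c - T_STC)
```

A controller built on this idea would periodically sample temperature and short-circuit current and set the converter's operating voltage to the estimated Vmpp, avoiding the oscillation of perturb-and-observe methods.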

  13. Extensive Evaluation of a Diffusion Denuder Technique for the Quantification of Atmospheric Stable and Radioactive Molecular Iodine

    DEFF Research Database (Denmark)

    Huang, Ru-Jin; Hou, Xiaolin; Hoffmann, Thorsten

    2010-01-01

    In this paper we present the evaluation and optimization of a new approach for the quantification of gaseous molecular iodine (I2) for laboratory- and field-based studies and its novel application for the measurement of radioactive molecular iodine. α-Cyclodextrin (α-CD) in combination with 129I− is shown to be an effective denuder coating for the sampling of gaseous I2 by the formation of an inclusion complex. The entrapped 127I2 together with the 129I− spike in the coating is then released and derivatized to 4-iodo-N,N-dimethylaniline (4-I-DMA) for gas chromatography−mass spectrometry (GC...

  14. Conserved DNA methylation patterns in healthy blood cells and extensive changes in leukemia measured by a new quantitative technique

    OpenAIRE

    Jelinek, Jaroslav; Liang, Shoudan; Lu, Yue; He, Rong; Ramagli, Louis S.; Shpall, Elizabeth J; Estecio, Marcos R.H.; Issa, Jean-Pierre J.

    2012-01-01

    Genome wide analysis of DNA methylation provides important information in a variety of diseases, including cancer. Here, we describe a simple method, Digital Restriction Enzyme Analysis of Methylation (DREAM), based on next generation sequencing analysis of methylation-specific signatures created by sequential digestion of genomic DNA with SmaI and XmaI enzymes. DREAM provides information on 150,000 unique CpG sites, of which 39,000 are in CpG islands and 30,000 are at transcription start sit...

  15. A Lyapunov-Based Extension to Particle Swarm Dynamics for Continuous Function Optimization

    Science.gov (United States)

    Bhattacharya, Sayantani; Konar, Amit; Das, Swagatam; Han, Sang Yong

    2009-01-01

    The paper proposes three alternative extensions to the classical global-best particle swarm optimization dynamics, and compares their relative performance with the standard particle swarm algorithm. The first extension, which readily follows from the well-known Lyapunov's stability theorem, provides a mathematical basis of the particle dynamics with a guaranteed convergence at an optimum. The inclusion of local and global attractors to this dynamics leads to faster convergence speed and better accuracy than the classical one. The second extension augments the velocity adaptation equation by a negative randomly weighted positional term of individual particle, while the third extension considers the negative positional term in place of the inertial term. Computer simulations further reveal that the last two extensions outperform both the classical and the first extension in terms of convergence speed and accuracy. PMID:22303158

  16. Runtime Monitoring Technique to handle Tautology based SQL Injection Attacks

    Directory of Open Access Journals (Sweden)

    Ramya Dharam

    2015-05-01

    Full Text Available Software systems, like web applications, are often used to provide reliable online services such as banking, shopping, social networking, etc., to users. The increasing use of such systems has led to a high need for assuring confidentiality, integrity, and availability of user data. SQL Injection Attacks (SQLIAs) are one of the major security threats to web applications. They allow attackers to get unauthorized access to the back-end database consisting of confidential user information. In this paper we present and evaluate a Runtime Monitoring Technique to detect and prevent tautology based SQLIAs in web applications. Our technique monitors the behavior of the application during its post-deployment to identify all the tautology based SQLIAs. A framework called Runtime Monitoring Framework, which implements our technique, is used in the development of runtime monitors. The framework uses two pre-deployment testing techniques, basis-path and data-flow testing, to identify a minimal set of all legal/valid execution paths of the application. Runtime monitors are then developed and integrated to perform runtime monitoring of the application during its post-deployment, for the identified valid/legal execution paths. For evaluation we targeted a subject application with a large number of both legitimate inputs and illegitimate tautology based inputs, and measured the performance of the proposed technique. The results of our study show that the runtime monitor developed for the application was successfully able to detect all the tautology based attacks without generating any false positives.
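For context, a tautology-based SQLIA injects a condition that is always true so the WHERE clause matches every row. The toy detector below only illustrates that pattern (an OR-ed comparison whose two sides are identical literals); it is not the paper's path-based monitoring framework, and the regex is an invented simplification.

```python
import re

# Match: OR <literal> = <literal>, with optional single quotes,
# e.g. the classic  ' OR '1'='1  payload.
TAUTOLOGY = re.compile(r"\bor\b\s*'?(\w+)'?\s*=\s*'?(\w+)'?", re.IGNORECASE)

def is_tautology_attack(user_input):
    # Flag only when both sides of the OR-ed comparison are identical.
    m = TAUTOLOGY.search(user_input)
    return bool(m and m.group(1) == m.group(2))
```

Real defenses work at a structural level (parameterized queries, or, as in this paper, checking the query's execution path against a pre-computed legal set), since string patterns alone are easy to evade.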

  17. Laser-based direct-write techniques for cell printing.

    Science.gov (United States)

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2010-09-01

    Fabrication of cellular constructs with spatial control of cell location (+/-5 microm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing.

  18. Laser-based direct-write techniques for cell printing

    Energy Technology Data Exchange (ETDEWEB)

    Schiele, Nathan R; Corr, David T [Biomedical Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States); Huang Yong [Department of Mechanical Engineering, Clemson University, Clemson, SC (United States); Raof, Nurazhani Abdul; Xie Yubing [College of Nanoscale Science and Engineering, University at Albany, SUNY, Albany, NY (United States); Chrisey, Douglas B, E-mail: schien@rpi.ed, E-mail: chrisd@rpi.ed [Material Science and Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States)

    2010-09-15

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  19. PCA Based Rapid and Real Time Face Recognition Technique

    Directory of Open Access Journals (Sweden)

    T R Chandrashekar

    2013-12-01

    Full Text Available Face biometrics, being economical and efficient, is a popular form of biometric system used in various applications. Face recognition has been a topic of research for the last few decades, and several techniques have been proposed to improve the performance of face recognition systems. Accuracy is tested against intensity, distance from camera, and pose variance. Multiple face recognition is another subtopic currently under research. The speed at which a technique works is a parameter considered when evaluating it. As an example, a support vector machine performs really well for face recognition, but its computational efficiency degrades significantly as the number of classes increases. The Eigenface technique produces quality features for face recognition, but its accuracy is comparatively lower than that of many other techniques. With the increasing use of multi-core processors in personal computers, and with applications demanding fast processing and multiple face detection and recognition (for example, an entry detection system in a shopping mall or an industry), demand for such automated systems is growing worldwide. In this paper we propose a novel face recognition system developed with C#.Net that can detect multiple faces and recognize them in parallel by utilizing the system resources and the processor cores. The system is built around Haar cascade based face detection and PCA based face recognition with C#.Net. A parallel library designed for .Net is used to achieve high speed detection and recognition of faces in real time. Analysis of the performance of the proposed technique against some conventional techniques reveals that the proposed technique is not only accurate, but also fast in comparison to other techniques.
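The PCA (Eigenface) recognition step the abstract refers to fits in a few lines of NumPy: project images onto the top principal components of the training set and classify by nearest neighbour in that subspace. The tiny synthetic "faces" below stand in for real images; this sketches the general technique, not the paper's C#.Net system.

```python
import numpy as np

def train_pca(X, n_components):
    # X: one flattened face image per row. Centre the data and take the
    # top right-singular vectors as the principal axes ("eigenfaces").
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def project(x, mean, components):
    return components @ (x - mean)

def recognize(x, mean, components, gallery, labels):
    # Nearest neighbour in the PCA subspace.
    q = project(x, mean, components)
    dists = [np.linalg.norm(q - project(g, mean, components)) for g in gallery]
    return labels[int(np.argmin(dists))]
```

Because the comparison happens in a low-dimensional subspace, many probes can be matched cheaply, which is what makes the per-core parallelization described above worthwhile.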

  20. Adding Quality of Service Extensions to the Enhanced Associativity Based Routing Protocol for Mobile Ad Hoc Networks (MANET)

    Directory of Open Access Journals (Sweden)

    A. M. Murad

    2007-01-01

    Full Text Available This paper described how to discover routes that can satisfy QoS requirements by using extensions to the Enhanced Associativity Based Routing protocol (EABR). These extensions were added to the messages used during route discovery. They specify the service requirements which must be met by nodes re-broadcasting a route request or returning a route reply for a destination. The performance analysis of EABR with QoS support showed that more overhead was incurred when an intermediate node discovers that it cannot support the requested level of QoS.

  1. A Knowledge—Based Specification Technique for Protocol Development

    Institute of Scientific and Technical Information of China (English)

    张尧学; 史美林; 等

    1993-01-01

    This paper proposes a knowledge-based specification technique (KST) for protocol development. The technique semi-automatically translates a protocol described in an informal notation (natural language or graphs) into one described in formal specifications (Estelle and SDL). The translation processes are supported by knowledge stored in the knowledge base. The paper discusses the concept and the specification control mechanism of KST, as well as the rules and algorithms for producing the FSMs which are the basis of Estelle and SDL.

  2. A Randomized Controlled Trial Assessing Growth of Infants Fed a 100% Whey Extensively Hydrolyzed Formula Compared With a Casein-Based Extensively Hydrolyzed Formula

    Directory of Open Access Journals (Sweden)

    David Fields PhD

    2016-04-01

    Full Text Available This study compared the growth of healthy infants fed a hypoallergenic 100% whey-based extensively hydrolyzed formula (EHF) with Bifidobacterium lactis (test) with that of infants fed an extensively hydrolyzed casein formula (control). Formula-fed infants (14 ± 3 days old) were randomized to the test or control group until 112 days of age. Anthropometrics were assessed at 14, 28, 56, 84, and 112 days, and daily records were kept for 2 days prior to study visits. Serum albumin and plasma amino acids at 84 days were assessed in a subset. A total of 282 infants were randomized (124 test, 158 control). Significantly more infants dropped out of the control (56%) than the test (41%) group. Mean daily weight gain was significantly higher in the test group than in the control group (27.95 ± 5.91 vs 25.93 ± 6.12 g/d; P = .027), with the test group reporting significantly fewer stools (2.2 vs 3.6 stools/d). More infants in the control group reported >3 loose stools/d and a higher incidence of vomiting compared with the test group. There were no differences in gas, mood, sleep, or serum albumin. Plasma arginine and valine were significantly lower in the test group, whereas leucine and lysine were higher; all values were within normal limits. Significantly more adverse events attributed to the study formula were reported in the control group. The 100% whey-based hypoallergenic EHF containing Bifidobacterium lactis and medium chain triglycerides supported growth of healthy infants. Future studies on the application of this formula in clinically indicated populations are warranted.

  3. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  4. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.

    Science.gov (United States)

    Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu

    2016-04-19

    Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of the five components. Specifically, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of the target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study and recognition experiments under different sizes of training sets were implemented on a target gesture set consisting of 110 frequently-used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third of the gestures of the target gesture set) suggested by two reference subjects, (82.6 ± 13.2)% and (79.7 ± 13.4)% average recognition accuracy were obtained for the 110 words respectively, and the average recognition accuracy climbed up to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system.

  5. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework

    Directory of Open Access Journals (Sweden)

    Shengjing Wei

    2016-04-01

    Full Text Available Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of the five components. Specifically, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of the target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study and recognition experiments under different sizes of training sets were implemented on a target gesture set consisting of 110 frequently-used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third of the gestures of the target gesture set) suggested by two reference subjects, (82.6 ± 13.2)% and (79.7 ± 13.4)% average recognition accuracy were obtained for the 110 words respectively, and the average recognition accuracy climbed up to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user’s training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system.
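The code-matching step described above can be illustrated with a toy sketch. Everything below is invented for illustration (the code table, component codes, and sign words); in the actual framework the five component codes come from sEMG/ACC/GYRO component classifiers:

```python
# Each sign word is stored as a tuple of five component codes
# (hand shape, axis, orientation, rotation, trajectory). An unknown
# gesture is assigned to the word whose code tuple agrees with the
# most predicted components.
CODE_TABLE = {
    "HELLO":   (2, 0, 1, 3, 5),
    "THANKS":  (2, 1, 1, 0, 4),
    "GOODBYE": (7, 0, 2, 3, 5),
}

def match_gesture(predicted_codes, code_table):
    """Return the sign word whose component codes best match the prediction."""
    def score(codes):
        return sum(p == c for p, c in zip(predicted_codes, codes))
    return max(code_table, key=lambda word: score(code_table[word]))

print(match_gesture((2, 0, 1, 3, 4), CODE_TABLE))  # → HELLO (4 of 5 codes agree)
```

Matching per component rather than per whole gesture is what lets a small training set cover a large vocabulary: new words only need a new row in the code table.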

  6. Dimensionality Reduction using SOM based Technique for Face Recognition

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar

    2008-05-01

    Full Text Available Unsupervised or Self-Organized learning algorithms have become very popular for discovery of significant patterns or features in the input data. The three prominent algorithms, namely Principal Component Analysis (PCA), Self Organizing Maps (SOM), and Independent Component Analysis (ICA), have widely and successfully been used for face recognition. In this paper a SOM based technique for dimensionality reduction has been proposed. This technique has also been successfully used for face recognition. A comparative study of PCA, SOM and ICA along with the proposed technique for face recognition has also been given. Simulation results indicate that SOM is better than the other techniques for the given face database and the classifier used. The results also show that the performance of the system decreases as the number of classes increases.
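A minimal sketch of the SOM idea, assuming a 1-D map and our own toy parameters (the paper's network size, training schedule, and face data are not reproduced here): each high-dimensional sample is reduced to the index of its best-matching unit (BMU) on the map.

```python
import random

def train_som(data, n_units=4, epochs=200, lr0=0.5, radius0=2.0, seed=0):
    """Train a 1-D self-organizing map on a list of equal-length tuples."""
    rng = random.Random(seed)
    dim = len(data[0])
    weights = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                       # decaying learning rate
        radius = max(radius0 * (1 - t / epochs), 0.5)     # shrinking neighbourhood
        x = rng.choice(data)
        bmu = min(range(n_units),
                  key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
        for i in range(n_units):                          # pull BMU neighbours toward x
            if abs(i - bmu) <= radius:
                weights[i] = [w + lr * (v - w) for w, v in zip(weights[i], x)]
    return weights

def bmu_index(x, weights):
    """Dimensionality reduction: map a sample to its best-matching unit index."""
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

data = [(0.1, 0.1), (0.15, 0.05), (0.9, 0.95), (0.85, 0.9)]  # two synthetic clusters
w = train_som(data)
print(bmu_index((0.1, 0.1), w), bmu_index((0.9, 0.9), w))    # distinct units
```

For face recognition, the 2-D points above would be replaced by flattened face images, and the BMU coordinates serve as the reduced representation fed to the classifier.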

  7. Least-squares based iterative multipath super-resolution technique

    CERN Document Server

    Nam, Wooseok

    2011-01-01

    In this paper, we study the problem of multipath channel estimation for direct sequence spread spectrum signals. To resolve multipath components arriving within a short interval, we propose a new algorithm called the least-squares based iterative multipath super-resolution (LIMS). Compared to conventional super-resolution techniques, such as the multiple signal classification (MUSIC) and the estimation of signal parameters via rotation invariance techniques (ESPRIT), our algorithm has several appealing features. In particular, even in critical situations where the conventional super-resolution techniques are not very powerful due to limited data or the correlation between path coefficients, the LIMS algorithm can produce successful results. In addition, due to its iterative nature, the LIMS algorithm is suitable for recursive multipath tracking, whereas the conventional super-resolution techniques may not be. Through numerical simulations, we show that the LIMS algorithm can resolve the first arrival path amo...
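The abstract is truncated before the LIMS details, so the following is not the authors' algorithm; it is a hedged sketch of the general successive-estimation idea for multipath resolution: each iteration finds the delay whose template correlation is largest, estimates that path's amplitude by least squares, and subtracts its contribution from the residual.

```python
def correlate_at(signal, template, delay):
    """Inner product of the template with the signal at a given delay."""
    return sum(signal[delay + i] * template[i] for i in range(len(template)))

def estimate_paths(signal, template, n_paths):
    """Iteratively estimate (delay, amplitude) pairs of multipath components."""
    energy = sum(t * t for t in template)
    residual = list(signal)
    paths = []
    for _ in range(n_paths):
        delays = range(len(signal) - len(template) + 1)
        d = max(delays, key=lambda k: abs(correlate_at(residual, template, k)))
        a = correlate_at(residual, template, d) / energy   # least-squares amplitude
        for i in range(len(template)):
            residual[d + i] -= a * template[i]             # cancel the found path
        paths.append((d, round(a, 3)))
    return sorted(paths)

# Synthetic received signal: two paths of the same template at delays 2 and 5
template = [1.0, 1.0, -1.0]
signal = [0.0] * 12
for delay, amp in [(2, 1.0), (5, 0.6)]:
    for i, t in enumerate(template):
        signal[delay + i] += amp * t

print(estimate_paths(signal, template, 2))  # → [(2, 1.0), (5, 0.6)]
```

The iterative subtraction is what makes this style of estimator natural for recursive multipath tracking, as the abstract notes for LIMS.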

  8. Fault Based Techniques for Testing Boolean Expressions: A Survey

    CERN Document Server

    Badhera, Usha; Taruna, S

    2012-01-01

    Boolean expressions are a major focus of specifications and are prone to the introduction of faults; this survey presents various fault-based testing techniques. It identifies that the techniques differ in their fault detection capabilities and in the generation of test suites. Techniques such as cause-effect graphing, the meaningful impact strategy, the Branch Operator Strategy (BOR), BOR+MI, MUMCUT, and Modified Condition/Decision Coverage (MCDC) have been considered. This survey describes the basic algorithms and fault categories used by these strategies for evaluating their performance. Finally, it contains short summaries of the papers that use Boolean expressions to specify requirements for detecting faults. These techniques have been empirically evaluated by various researchers on a simplified safety-related real-time control system.
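The core idea shared by these strategies can be shown in a brute-force sketch (the named strategies such as BOR or MUMCUT involve far more machinery than this): a test case detects a fault if the original expression and a faulty mutant evaluate differently on it.

```python
from itertools import product

def distinguishing_tests(original, mutant, n_vars):
    """All truth assignments on which the mutant differs from the original."""
    return [bits for bits in product([False, True], repeat=n_vars)
            if original(*bits) != mutant(*bits)]

spec  = lambda a, b, c: (a and b) or c   # intended expression
fault = lambda a, b, c: (a or b) or c    # operator-reference fault: 'and' -> 'or'

tests = distinguishing_tests(spec, fault, 3)
print(tests)  # → [(False, True, False), (True, False, False)]
```

A fault-based strategy's quality is then judged by whether its (much smaller) generated test suite contains at least one such distinguishing assignment for each fault class.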

  9. Extending Driving Vision Based on Image Mosaic Technique

    Directory of Open Access Journals (Sweden)

    Chen Deng

    2017-01-01

    Full Text Available Car cameras have been used extensively to assist driving by making the vehicle's surroundings visible. However, due to the limitation of the Angle of View (AoV), dead zones still exist, and they are a primary cause of car accidents. In this paper, we introduce a system to extend the vision of drivers to 360 degrees. Our system consists of four wide-angle cameras, which are mounted at different sides of a car. Although the AoV of each camera is within 180 degrees, relying on the image mosaic technique, our system can seamlessly integrate 4-channel videos into a panorama video. The panorama video enables drivers to observe everywhere around a car as far as three meters away from a top view. We performed experiments in a laboratory environment. Preliminary results show that our system can eliminate the vision dead zone completely. Additionally, the real-time performance of our system can satisfy requirements for practical use.

  10. Membrane-based microextraction techniques in analytical chemistry: A review.

    Science.gov (United States)

    Carasek, Eduardo; Merib, Josias

    2015-06-23

    The use of membrane-based sample preparation techniques in analytical chemistry has gained growing attention from the scientific community since the development of miniaturized sample preparation procedures in the 1990s. The use of membranes makes the microextraction procedures more stable, allowing the determination of analytes in complex and "dirty" samples. This review describes some characteristics of classical membrane-based microextraction techniques (membrane-protected solid-phase microextraction, hollow-fiber liquid-phase microextraction and hollow-fiber renewal liquid membrane) as well as some alternative configurations (thin film and electromembrane extraction) used successfully for the determination of different analytes in a large variety of matrices; some critical points regarding each technique are also highlighted.

  11. A Novel Nanofabrication Technique of Silicon-Based Nanostructures

    Science.gov (United States)

    Meng, Lingkuan; He, Xiaobin; Gao, Jianfeng; Li, Junjie; Wei, Yayi; Yan, Jiang

    2016-11-01

    A novel nanofabrication technique which can produce highly controlled silicon-based nanostructures at wafer scale has been proposed, using a simple amorphous silicon (α-Si) material as an etch mask. The directly fabricated SiO2 nanostructures can serve as nanotemplates for transfer into underlying substrates such as silicon, germanium, transistor gates, or other dielectric materials to form electrically functional nanostructures and devices. In this paper, two typical silicon-based nanostructures, a nanoline and a nanofin, have been successfully fabricated by this technique, demonstrating excellent etch performance. In addition, the silicon nanostructures fabricated above can be further trimmed to less than 10 nm by combining with assisted post-treatment methods. The novel nanofabrication technique is expected to become an emerging technology with low process complexity and good compatibility with existing silicon integrated circuits, and is an important step towards the easy fabrication of a wide variety of nanoelectronics, biosensors, and optoelectronic devices.

  12. Cost-optimal power system extension under flow-based market coupling

    Energy Technology Data Exchange (ETDEWEB)

    Hagspiel, Simeon; Jaegemann, Cosima; Lindenberger, Dietmar [Koeln Univ. (Germany). Energiewirtschaftliches Inst.; Brown, Tom; Cherevatskiy, Stanislav; Troester, Eckehard [Energynautics GmbH, Langen (Germany)

    2013-05-15

    Electricity market models, implemented as dynamic programming problems, have been applied widely to identify possible pathways towards a cost-optimal and low carbon electricity system. However, the joint optimization of generation and transmission remains challenging, mainly due to the fact that different characteristics and rules apply to commercial and physical exchanges of electricity in meshed networks. This paper presents a methodology that allows power generation and transmission infrastructures to be optimized jointly through an iterative approach based on power transfer distribution factors (PTDFs). As PTDFs are linear representations of the physical load flow equations, they can be implemented in a linear programming environment suitable for large scale problems. The algorithm iteratively updates PTDFs when grid infrastructures are modified due to cost-optimal extension and thus yields an optimal solution with a consistent representation of physical load flows. The method is first demonstrated on a simplified three-node model where it is found to be robust and convergent. It is then applied to the European power system in order to find its cost-optimal development under the prescription of strongly decreasing CO{sub 2} emissions until 2050.
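A PTDF computation for a three-node example like the paper's demonstration case can be sketched as follows, under our own simplifying assumptions (DC load flow, unit line reactances, node 2 as slack; the paper's actual network data and cost model are not reproduced). The PTDF entry gives the fraction of an injection at a node, withdrawn at the slack, that flows over each line.

```python
lines = [(0, 1), (1, 2), (0, 2)]   # three-node ring, unit reactance per line

# Build the reduced nodal susceptance matrix (slack node 2 removed).
B = [[0.0, 0.0], [0.0, 0.0]]
for i, j in lines:
    for n in (i, j):
        if n < 2:
            B[n][n] += 1.0
    if i < 2 and j < 2:
        B[i][j] -= 1.0
        B[j][i] -= 1.0

# Invert the 2x2 reduced matrix by hand.
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
Binv = [[B[1][1] / det, -B[0][1] / det], [-B[1][0] / det, B[0][0] / det]]

def ptdf(inj_node):
    """Line flows for 1 MW injected at inj_node and withdrawn at the slack."""
    p = [1.0 if n == inj_node else 0.0 for n in range(2)]
    theta = [sum(Binv[r][c] * p[c] for c in range(2)) for r in range(2)] + [0.0]
    return [round(theta[i] - theta[j], 3) for i, j in lines]

print(ptdf(0))  # → [0.333, 0.333, 0.667]: 2/3 on the direct line, 1/3 around
```

In the paper's iterative scheme, a matrix of such factors is recomputed whenever a line is reinforced, so the linear program always sees load flows consistent with the current grid.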

  13. Full-duplex MIMO system based on antenna cancellation technique

    DEFF Research Database (Denmark)

    Foroozanfard, Ehsan; Franek, Ondrej; Tatomirescu, Alexandru

    2014-01-01

    The performance of an antenna cancellation technique for a multiple-input–multiple-output (MIMO) full-duplex system that is based on null-steering beamforming and antenna polarization diversity is investigated. A practical implementation of a symmetric antenna topology comprising three dual-pola...

  14. CDAPubMed: a browser extension to retrieve EHR-based biomedical literature

    Directory of Open Access Journals (Sweden)

    Perez-Rey David

    2012-04-01

    Full Text Available Abstract Background Over the last few decades, the ever-increasing output of scientific publications has led to new challenges to keep up to date with the literature. In the biomedical area, this growth has introduced new requirements for professionals, e.g., physicians, who have to locate the exact papers that they need for their clinical and research work amongst a huge number of publications. Against this backdrop, novel information retrieval methods are even more necessary. While web search engines are widespread in many areas, facilitating access to all kinds of information, additional tools are required to automatically link information retrieved from these engines to specific biomedical applications. In the case of clinical environments, this also means considering aspects such as patient data security and confidentiality or structured contents, e.g., electronic health records (EHRs). In this scenario, we have developed a new tool to facilitate query building to retrieve scientific literature related to EHRs. Results We have developed CDAPubMed, an open-source web browser extension to integrate EHR features in biomedical literature retrieval approaches. Clinical users can use CDAPubMed to: (i) load patient clinical documents, i.e., EHRs based on the Health Level 7-Clinical Document Architecture Standard (HL7-CDA), (ii) identify relevant terms for scientific literature search in these documents, i.e., Medical Subject Headings (MeSH), automatically driven by the CDAPubMed configuration, which advanced users can optimize to adapt to each specific situation, and (iii) generate and launch literature search queries to a major search engine, i.e., PubMed, to retrieve citations related to the EHR under examination. Conclusions CDAPubMed is a platform-independent tool designed to facilitate literature searching using keywords contained in specific EHRs. CDAPubMed is visually integrated, as an extension of a widespread web browser, within the standard

  15. Design and Testing of a C/C-SiC Nozzle Extension Manufactured via Filament Winding Technique and Adapted Liquid Silicon Infiltration

    Science.gov (United States)

    Breede, F.; Koch, D.; Frieß, M.

    2014-06-01

    Nozzle extensions made of ceramic matrix composites (CMC) have the potential to improve the performance of liquid fueled rocket engines. Gas permeability and delamination have been reported to be still critical aspects in the manufacture of CMC nozzle structures. This work shows the development and manufacture of a radiation cooled C/C-SiC nozzle for a full ceramic thrust chamber. The green body was produced via advanced wet filament winding technique using multi-angle fiber architectures which were adapted to reduce the affinity of delamination during subsequent high temperature processing steps. In order to improve the final gas-tightness additional efforts were made to adjust the carbon matrix by re-infiltration for complete conversion to a dense SiC matrix with reduced amount of residual silicon after liquid silicon infiltration process. Microstructural characterization and flaw detection were performed by CT and REM analysis. Prototype nozzle extensions were manufactured and preliminary results of the structural characterization before the hot firing tests are presented.

  16. The Real-Time Image Processing Technique Based on DSP

    Institute of Scientific and Technical Information of China (English)

    QI Chang; CHEN Yue-hua; HUANG Tian-shu

    2005-01-01

    This paper proposes a novel real-time image processing technique based on a digital signal processor (DSP). For the wavelet transform (WT) algorithm, the technique uses the second-generation wavelet transform (lifting-scheme WT), which has low computational complexity, for 2-D image data processing. Since the lifting-scheme WT processes 1-D data markedly better than 2-D data, this paper proposes a reformative processing method: transform the 2-D image data into a 1-D data sequence by a linearization method, then process the 1-D sequence with the lifting-scheme WT. The method changes the image convolution mode, which is based on the cross filtering of rows and columns. For the hardware realization, the technique optimizes the program structure of the DSP to exploit its processing power together with its on-chip memory. The experiment results show that the real-time image processing technique proposed in this paper can meet the real-time requirement of video-image transmitting in the video surveillance system of electric power. So the technique is a feasible and efficient DSP solution.
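The linearize-then-lift idea can be sketched as follows, using the simple Haar lifting case as the concrete predict/update pair (our own choice; the paper does not specify which lifting wavelet is used):

```python
def lifting_haar(seq):
    """One level of Haar lifting: split into even/odd, predict, update."""
    even, odd = seq[0::2], seq[1::2]
    detail = [o - e for e, o in zip(even, odd)]          # predict: odd minus even
    approx = [e + d / 2 for e, d in zip(even, detail)]   # update: per-pair average
    return approx, detail

image = [[10, 12, 14, 16],
         [11, 13, 15, 17]]
flat = [pixel for row in image for pixel in row]  # row-major linearization to 1-D

approx, detail = lifting_haar(flat)
print(approx)  # → [11.0, 15.0, 12.0, 16.0]  (per-pair averages)
print(detail)  # → [2, 2, 2, 2]              (per-pair differences)
```

Because lifting works in place on a 1-D sequence with only additions and shifts, it maps well onto a DSP's on-chip memory, which is the efficiency argument the abstract makes.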

  17. Face Veins Based MCMT Technique for Personal Identification

    Directory of Open Access Journals (Sweden)

    Kamta Nath Mishra

    2015-08-01

    Full Text Available Face veins based personal identification is a challenging task in the field of identity verification, because many other techniques do not capture the uniqueness of a person. This research paper establishes the uniqueness of a person on the basis of a face-veins-based technique. In this paper, face-vein images of five different persons have been used, with different rotation angles (left/right 90° to 270° and 315°). For each person, eight different images at different rotations were used, and for each of these images the same minimum cost minutiae tree (MCMT) is obtained. Here, Prim's or Kruskal's algorithm is used for finding the MCMT from a minutiae graph. The MCMT is traversed in pre-order to generate a unique string of vertices and edge lengths. We deviated the edge lengths of each MCMT by five pixels in positive and negative directions for robustness testing. It is observed in our experiments that the traversed string, which consists of the vertices and edge lengths of the MCMT, is unique for each person, and this unique sequence correctly identifies a person with an accuracy above 95%. Further, we have compared the performance of our proposed technique with other standard techniques, and it is observed that the proposed technique gives promising results.
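The MCMT construction and traversal can be sketched as below. The minutiae coordinates are made up for illustration; the paper's feature extraction from vein images is not shown. A minimum spanning tree is built with Prim's algorithm, then traversed in pre-order to produce the string of vertices and edge lengths.

```python
import math

minutiae = {"A": (0, 0), "B": (3, 0), "C": (0, 4), "D": (3, 5)}  # toy points

def dist(u, v):
    (x1, y1), (x2, y2) = minutiae[u], minutiae[v]
    return math.hypot(x1 - x2, y1 - y2)

def prim_mst(nodes, root):
    """Prim's algorithm: grow the tree one cheapest edge at a time."""
    in_tree, edges = {root}, []
    while len(in_tree) < len(nodes):
        u, v = min(((u, v) for u in in_tree for v in nodes if v not in in_tree),
                   key=lambda e: dist(*e))
        in_tree.add(v)
        edges.append((u, v))
    return edges

def preorder_string(edges, root):
    """Pre-order traversal emitting vertex names and rounded edge lengths."""
    children = {}
    for u, v in edges:
        children.setdefault(u, []).append(v)
    out = [root]
    for v in sorted(children.get(root, [])):
        out.append(f"{dist(root, v):.0f}")
        out.extend(preorder_string(edges, v))
    return out

edges = prim_mst(list(minutiae), "A")
print("-".join(preorder_string(edges, "A")))  # → A-3-B-4-C-3-D
```

It is this vertex-and-length string, stable under small coordinate perturbations, that serves as the per-person signature in the matching step.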

  18. Video multiple watermarking technique based on image interlacing using DWT.

    Science.gov (United States)

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling memory capacity and communications bandwidth requirements. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
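The Arnold transform used for watermark encryption can be illustrated in isolation (the DWT embedding itself is not shown here): the cat map (x, y) → (x + y, x + 2y) mod N scrambles pixel positions, and because the map is periodic, iterating it eventually restores the original image.

```python
def arnold(image):
    """One application of the Arnold cat map to a square image."""
    n = len(image)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = image[x][y]
    return out

watermark = [[1, 2], [3, 4]]
scrambled = arnold(watermark)       # encrypted watermark, ready for embedding

restored = scrambled
while restored != watermark:        # iterate until the map's period brings it back
    restored = arnold(restored)

print(scrambled)                    # → [[1, 4], [2, 3]]
print(restored == watermark)        # → True
```

In practice the decoder is told how many iterations remain to complete the period, so the iteration count acts as the decryption key.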

  19. Characteristic Modules of Dual Extensions and Gröbner Bases

    Institute of Scientific and Technical Information of China (English)

    Yun Ge XU; Long Cai LI

    2004-01-01

    Let C be a finite dimensional directed algebra over an algebraically closed field k and A = A(C) the dual extension of C. The characteristic modules of A are constructed explicitly for a class of directed algebras, which generalizes the results of Xi. Furthermore, it is shown that the characteristic modules of dual extensions of a certain class of directed algebras admit the left Gröbner basis theory in the sense of E. L. Green.

  20. Proposing a Wiki-Based Technique for Collaborative Essay Writing

    Directory of Open Access Journals (Sweden)

    Mabel Ortiz Navarrete

    2014-10-01

    Full Text Available This paper aims at proposing a technique for students learning English as a foreign language when they collaboratively write an argumentative essay in a wiki environment. A wiki environment and collaborative work play an important role within the academic writing task; nevertheless, an appropriate and systematic work assignment is required in order to make use of both. The technique proposed in this paper for writing a collaborative essay mainly attempts to provide the most effective way to enhance equal participation among group members, taking computer-mediated collaboration as a base. Within this context, the students’ role is clearly defined and individual and collaborative tasks are explained.

  1. Knowledge based systems advanced concepts, techniques and applications

    CERN Document Server

    1997-01-01

    The field of knowledge-based systems (KBS) has expanded enormously during the last years, and many important techniques and tools are currently available. Applications of KBS range from medicine to engineering and aerospace.This book provides a selected set of state-of-the-art contributions that present advanced techniques, tools and applications. These contributions have been prepared by a group of eminent researchers and professionals in the field.The theoretical topics covered include: knowledge acquisition, machine learning, genetic algorithms, knowledge management and processing under unc

  2. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, and detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions into the operation of the grid-connected power converters. This paper describes a quasi passive method for estimating the line impedance of the distribution electricity network. The method uses the model based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi...
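A toy model-based identification step in this spirit (our own illustration, not the authors' method, which is only partially described in the abstract): fit the line model v = R·i + L·di/dt to sampled voltage and current by ordinary least squares, solving the 2x2 normal equations directly.

```python
import math

def estimate_rl(v, i, dt):
    """Least-squares fit of R and L in v = R*i + L*di/dt from samples."""
    di = [(i[k + 1] - i[k - 1]) / (2 * dt) for k in range(1, len(i) - 1)]
    vv, ii = v[1:-1], i[1:-1]
    # Normal equations for the parameter vector [R, L]
    a11 = sum(x * x for x in ii)
    a12 = sum(x * y for x, y in zip(ii, di))
    a22 = sum(y * y for y in di)
    b1 = sum(x * u for x, u in zip(ii, vv))
    b2 = sum(y * u for y, u in zip(di, vv))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Synthetic 50 Hz current and the voltage a 0.5-ohm, 10 mH line would produce
R_true, L_true, dt = 0.5, 0.01, 1e-4
t = [k * dt for k in range(200)]
i = [math.sin(2 * math.pi * 50 * x) for x in t]
di_true = [2 * math.pi * 50 * math.cos(2 * math.pi * 50 * x) for x in t]
v = [R_true * a + L_true * b for a, b in zip(i, di_true)]

R_est, L_est = estimate_rl(v, i, dt)
print(round(R_est, 3), round(L_est, 4))  # close to 0.5 and 0.01
```

A quasi-passive method would obtain such v/i samples from naturally occurring disturbances rather than from injected test signals, which is what keeps the estimation non-intrusive.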

  3. Line Search-Based Inverse Lithography Technique for Mask Design

    Directory of Open Access Journals (Sweden)

    Xin Zhao

    2012-01-01

    Full Text Available As feature size is much smaller than the wavelength of the illumination source of lithography equipment, resolution enhancement technology (RET) has been increasingly relied upon to minimize image distortions. In advanced process nodes, a pixelated mask becomes essential for RET to achieve an acceptable resolution. In this paper, we investigate the problem of pixelated binary mask design in a partially coherent imaging system. Similar to previous approaches, the mask design problem is formulated as a nonlinear program and is solved by gradient-based search. Our contributions are four novel techniques to achieve significantly better image quality. First, to transform the original bound-constrained formulation to an unconstrained optimization problem, we propose a new noncyclic transformation of mask variables to replace the well-known cyclic one. As our transformation is monotonic, it enables better control in flipping pixels. Second, based on this new transformation, we propose a highly efficient line search-based heuristic technique to solve the resulting unconstrained optimization. Third, to simplify the optimization, instead of using a discretization regularization penalty technique, we directly round the optimized gray mask into a binary mask for pattern error evaluation. Fourth, we introduce a jump technique in order to jump out of local minima and continue the search.

  4. An Observed Voting System Based On Biometric Technique

    Directory of Open Access Journals (Sweden)

    B. Devikiruba

    2015-08-01

    Full Text Available ABSTRACT This article describes a computational framework which can run on almost every computer connected to an IP-based network to study biometric techniques. This paper discusses a system in which the protection of confidential information puts strong security demands on identification. Biometry provides a user-friendly method for this identification and is becoming a competitor for current identification mechanisms. The experimentation section focuses on biometric verification, specifically based on fingerprints. This article should be read as a warning to those thinking of using methods of identification without first examining the technical opportunities for compromising the mechanisms and the associated legal consequences. The development is based on the Java language, which easily integrates software packages useful for testing new control techniques.

  5. A Survey on Statistical Based Single Channel Speech Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Sunnydayal. V

    2014-11-01

    Full Text Available Speech enhancement is a long-standing problem with various applications such as hearing aids, automatic recognition, and coding of speech signals. Single-channel speech enhancement techniques are used for enhancement of speech degraded by additive background noises. Background noise can have an adverse impact on our ability to converse without hindrance in very noisy environments, such as busy streets, in a car, or in the cockpit of an airplane. Such noises can affect the quality and intelligibility of speech. This is a survey paper, and its objective is to provide an overview of speech enhancement algorithms that enhance a noisy speech signal corrupted by additive noise. The algorithms are mainly based on statistical approaches, and different estimators are compared. Challenges and opportunities of speech enhancement are also discussed. This paper helps in choosing the best statistical technique for speech enhancement.

  6. Structural break detection method based on the Adaptive Regression Splines technique

    Science.gov (United States)

    Kucharczyk, Daniel; Wyłomańska, Agnieszka; Zimroz, Radosław

    2017-04-01

    For many real data sets, long term observation consists of different processes that coexist or occur one after the other. Those processes very often exhibit different statistical properties, and thus before further analysis the observed data should be segmented. This problem can be found in different applications, and therefore new segmentation techniques have appeared in the literature during recent years. In this paper we propose a new method of time series segmentation, i.e., extraction from the analysed vector of observations of homogeneous parts with similar behaviour. This method is based on the absolute deviation about the median of the signal and is an extension of previously proposed techniques also based on simple statistics. We introduce a method of structural break point detection which is based on the Adaptive Regression Splines technique, one form of regression analysis. Moreover, we also propose a statistical test which allows testing the hypothesis of behaviour related to different regimes. First, we apply the methodology to simulated signals with different distributions in order to show the effectiveness of the new technique. Next, in the application part, we analyse a real data set that represents the vibration signal from a heavy-duty crusher used in a mineral processing plant.
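The median-deviation idea behind the method can be illustrated with a brute-force single-breakpoint detector (the paper's actual detector uses the Adaptive Regression Splines technique and a statistical test, neither of which is reproduced here): choose the split point that minimizes the summed absolute deviation of each segment about its own median.

```python
def median(xs):
    s = sorted(xs)
    n = len(s)
    return (s[n // 2] + s[(n - 1) // 2]) / 2

def abs_dev(xs):
    """Sum of absolute deviations about the segment median."""
    m = median(xs)
    return sum(abs(x - m) for x in xs)

def find_break(signal, min_seg=3):
    """Split index giving the most internally homogeneous two segments."""
    return min(range(min_seg, len(signal) - min_seg),
               key=lambda k: abs_dev(signal[:k]) + abs_dev(signal[k:]))

# Two regimes: low-level samples followed by high-level samples
signal = [0.1, -0.2, 0.0, 0.2, -0.1, 5.1, 4.8, 5.3, 4.9, 5.2]
print(find_break(signal))  # → 5, the index where the regime changes
```

Using medians rather than means makes the criterion robust to the impulsive outliers typical of vibration signals such as the crusher data mentioned above.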

  7. RANKING THE REFACTORING TECHNIQUES BASED ON THE INTERNAL QUALITY ATTRIBUTES

    Directory of Open Access Journals (Sweden)

    Sultan Alshehri

    2014-01-01

    Full Text Available The analytic hierarchy process (AHP) has been applied in many fields, especially to complex engineering problems and applications. The AHP is capable of structuring decision problems and finding mathematically determined judgments built on knowledge and experience. This suggests that AHP should prove useful in agile software development, where complex decisions occur routinely. In this paper, the AHP is used to rank refactoring techniques based on internal code quality attributes. XP encourages applying refactoring where the code smells bad; however, refactoring may consume considerable time and effort. So, to maximize the benefits of refactoring in less time and with less effort, AHP has been applied to achieve this purpose. It was found that ranking the refactoring techniques helped the XP team to focus on the techniques that improve the code and the XP development process in general.
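The AHP ranking step can be sketched with a toy pairwise-comparison matrix (the technique names and judgment values below are invented; the paper ranks real refactorings against internal quality attributes). The geometric mean of each row is a standard approximation of the principal eigenvector that yields the priority weights.

```python
import math

techniques = ["Extract Method", "Rename", "Move Method"]
# pairwise[i][j] = how strongly technique i is preferred over technique j
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

# Geometric-mean approximation of the principal eigenvector
geo = [math.prod(row) ** (1 / len(row)) for row in pairwise]
total = sum(geo)
weights = [g / total for g in geo]

ranking = sorted(zip(techniques, weights), key=lambda p: -p[1])
for name, w in ranking:
    print(f"{name}: {w:.3f}")   # Extract Method ranks first
```

A full AHP study would also compute a consistency ratio to check that the pairwise judgments do not contradict each other before trusting the weights.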

  8. Multivariate discrimination technique based on the Bayesian theory

    Institute of Scientific and Technical Information of China (English)

    JIN Ping; PAN Chang-zhou; XIAO Wei-guo

    2007-01-01

    A multivariate discrimination technique was established based on Bayesian theory. Using this technique, P/S ratios of different types (e.g., Pn/Sn, Pn/Lg, Pg/Sn or Pg/Lg), measured within different frequency bands and at different stations, were combined to discriminate seismic events in Central Asia. Major advantages of the Bayesian approach are that the probability that any unknown event is an explosion can be directly calculated given the measurements of a group of discriminants, while at the same time correlations among these discriminants can be fully taken into account. It was proved theoretically that the Bayesian technique is optimal: its discriminating performance is better than that of any individual discriminant, and better than that yielded by a linear combination approach that ignores correlations among discriminants. This conclusion was also validated in this paper by applying the Bayesian approach to the above-mentioned observed data.
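A simplified two-class sketch of the approach (the distribution parameters below are invented, and unlike the paper's method this version assumes the discriminants are independent rather than modelling their correlations): several P/S-ratio measurements, each modelled as Gaussian under the "explosion" and "earthquake" hypotheses, are combined into a posterior probability of an explosion.

```python
import math

def gauss(x, mean, std):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

# Hypothetical (mean, std) of log(P/S) for each discriminant under each class
explosion_model  = [(0.8, 0.3), (0.6, 0.25)]    # e.g. Pn/Lg, Pg/Lg
earthquake_model = [(-0.2, 0.3), (-0.1, 0.25)]
prior_explosion = 0.5

def posterior_explosion(measurements):
    """Bayes' rule over the two hypotheses, assuming independent discriminants."""
    like_x = math.prod(gauss(m, mu, s)
                       for m, (mu, s) in zip(measurements, explosion_model))
    like_q = math.prod(gauss(m, mu, s)
                       for m, (mu, s) in zip(measurements, earthquake_model))
    px = like_x * prior_explosion
    return px / (px + like_q * (1 - prior_explosion))

print(round(posterior_explosion([0.7, 0.5]), 3))    # high: explosion-like ratios
print(round(posterior_explosion([-0.3, -0.2]), 3))  # low: earthquake-like ratios
```

Replacing the per-discriminant Gaussians with a joint multivariate density is what lets the full method exploit the correlations that this naive version ignores.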

  9. Fabrication of thermoplastics chips through lamination based techniques.

    Science.gov (United States)

    Miserere, Sandrine; Mottet, Guillaume; Taniga, Velan; Descroix, Stephanie; Viovy, Jean-Louis; Malaquin, Laurent

    2012-04-24

    In this work, we propose a novel strategy for the fabrication of flexible thermoplastic microdevices entirely based on lamination processes. The same low-cost laminator apparatus can be used from master fabrication to microchannel sealing. This process is appropriate for rapid prototyping at laboratory scale, but it can also be easily upscaled to industrial manufacturing. For demonstration, we used Cycloolefin Copolymer (COC), a thermoplastic polymer that is extensively used for microfluidic applications. COC has good chemical resistance to common chemicals used in microfluidics, such as acids, bases and most polar solvents, and its optical quality and mechanical resistance make this material suitable for a large range of applications in chemistry or biology. As an example, the electrokinetic separation of pollutants is proposed in the present study.

  10. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    Directory of Open Access Journals (Sweden)

    Jonathan Lueke

    2011-01-01

    Full Text Available Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical systems (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving the functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability to implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would make the implanted biosensor less invasive, increasing the quality of care received by the patient.

  11. An Efficient Image Compression Technique Based on Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Prof. Rajendra Kumar Patel

    2012-12-01

    Full Text Available The rapid growth of digital imaging applications, including desktop publishing, multimedia, teleconferencing, and high visual definition, has increased the need for effective and standardized image compression techniques. Digital images play a very important role in conveying detailed information. The key obstacle for many applications is the vast amount of data required to represent a digital image directly. The various processes of digitizing images to obtain the best quality for clearer and more accurate information lead to a requirement for more storage space and better storage and access mechanisms, in the form of hardware or software. In this paper we concentrate mainly on this flaw, aiming to reduce storage while preserving image quality. State-of-the-art techniques can compress typical images from 1/10 to 1/50 of their uncompressed size without visibly affecting image quality. From our study we observe that there is a need for a good image compression technique which provides better reduction in terms of storage and quality. Arithmetic coding is among the best ways to reduce the size of encoded data. In this paper we therefore propose an arithmetic coding with Walsh transformation based image compression technique, which is an efficient way of achieving such reduction.
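
    As an illustration of the coding step the abstract singles out, the following is a toy floating-point arithmetic coder with a fixed symbol model (a sketch only; practical coders use integer renormalization, and the Walsh-transform stage is omitted):

```python
def arithmetic_encode(symbols, probs):
    """Encode a symbol sequence into a single number in [0, 1) by
    repeatedly narrowing an interval according to symbol probabilities."""
    cum, acc = {}, 0.0
    for s, p in probs.items():          # cumulative range per symbol
        cum[s] = (acc, acc + p)
        acc += p
    low, high = 0.0, 1.0
    for s in symbols:
        span = high - low
        lo, hi = cum[s]
        low, high = low + span * lo, low + span * hi
    return (low + high) / 2             # any number in the final interval

def arithmetic_decode(code, n, probs):
    """Invert the encoding: locate the sub-interval containing `code`,
    emit its symbol, then rescale and repeat n times."""
    cum, acc = {}, 0.0
    for s, p in probs.items():
        cum[s] = (acc, acc + p)
        acc += p
    out = []
    for _ in range(n):
        for s, (lo, hi) in cum.items():
            if lo <= code < hi:
                out.append(s)
                code = (code - lo) / (hi - lo)
                break
    return "".join(out)

probs = {"a": 0.5, "b": 0.3, "c": 0.2}  # fixed toy model
msg = "abacab"
code = arithmetic_encode(msg, probs)
decoded = arithmetic_decode(code, len(msg), probs)
```

    The float version loses precision on long messages; it is only meant to show why a skewed model makes the final interval, and hence the code length, smaller.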

  12. SMS Spam Filtering Technique Based on Artificial Immune System

    Directory of Open Access Journals (Sweden)

    Tarek M Mahmoud

    2012-03-01

    Full Text Available The Short Message Service (SMS) has an important economic impact for end users and service providers. Spam is a serious universal problem that causes trouble for almost all users. Several studies have been presented, including implementations of spam filters that prevent spam from reaching its destination. The Naïve Bayesian algorithm is one of the most effective approaches used in filtering techniques. The computational power of smart phones is increasing, making it increasingly possible to perform spam filtering on these devices as a mobile agent application, leading to better personalization and effectiveness. The challenge of filtering SMS spam is that the short messages often consist of few words composed of abbreviations and idioms. In this paper, we propose an anti-spam technique based on an Artificial Immune System (AIS) for filtering SMS spam messages. The proposed technique utilizes a set of features that can be used as inputs to a spam detection model. The idea is to classify a message using a trained dataset that contains phone numbers, spam words, and detectors. Our proposed technique utilizes a double collection of bulk SMS messages, spam and ham, in the training process. We describe a set of stages that help us build the dataset, such as a tokenizer, a stop-word filter, and the training process. Experimental results presented in this paper are based on the iPhone Operating System (iOS). The results on the testing messages show that the proposed system can classify SMS spam and ham accurately compared with the Naïve Bayesian algorithm.
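
    A heavily simplified sketch of the detector idea (hypothetical spam-word and phone-number detector sets standing in for the paper's trained AIS model):

```python
import re

# Hypothetical detector sets in the spirit of the abstract: spam words
# and known spam phone numbers learned from a labelled training corpus.
SPAM_WORDS = {"winner", "free", "prize", "claim", "urgent"}
SPAM_NUMBERS = {"09061701461"}

def tokenize(message):
    """Lower-case and split into alphanumeric tokens."""
    return re.findall(r"[a-z0-9]+", message.lower())

def classify(message, threshold=2):
    """Flag a message as spam when enough detectors fire; a matched
    spam number counts double as it is strong evidence."""
    tokens = tokenize(message)
    hits = sum(1 for t in tokens if t in SPAM_WORDS)
    hits += sum(2 for t in tokens if t in SPAM_NUMBERS)
    return "spam" if hits >= threshold else "ham"

label1 = classify("URGENT! You are a winner, claim your FREE prize")
label2 = classify("Are we still meeting for lunch today?")
```

    The real system additionally uses a stop-word filter and a trained detector population; this sketch only shows the matching and thresholding step.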

  13. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    Directory of Open Access Journals (Sweden)

    T. Subramani

    2014-06-01

    Full Text Available This paper presents an overview of the present state of base isolation techniques, with a brief account of other techniques developed the world over for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. A simple case study on natural base isolation using naturally available soils is presented. Also, the future areas of research are indicated. Earthquakes are one of nature's greatest hazards; throughout historic time they have caused significant loss of life and severe damage to property, especially to man-made structures. On the other hand, earthquakes provide architects and engineers with a number of important design criteria foreign to the normal design process. From well established procedures reviewed by many researchers, seismic isolation may be used to provide an effective solution for a wide range of seismic design problems. The application of base isolation techniques to protect structures against damage from earthquake attacks has been considered one of the most effective approaches and has gained increasing acceptance during the last two decades. This is because base isolation limits the effects of the earthquake attack: a flexible base largely decouples the structure from the ground motion, and the structural response accelerations are usually less than the ground acceleration. In general, the addition of viscous damping in the structure may reduce displacement and acceleration responses of the structure. This study also seeks to evaluate the effects of additional damping on the seismic response, compared with structures without additional damping, for different ground motions.

  14. Finding Within Cluster Dense Regions Using Distance Based Technique

    Directory of Open Access Journals (Sweden)

    Wesam Ashour

    2012-03-01

    Full Text Available One of the main categories in data clustering is density-based clustering. Density-based clustering techniques like DBSCAN are attractive because they can find arbitrarily shaped clusters along with noisy outliers. The main weakness of traditional density-based algorithms like DBSCAN is clustering data sets with different density levels: DBSCAN performs its calculations with given parameters applied to all points in a data set, while the densities of the data set's clusters may be totally different. The proposed algorithm overcomes this weakness of the traditional density-based algorithms. The algorithm starts by partitioning the data within a cluster into units based on a user parameter and computes the density for each unit separately. The algorithm then compares the results and merges neighboring units with close approximate density values into a new cluster. The experimental results of the simulation show that the proposed algorithm gives good results in finding clusters for data sets with clusters of different densities.
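
    The partition-and-merge idea can be sketched in one dimension as follows (the unit count and tolerance are user parameters, as in the abstract; the data are illustrative):

```python
def unit_densities(points, n_units):
    """Partition a 1-D cluster into equal-width units and compute the
    point density (count / width) of each unit."""
    lo, hi = min(points), max(points)
    width = (hi - lo) / n_units
    counts = [0] * n_units
    for p in points:
        i = min(int((p - lo) / width), n_units - 1)
        counts[i] += 1
    return [c / width for c in counts]

def merge_units(densities, tol):
    """Merge neighbouring units whose densities are within `tol` of each
    other, yielding runs of units forming density-homogeneous regions."""
    regions, current = [], [0]
    for i in range(1, len(densities)):
        if abs(densities[i] - densities[i - 1]) <= tol:
            current.append(i)
        else:
            regions.append(current)
            current = [i]
    regions.append(current)
    return regions

pts = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 5.0, 5.1, 9.8, 9.9]
dens = unit_densities(pts, n_units=3)
regions = merge_units(dens, tol=0.5)
```

    Here the first unit is dense and stays separate, while the two sparse units have matching densities and merge into one region.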

  15. Gabor-based fusion technique for Optical Coherence Microscopy.

    Science.gov (United States)

    Rolland, Jannick P; Meemon, Panomsak; Murali, Supraja; Thompson, Kevin P; Lee, Kye-sung

    2010-02-15

    We recently reported on an Optical Coherence Microscopy technique whose innovation builds intrinsically on a recently reported liquid-lens-based dynamic focusing optical probe with, by design, a 2 microm invariant lateral resolution throughout a 2 mm cubic full field of view [Murali et al., Optics Letters 34, 145-147, 2009]. We report in this paper on the image acquisition enabled by this optical probe when combined with an automatic data fusion method, developed and described here, to produce an in-focus high-resolution image throughout the imaging depth of the sample. An African frog tadpole (Xenopus laevis) was imaged with the novel probe and the Gabor-based fusion technique, demonstrating subcellular resolution over a 0.5 mm (lateral) x 0.5 mm (axial) region without the need, for the first time, for x-y translation stages, depth scanning, high-cost adaptive optics, or manual intervention. In vivo images of human skin are also presented.

  16. Establishment of safety verification method for life extension based on periodic safety review

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Soong Pyung; Yeom, Yu Son; Yoon, In Sik; Lee, Jeo Young [Chosun Univ., Gwangju (Korea, Republic of)

    2004-02-15

    Safe management of the operating lifetimes of Nuclear Power Plants is a subject of prime interest. As the design life of the Nuclear Power Plant will end in 2008, an appropriate procedure for design life re-assessment or lifetime extension is necessary in Korea. Therefore, the objective of this work is to develop procedural requirements which can be applied to the regulation of lifetime management or life extension of Nuclear Power Plants in Korea. A review of the linkage of the PSR with the extension of the operating lifetime of Nuclear Power Plants was performed to enhance the utilization of PSR results, together with an analysis of the insufficiencies in the licensing rules in Korea.

  17. DEVA: An extensible ontology-based annotation model for visual document collections

    Science.gov (United States)

    Jelmini, Carlo; Marchand-Maillet, Stephane

    2003-01-01

    The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, an early prototype that validates the DEVA features. Magritte allows users to manually annotate image collections. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.

  18. Quantum state tomography of orbital angular momentum photonics qubits via a projection-based technique

    CERN Document Server

    Nicolas, Adrien; Giacobino, Elisabeth; Maxein, Dominik; Laurat, Julien

    2014-01-01

    While measuring the orbital angular momentum state of bright light beams can be performed using imaging techniques, a full characterization at the single-photon level is challenging. For applications to quantum optics and quantum information science, such characterization is an essential capability. Here, we present a setup to perform the quantum state tomography of photonic qubits encoded in this degree of freedom. The method is based on a projective technique using spatial mode projection via fork holograms and single-mode fibers inserted into an interferometer. The alignment and calibration of the device is detailed as well as the measurement sequence to reconstruct the associated density matrix. Possible extensions to higher-dimensional spaces are discussed.

  19. Clustering economies based on multiple criteria decision making techniques

    OpenAIRE

    2011-01-01

    One of the primary concerns in many countries is to determine the important factors affecting economic growth. In this paper, we study factors such as unemployment rate, inflation ratio, population growth, average annual income, etc. to cluster different countries. The proposed model uses the analytical hierarchy process (AHP) to prioritize the criteria and then uses a K-means technique to cluster 59 countries into four groups based on the ranked criteria. The first group i...
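
    A minimal sketch of the pipeline, with hypothetical AHP weights and toy data standing in for the 59 countries:

```python
import math

def kmeans(points, k, iters=20):
    """Minimal k-means: assign each point to its nearest centroid, then
    recompute centroids, for a fixed number of passes."""
    centroids = list(points[:k])  # deterministic seed: first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[j].append(p)
        centroids = [tuple(sum(v) / len(cl) for v in zip(*cl)) if cl
                     else centroids[j] for j, cl in enumerate(clusters)]
    return clusters

# Hypothetical AHP priority weights for two criteria (say, inflation
# ratio and unemployment rate), applied before clustering so that the
# more important criterion dominates the distance computation.
weights = (0.7, 0.3)
raw = [(2.0, 5.0), (2.2, 4.8), (9.0, 12.0), (8.8, 11.5)]
scaled = [(weights[0] * a, weights[1] * b) for a, b in raw]
clusters = kmeans(scaled, k=2)
```

    Weighting the axes by the AHP priorities is one simple way to let the ranked criteria drive the clustering; the paper's exact combination scheme may differ.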

  20. High-Accurate, Physics-Based Wake Simulation Techniques

    Science.gov (United States)

    2015-01-27

    Final Technical Report, 02/25/10 - 08/31/14 (report date 1/27/2015): High-Accurate, Physics-Based Wake Simulation Techniques, N00014-10-C-0190, Andrew Shelton. A code was developed that utilizes the discontinuous Galerkin method to solve the Euler equations while utilizing a modal artificial viscosity sensor. The aim was to solve a physically accurate problem as well as to show that the sensor can account for artificial viscosity where needed but not overload the problem and "wash out..."

  1. Research on Liquidity Risk Evaluation of Chinese A-Shares Market Based on Extension Theory

    Science.gov (United States)

    Bai-Qing, Sun; Peng-Xiang, Liu; Lin, Zhang; Yan-Ge, Li

    This research defines the liquidity risk of the stock market in terms of matter-element theory and affair-element theory, establishes an indicator system for the forewarning of liquidity risks, and designs the model and process of early warning using the extension set method, the extension dependent function, and the comprehensive evaluation model. The paper then empirically studies the A-shares market through the data of 1A0001, which proves that the model can better describe the liquidity risk of China’s A-share market. At last, it gives the corresponding policy recommendations.

  2. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    Science.gov (United States)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

    With an upsurge in financial accounting fraud in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research and industry. The failure of the internal auditing systems of organizations to identify accounting fraud has led to the use of specialized procedures to detect financial accounting fraud, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since dealing with the large data volumes and complexities of financial data is a big challenge for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques for the detection of financial accounting fraud and proposes a framework for data-mining-based accounting fraud detection. The systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of this review show that data mining techniques such as logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.

  3. Multi-Frequency Target Detection Techniques for DVB-T Based Passive Radar Sensors

    Directory of Open Access Journals (Sweden)

    Tatiana Martelli

    2016-09-01

    Full Text Available This paper investigates the possibility to improve target detection capability in a DVB-T-based passive radar sensor by jointly exploiting multiple digital television channels broadcast by the same transmitter of opportunity. Based on the remarkable results obtained by such a multi-frequency approach using other signals of opportunity (i.e., FM radio broadcast transmissions), we propose appropriate modifications to the previously devised signal processing techniques for them to be effective in the newly considered scenarios. The resulting processing schemes are extensively applied against experimental DVB-T-based passive radar data pertaining to different surveillance applications. The obtained results clearly show the effectiveness of the proposed multi-frequency approaches and demonstrate their suitability for application in the considered scenarios.

  4. An Exploration of Participative Motivations in a Community-Based Online English Extensive Reading Contest with Respect to Gender Difference

    Science.gov (United States)

    Liu, I-Fan; Young, Shelley S. -C.

    2017-01-01

    The purpose of this study is to describe an online community-based English extensive reading contest to investigate whether the participants' intrinsic, extrinsic, and interpersonal motivations and learning results show significant gender differences. A total of 501 valid questionnaires (285 females and 216 males) from Taiwanese high school…

  5. Eat, Grow, Lead 4-H: An Innovative Approach to Deliver Campus- Based Field Experiences to Pre-Entry Extension Educators

    Science.gov (United States)

    Weeks, Penny Pennington; Weeks, William G.

    2012-01-01

    Eat, Grow, Lead 4-H Club was created as a pilot program for college students seeking to gain experience as non-formal youth educators, specifically serving pre-entry level Extension educators through a university-based 4-H club. Seventeen student volunteers contributed an estimated 630 hours of service to the club during spring 2011. The club…

  6. NEW VERSATILE CAMERA CALIBRATION TECHNIQUE BASED ON LINEAR RECTIFICATION

    Institute of Scientific and Technical Information of China (English)

    Pan Feng; Wang Xuanyin

    2004-01-01

    A new versatile camera calibration technique for machine vision using off-the-shelf cameras is described. To address the large distortion of off-the-shelf cameras, a new camera distortion rectification technology based on line rectification is proposed. A full camera distortion model is introduced and a linear algorithm is provided to obtain the solution. After rectification, the camera's intrinsic and extrinsic parameters are obtained based on the relationship between the homography and the absolute conic. This technology needs neither a high-accuracy three-dimensional calibration block, nor a complicated translation or rotation platform. Both simulations and experiments show that this method is effective and robust.

  7. User Identification Detector Based on Power of R Technique

    Institute of Scientific and Technical Information of China (English)

    WANG Chun-jiang; YU Quan; LIU Yuan-an

    2005-01-01

    To avoid inaccurate estimation of the number of active users and the corresponding performance degradation, a novel POR-based User Identification Detector (UID) is proposed for Code Division Multiple Access (CDMA) systems. The new detector adopts the Power of R (POR) technique and the Multiple Signal Classification (MUSIC) method; it does not require estimation of the number of active users, and obtains a lower false alarm probability than the subspace-based UID in multipath channels. However, our analysis shows that increasing the order m does not improve the performance; therefore, the performance of the new detector is maximal when m is one.

  8. Comparing Four Touch-Based Interaction Techniques for an Image-Based Audience Response System

    NARCIS (Netherlands)

    Jorritsma, Wiard; Prins, Jonatan T.; van Ooijen, Peter M. A.

    2015-01-01

    This study aimed to determine the most appropriate touch-based interaction technique for I2Vote, an image-based audience response system for radiology education in which users need to accurately mark a target on a medical image. Four plausible techniques were identified: land-on, take-off, zoom-poin

  9. Herd-scale measurements of methane emissions from cattle grazing extensive sub-tropical grasslands using the open-path laser technique.

    Science.gov (United States)

    Tomkins, N W; Charmley, E

    2015-12-01

    Methane (CH4) emissions associated with beef production systems in northern Australia are yet to be quantified. Methodologies are available to measure emissions, but application in extensive grazing environments is challenging. A micrometeorological methodology for estimating herd-scale emissions using an indirect open-path spectroscopic technique and an atmospheric dispersion model is described. The methodology was deployed on five cattle properties across Queensland and the Northern Territory, with measurements conducted on two occasions at one site. On each deployment, data were collected every 10 min for up to 7 h a day over 4 to 16 days. To increase the atmospheric concentration of CH4 to measurable levels, cattle were confined to a known area around water points from ~0800 to 1600 h, during which time measurements of wind statistics and line-averaged CH4 concentration were taken. Filtering to remove erroneous data accounted for 35% of total observations. For five of the six deployments, CH4 emissions were within the expected range of 0.4 to 0.6 g/kg BW. At one site, emissions were ~2 times expected values. There was small but consistent variation with time of day, although for some deployments measurements taken early in the day tended to be higher than at other times. There was a weak linear relationship (R2 = 0.47) between animal BW and CH4 emission per kg BW. Where it was possible to compare emissions in the early and late dry season at one site, it was speculated that the higher emissions in the late dry season may be attributed to poorer diet quality. It is concluded that the micrometeorological methodology using open-path lasers can be successfully deployed in extensive grazing conditions to directly measure CH4 emissions from cattle at a herd scale.

  10. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    Science.gov (United States)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the proximity and nearness concept is not modelled appropriately, and these services search only by address matching based on descriptive data. In addition there are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, to search places based on their location, non-point representation of results, as well as display of search results based on their priority.
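
    A minimal sketch of the nearness modelling, with an assumed linear membership function and a minimum-based fuzzy overlay (the distances and cut-off values are illustrative, not the paper's):

```python
def near(dist, full=200.0, zero=1000.0):
    """Fuzzy 'nearness': membership 1 within `full` metres, falling
    linearly to 0 at `zero` metres (an assumed membership function)."""
    if dist <= full:
        return 1.0
    if dist >= zero:
        return 0.0
    return (zero - dist) / (zero - full)

def fuzzy_overlay(memberships):
    """Fuzzy AND overlay: a candidate must be near *all* the referenced
    places, so take the minimum membership across the distance maps."""
    return min(memberships)

# Candidate locations scored by nearness to two places, e.g. a query
# like "near the station and near the park" (distances in metres).
candidates = {
    "A": (150.0, 300.0),
    "B": (600.0, 650.0),
    "C": (100.0, 1200.0),
}
scores = {k: fuzzy_overlay([near(d) for d in dists])
          for k, dists in candidates.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
```

    The graded scores give a natural priority ordering for display, which is exactly the capability the abstract claims over crisp address matching.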

  11. A DIFFERENT WEB-BASED GEOCODING SERVICE USING FUZZY TECHNIQUES

    Directory of Open Access Journals (Sweden)

    P. Pahlavani

    2015-12-01

    Full Text Available Geocoding – the process of finding a position based on descriptive data such as an address or postal code – is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the proximity and nearness concept is not modelled appropriately, and these services search only by address matching based on descriptive data. In addition there are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, to search places based on their location, non-point representation of results, as well as display of search results based on their priority.

  12. A new extension algorithm for cubic B-splines based on minimal strain energy

    Institute of Scientific and Technical Information of China (English)

    MO Guo-liang; ZHAO Ya-nan

    2006-01-01

    Extension of a B-spline curve or surface is a useful function in a CAD system. This paper presents an algorithm for extending cubic B-spline curves or surfaces to one or more target points. To keep the extension curve segment GC2-continuous with the original one, a family of cubic polynomial interpolation curves can be constructed. One curve is chosen as the solution from a sub-class of such a family by setting one GC2 parameter to zero and determining the second GC2 parameter by minimizing the strain energy. To simplify the final curve representation, the extension segment is reparameterized to achieve C2-continuity with the given B-spline curve, and then knot removal from the curve is done. As a result, a sub-optimized solution subject to the given constraints and criteria is obtained. Additionally, the new control points of the extension B-spline segment can be determined by solving lower triangular linear equations. Some computing examples comparing our method and other methods are given.
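
    As a simplified illustration of extending a curve to a target point with C2 data at the join (the paper's GC2 parameter family and strain-energy minimization are omitted; here the free coefficient is fixed by interpolating the target directly):

```python
def c2_extension(p0, d1, d2, target):
    """Coefficients (a, b, c, d) of the cubic  f(t) = a + b*t + c*t^2 + d*t^3
    on t in [0, 1] that starts at p0 with first/second derivatives d1, d2
    (C2 data taken from the end of the original curve) and ends at target."""
    a = p0            # f(0)  = p0
    b = d1            # f'(0) = d1
    c = d2 / 2.0      # f''(0) = 2c = d2
    d = target - a - b - c   # from the end condition f(1) = target
    return a, b, c, d

def eval_cubic(coef, t):
    a, b, c, d = coef
    return a + b * t + c * t * t + d * t ** 3

# End data of an existing curve segment (assumed values) and a target.
coef = c2_extension(p0=1.0, d1=2.0, d2=-1.0, target=4.0)
```

    Run per coordinate, this yields one C2 extension segment; the paper instead keeps a free GC2 parameter and picks the member of the family with minimal strain energy.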

  13. Designing a Competency-Based New County Extension Personnel Training Program: A Novel Approach

    Science.gov (United States)

    Brodeur, Cheri Winton; Higgins, Cynthia; Galindo-Gonzalez, Sebastian; Craig, Diane D.; Haile, Tyann

    2011-01-01

    Voluntary county personnel turnover occurs for a multitude of reasons, including the lack of job satisfaction, organizational commitment, and job embeddedness and lack of proper training. Loss of personnel can be costly both economically and in terms of human capital. Retention of Extension professionals can be improved through proper training or…

  14. Tools and Techniques for Wt1-Based Lineage Tracing.

    Science.gov (United States)

    Wilm, Bettina; Muñoz-Chapuli, Ramon

    2016-01-01

    The spatiotemporal expression pattern of Wt1 has been extensively studied in a number of animal models to establish its function and the developmental fate of the cells expressing this gene. In this chapter, we review the available animal models for Wt1-expressing cell lineage analysis, including direct Wt1 expression reporters and systems for permanent Wt1 lineage tracing. We describe the presently used constitutive or inducible genetic lineage tracing approaches based on the Cre/loxP system utilizing Cre recombinase expression under control of a Wt1 promoter.To make these systems accessible, we provide laboratory protocols that include dissection and processing of the tissues for immunofluorescence and histopathological analysis of the lineage-labeled Wt1-derived cells within the embryo/tissue context.

  15. Regression based peak load forecasting using a transformation technique

    Energy Technology Data Exchange (ETDEWEB)

    Haida, Takeshi; Muto, Shoichi (Tokyo Electric Power Co. (Japan). Computer and Communication Research Center)

    1994-11-01

    This paper presents a regression-based daily peak load forecasting method with a transformation technique. In order to forecast the load precisely through a year, seasonal load change, annual load growth and the latest daily load change must all be considered. To deal with these characteristics in load forecasting, a transformation technique is presented. This technique consists of a transformation function with translation and reflection methods. The transformation function is estimated from the previous year's data points, so that the function converts those data points into a set of new data points while preserving the shape of the previous year's temperature-load relationship. The function is then slightly translated so that the transformed data points fit the shape of the temperature-load relationship in the current year. Finally, multivariate regression analysis with the latest daily loads and weather observations estimates the forecasting model. Large forecasting errors caused by the nonlinear weather-load characteristic in transitional seasons such as spring and fall are reduced. Performance of the technique, verified with simulations on actual load data of Tokyo Electric Power Company, is also described.
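
    A toy sketch of the regression step, with the transformation reduced to a pure translation by the annual load growth (illustrative numbers, and a single temperature variable instead of the paper's multivariate model):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (temperature -> peak load)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def translate(loads, growth):
    """The transformation step, simplified here to a pure translation:
    shift last year's load points by the observed annual load growth so
    they match this year's level before regression."""
    return [y + growth for y in loads]

# Last year's (temperature, peak load) observations, illustrative values.
temps_last = [20.0, 25.0, 30.0, 35.0]
loads_last = [100.0, 110.0, 120.0, 130.0]

loads_adj = translate(loads_last, growth=5.0)   # adjust for load growth
a, b = fit_line(temps_last, loads_adj)
forecast = a + b * 28.0                         # forecast for 28 degrees
```

    The translation preserves the shape of the temperature-load relationship while moving it to this year's level, which is the essence of the transformation the abstract describes.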

  16. Evaluation of Teen Cuisine: An Extension-Based Cooking Program to Increase Self-efficacy in Teens

    OpenAIRE

    Petty, Heather Keyronica

    2016-01-01

    Background: Childhood, adolescent, and adult obesity is a major health and economic concern affecting the United States and various countries across the globe. Obese children and adolescents are at a potential risk for developing certain chronic diseases as they transition into adulthood. There are community-based cooking intervention programs designed ...

  17. Noninvasive in vivo glucose sensing using an iris based technique

    Science.gov (United States)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, a developed ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations are within 20% of the reference values.

  18. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

This paper presents a method that combines Markov Random Field (MRF), watershed segmentation and merging techniques to perform image segmentation and edge detection. MRF is used to obtain an initial estimate of the regions in the image under processing: in the MRF model, the gray level at pixel location i of an image X depends on the gray levels of the neighboring pixels. The process needs an initial segmentation, which is obtained by K-means clustering with the minimum-distance rule; the region process is then modeled by MRF to obtain an image containing distinct intensity regions. From this result the gradient image is computed and a watershed technique is employed. The MRF step yields an image with distinct intensity regions carrying all the edge and region information, and the watershed algorithm improves the segmentation by superimposing a closed, accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merging step based on averaged mean values is applied. The final segmentation and edge detection result is one closed boundary per actual region in the image.
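
The first two stages of such a pipeline, K-means clustering of grey levels followed by MRF relaxation, can be sketched with a small iterated-conditional-modes (ICM) pass over a Potts-style energy. The smoothness weight `beta` and all parameters are illustrative choices, and the gradient, watershed and merging stages are omitted:

```python
import numpy as np

def kmeans_1d(img, k, iters=20):
    """Initial segmentation: cluster grey levels into k intensity classes."""
    mu = np.linspace(img.min(), img.max(), k)
    labels = np.zeros(img.shape, dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(img[..., None] - mu), axis=-1)
        for c in range(k):
            if np.any(labels == c):
                mu[c] = img[labels == c].mean()
    return labels, mu

def icm(img, labels, mu, beta=1.0, sweeps=5):
    """MRF relaxation: each pixel takes the label minimising
    (intensity misfit) + beta * (number of disagreeing 4-neighbours)."""
    lab = labels.copy()
    h, w = img.shape
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                best_c, best_e = lab[i, j], np.inf
                for c in range(len(mu)):
                    e = (img[i, j] - mu[c]) ** 2
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and lab[ni, nj] != c:
                            e += beta
                    if e < best_e:
                        best_c, best_e = c, e
                lab[i, j] = best_c
    return lab
```

Each ICM sweep greedily assigns every pixel the locally optimal label, which cleans up the isolated errors left by the purely intensity-based K-means step.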

  19. Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory

    Directory of Open Access Journals (Sweden)

    Yi-Chung Lai

    2012-10-01

Full Text Available The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor drive and to correct path deviations encountered while the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system.

  20. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    Science.gov (United States)

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-10-16

The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor drive and to correct path deviations encountered while the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system.

  1. New modulation-based watermarking technique for video

    Science.gov (United States)

    Lemma, Aweke; van der Veen, Michiel; Celik, Mehmet

    2006-02-01

Successful watermarking algorithms have already been developed for various applications ranging from meta-data tagging to forensic tracking. Nevertheless, it is worthwhile to develop alternative watermarking techniques that provide a broader basis for meeting emerging services, usage models and security threats. To this end, we propose a new multiplicative watermarking technique for video, which is based on the principles of our successful MASK audio watermark. Audio-MASK embeds the watermark by modulating the short-time envelope of the audio signal and performs detection using a simple envelope detector followed by an SPOMF (symmetrical phase-only matched filter). Video-MASK takes a similar approach and modulates the image luminance envelope. In addition, it incorporates a simple model to account for the luminance sensitivity of the HVS (human visual system). Preliminary tests show the algorithm's transparency and robustness to lossy compression.
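
The SPOMF detector mentioned above admits a compact sketch: correlation is computed from the phase-only (magnitude-normalised) cross-power spectrum, which produces a sharp peak at the watermark's alignment offset. This 1-D version is an illustration of the filter itself, not of the full MASK embedding scheme:

```python
import numpy as np

def spomf(signal, reference):
    """Symmetrical phase-only matched filter: correlation using spectral phase only."""
    S, R = np.fft.fft(signal), np.fft.fft(reference)
    cross = S * np.conj(R)          # cross-power spectrum
    mag = np.abs(cross)
    mag[mag == 0] = 1.0             # avoid division by zero
    return np.real(np.fft.ifft(cross / mag))
```

Because every frequency bin contributes with unit magnitude, the detection peak stays narrow even when the signal amplitude is distorted, which is why phase-only correlation suits watermark detection.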

  2. Novel synchrotron based techniques for characterization of energy materials

    Energy Technology Data Exchange (ETDEWEB)

    Poulsen, H.F.; Nielsen, S.F.; Olsen, U.L.; Schmidt, S. (Risoe DTU, Materials Research Dept., Roskilde (Denmark)); Wright, J. (European Synchrotron Radiation Facility, Grenoble Cedex (France))

    2008-10-15

Two synchrotron techniques are reviewed, both based on the use of high-energy x-rays and both applicable to in situ studies of bulk materials. The first is 3DXRD microscopy, which enables 3D characterization of the position, morphology, phase, elastic strain and crystallographic orientation of the individual embedded grains in polycrystalline specimens. In favourable cases, hundreds of grains can be studied simultaneously during processing. The second is plastic strain tomography: a unique method for determining the plastic strain field within materials during processing. The potential applications of these techniques for basic and applied studies of four types of energy materials are discussed: polymer composites for wind turbines, solid oxide fuel cells, hydrogen storage materials and superconducting tapes. Furthermore, progress on new detectors aiming at improving the spatial and temporal resolution of such measurements is described. (au)

  3. An Improved Face Recognition Technique Based on Modular LPCA Approach

    Directory of Open Access Journals (Sweden)

    Mathu S.S. Kumar

    2011-01-01

Full Text Available Problem statement: A face identification algorithm based on modular localized variation by the eigen-subspace technique, also called modular localized principal component analysis, is presented in this study. Approach: The face imagery was partitioned into smaller sub-divisions from a predefined neighborhood, and these were ultimately fused to acquire many sets of features. Since some of the normal facial features of an individual do not change even when the pose and illumination differ, the proposed method manages these variations. Results: The proposed feature selection module significantly enhanced identification precision on standard face databases when compared to conventional and modular PCA techniques. Conclusion: The proposed algorithm, when compared with the conventional PCA algorithm and modular PCA, has enhanced recognition accuracy for face imagery with illumination, expression and pose variations.

  4. Efficient Identification Using a Prime-Feature-Based Technique

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Haq, Shaiq A.; Valente, Andrea

    2011-01-01

Identification of authorized train drivers through biometrics is a growing area of interest in locomotive radio remote control systems. The existing technique of password authentication is not very reliable, and potentially unauthorized personnel may operate the system on behalf of the operator. ... A fingerprint identification system, implemented on PC/104-based real-time systems, can accurately identify the operator. Traditionally, the uniqueness of a fingerprint is determined by the overall pattern of ridges and valleys as well as local ridge anomalies, e.g., a ridge bifurcation or a ridge ending. ... The technique presented in this paper involves identifying the most prominent feature of the fingerprint and searching only for that feature in the database to expedite the search process. The proposed architecture provides an efficient matching process, and the indexing feature used for identification is unique.

  5. New Intellectual Economized Technique on Electricity Based on DSP

    Institute of Scientific and Technical Information of China (English)

    Chang-ming LI; Tao JI; Ying SUN

    2010-01-01

In order to resolve the problem of unbalanced three-phase loads and unstable voltage, an intelligent electricity-saving technique based on electromagnetic regulation and control is proposed in this paper. We choose the TMS320LF2407A as the control chip and a stepper motor as the executing agency. The equipment drives the movable contact to the assigned position on the magnetic coil quickly and accurately, and outputs a stable sine-wave voltage as the network voltage varies, using a fuzzy Proportional Integral Derivative (PID) control algorithm in incremental mode with integral separation and a dead zone. The principle of operation and the key techniques of the electromagnetic regulation and control are introduced in detail in this paper. The experimental results verify the algorithms presented in this paper.
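
The control law described, an incremental PID with integral separation and a dead zone, can be sketched as follows; the gains, thresholds and the toy plant in the usage below are illustrative, not those used on the TMS320LF2407A:

```python
class IncrementalPID:
    """Incremental PID with integral separation and a dead zone.

    Gains and band widths are illustrative, not the paper's tuning.
    """
    def __init__(self, kp, ki, kd, sep_band, dead_band):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.sep_band = sep_band      # integral action only inside this error band
        self.dead_band = dead_band    # no output change inside this error band
        self.e1 = self.e2 = 0.0      # e[k-1], e[k-2]

    def step(self, error):
        """Return the change in controller output for this sample."""
        if abs(error) < self.dead_band:               # dead zone: hold the output
            self.e2, self.e1 = self.e1, error
            return 0.0
        ki = self.ki if abs(error) < self.sep_band else 0.0   # integral separation
        du = (self.kp * (error - self.e1)             # proportional, on error change
              + ki * error                            # integral, incremental form
              + self.kd * (error - 2 * self.e1 + self.e2))    # derivative
        self.e2, self.e1 = self.e1, error
        return du
```

In incremental form the controller outputs a change `du` that is accumulated into the actuator command (here, what would drive the stepper motor), which makes the dead zone and bumpless operation straightforward.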

  6. In patients with extensive subcutaneous emphysema, which technique achieves maximal clinical resolution: infraclavicular incisions, subcutaneous drain insertion or suction on in situ chest drain?

    Science.gov (United States)

    Johnson, Charles H N; Lang, Sommer A; Bilal, Haris; Rammohan, Kandadai S

    2014-06-01

    A best evidence topic in cardiac surgery was written according to a structured protocol. The question addressed was: 'In patients with extensive subcutaneous emphysema, which technique achieves maximal clinical resolution: infraclavicular incisions, subcutaneous drain insertion or suction on in situ chest drain?'. Altogether more than 200 papers were found using the reported search, of which 14 represented the best evidence to answer the clinical question. The authors, journal, date and country of publication, patient group studied, study type, relevant outcomes and results of these papers are tabulated. Subcutaneous emphysema is usually a benign, self-limiting condition only requiring conservative management. Interventions are useful in the context of severe patient discomfort, respiratory distress or persistent air leak. In the absence of any comparative study, it is not possible to choose definitively between infraclavicular incisions, drain insertion and increasing suction on an in situ drain as the best method for managing severe subcutaneous emphysema. All the three techniques described have been shown to provide effective relief. Increasing suction on a chest tube already in situ provided rapid relief in patients developing SE following pulmonary resection. A retrospective study showed resolution in 66%, increasing to 98% in those who underwent video-assisted thoracic surgery with identification and closure of the leak. Insertion of a drain into the subcutaneous tissue also provided rapid sustained relief. Several studies aided drainage by using regular compressive massage. Infraclavicular incisions were also shown to provide rapid relief, but were noted to be more invasive and carried the potential for cosmetic defect. No major complications were illustrated.

  7. A New Rerouting Technique for the Extensor Pollicis Longus in Palliative Treatment for Wrist and Finger Extension Paralysis Resulting From Radial Nerve and C5C6C7 Root Injury.

    Science.gov (United States)

    Laravine, Jennifer; Cambon-Binder, Adeline; Belkheyar, Zoubir

    2016-03-01

    Wrist and finger extension paralysis is a consequence of an injury to the radial nerve or the C5C6C7 roots. Despite these 2 different levels of lesions, palliative treatment for this type of paralysis depends on the same tendon transfers. A large majority of the patients are able to compensate for a deficiency of the extension of the wrist and fingers. However, a deficiency in the opening of the first web space, which could be responsible for transfers to the abductor pollicis longus, the extensor pollicis brevis, and the extensor pollicis longus (EPL), frequently exists. The aim of this work was to evaluate the feasibility of a new EPL rerouting technique outside of Lister's tubercle. Another aim was to verify whether this technique allows a better opening of the thumb-index pinch in this type of paralysis. In the first part, we performed an anatomic study comparing the EPL rerouting technique and the frequently used technique for wrist and finger extension paralyses. In the second part, we present 2 clinical cases in which this new technique will be practiced. Preliminary results during this study favor the EPL rerouting technique. This is a simple and reproducible technique that allows for good opening of the first web space in the treatment of wrist and finger extension paralysis.

  8. Feature-based multiresolution techniques for product design

    Institute of Scientific and Technical Information of China (English)

    LEE Sang Hun; LEE Kunwoo

    2006-01-01

    3D computer-aided design (CAD) systems based on feature-based solid modelling technique have been widely spread and used for product design. However, when part models associated with features are used in various downstream applications,simplified models in various levels of detail (LODs) are frequently more desirable than the full details of the parts. In particular,the need for feature-based multiresolution representation of a solid model representing an object at multiple LODs in the feature unit is increasing for engineering tasks. One challenge is to generate valid models at various LODs after an arbitrary rearrangement of features using a certain LOD criterion, because composite Boolean operations consisting of union and subtraction are not commutative. The other challenges are to devise proper topological framework for multiresolution representation, to suggest more reasonable LOD criteria, and to extend applications. This paper surveys the recent research on these issues.

  9. Assessment of Urban Ecosystem Health Based on Entropy Weight Extension Decision Model in Urban Agglomeration

    OpenAIRE

    Qian Yang; Aiwen Lin; Zhenzhen Zhao; Ling Zou; Cheng Sun

    2016-01-01

    Urban ecosystem health evaluation can assist in sustainable ecological management at a regional level. This study examined urban agglomeration ecosystem health in the middle reaches of the Yangtze River with entropy weight and extension theories. The model overcomes information omissions and subjectivity problems in the evaluation process of urban ecosystem health. Results showed that human capital and education, economic development level as well as urban infrastructure have a significant ef...

  10. Research on Deep Joints and Lode Extension Based on Digital Borehole Camera Technology

    Directory of Open Access Journals (Sweden)

    Han Zengqiang

    2015-09-01

Full Text Available Structural characteristics of the rock and orebody at depth are obtained by borehole camera technology. Through investigation of the joints and fissures in the Shapinggou molybdenum mine, the dominant orientations of joints and fissures in the surrounding rock and orebody were statistically analyzed. Applying the theories of metallogeny and geostatistics, the relationship between joint fissures and the lode's extension direction is explored. The results indicate that joints in the orebody of the ZK61 borehole have only one dominant orientation, SE126° ∠68°, whereas the dominant orientations of joints in the surrounding rock were SE118° ∠73°, SW225° ∠70° and SE122° ∠65°, NE79° ∠63°. A preliminary conclusion is that the lode's extension direction is specific and is influenced by the joints of the surrounding rock. Results from other boreholes generally agree well with those from ZK61, suggesting the analysis reliably reflects the lode's extension properties and provides important references for deep ore prospecting.

  11. Transformer-based design techniques for oscillators and frequency dividers

    CERN Document Server

    Luong, Howard Cam

    2016-01-01

    This book provides in-depth coverage of transformer-based design techniques that enable CMOS oscillators and frequency dividers to achieve state-of-the-art performance.  Design, optimization, and measured performance of oscillators and frequency dividers for different applications are discussed in detail, focusing on not only ultra-low supply voltage but also ultra-wide frequency tuning range and locking range.  This book will be an invaluable reference for anyone working or interested in CMOS radio-frequency or mm-Wave integrated circuits and systems.

  12. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures, sponsored by the Danish Technical Research Council. The planned ... contents and the requirements for the project prior to its start are described, together with the results obtained during the 3-year period of the project. The project was mainly carried out as a Ph.D. project by the first author from September 1994 to August 1997 in cooperation with associate professor Rune
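
The core of the random decrement technique is easy to sketch: segments of the measured response that start at a common triggering condition (here, an upward level crossing) are averaged, so the contribution of the random excitation cancels and an estimate of the free-decay signature remains. The trigger choice and segment length below are illustrative:

```python
import numpy as np

def random_decrement(x, level, seg_len):
    """Average the segments of x that start at upward crossings of `level`."""
    starts = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
    starts = starts[starts + seg_len <= len(x)]
    if len(starts) == 0:
        raise ValueError("no trigger points found")
    segments = np.stack([x[s:s + seg_len] for s in starts])
    return segments.mean(axis=0)   # random input averages out; free decay remains
```

Modal parameters (natural frequencies and damping ratios) are then extracted from the averaged signature, for instance by fitting decaying sinusoids to it.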

  13. A Comparative Analysis of Exemplar Based and Wavelet Based Inpainting Technique

    Directory of Open Access Journals (Sweden)

    Vaibhav V Nalawade

    2012-06-01

Full Text Available Image inpainting is the process of filling in missing regions of an image so as to preserve its overall continuity. Image inpainting is the manipulation and modification of an image in a form that is not easily detected. Digital image inpainting is a relatively new area of research, but numerous and different approaches to the inpainting problem have been proposed since the concept was first introduced. This paper compares two separate techniques, viz., exemplar-based inpainting and wavelet-based inpainting, each portraying a different set of characteristics. The algorithms analyzed under the exemplar technique are large-object removal by exemplar-based inpainting (Criminisi's) and modified exemplar inpainting (Cheng's). The algorithm analyzed under wavelet is Chen's visual image inpainting method. A number of examples on real and synthetic images are demonstrated to compare the results of the different algorithms using both qualitative and quantitative parameters.

  14. PDE-based nonlinear diffusion techniques for denoising scientific and industrial images: an empirical study

    Science.gov (United States)

    Weeratunga, Sisira K.; Kamath, Chandrika

    2002-05-01

Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, we focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. We complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. We explore the effects of various parameters, such as the choice of diffusivity function, explicit versus implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. We also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. Our empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
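
An explicit scheme of the kind benchmarked in such studies can be sketched as follows, using the classic Perona-Malik diffusivity g(s) = 1/(1 + (s/kappa)^2). The parameters and the periodic boundary handling are simplifications for illustration:

```python
import numpy as np

def perona_malik(img, n_iter=30, kappa=0.2, dt=0.2):
    """Explicit nonlinear diffusion: smooth where gradients are small,
    preserve edges where they are large (periodic boundary for brevity)."""
    u = img.astype(float).copy()
    g = lambda d: 1.0 / (1.0 + (d / kappa) ** 2)   # Perona-Malik diffusivity
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u            # differences to the 4 neighbours
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

With `kappa` set between the typical noise gradient and the edge contrast, the noise is smoothed away while the edge diffuses very little, which is exactly the behaviour the paper evaluates; `dt` must stay below 0.25 for this explicit 4-neighbour scheme to remain stable.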

  15. Galaxy Cluster Mass Reconstruction Project: I. Methods and first results on galaxy-based techniques

    CERN Document Server

    Old, L; Pearce, F R; Croton, D; Muldrew, S I; Muñoz-Cuartas, J C; Gifford, D; Gray, M E; von der Linden, A; Mamon, G A; Merrifield, M R; Müller, V; Pearson, R J; Ponman, T J; Saro, A; Sepp, T; Sifón, C; Tempel, E; Tundo, E; Wang, Y O; Wojtak, R

    2014-01-01

    This paper is the first in a series in which we perform an extensive comparison of various galaxy-based cluster mass estimation techniques that utilise the positions, velocities and colours of galaxies. Our primary aim is to test the performance of these cluster mass estimation techniques on a diverse set of models that will increase in complexity. We begin by providing participating methods with data from a simple model that delivers idealised clusters, enabling us to quantify the underlying scatter intrinsic to these mass estimation techniques. The mock catalogue is based on a Halo Occupation Distribution (HOD) model that assumes spherical Navarro, Frenk and White (NFW) haloes truncated at R_200, with no substructure nor colour segregation, and with isotropic, isothermal Maxwellian velocities. We find that, above 10^14 M_solar, recovered cluster masses are correlated with the true underlying cluster mass with an intrinsic scatter of typically a factor of two. Below 10^14 M_solar, the scatter rises as the nu...

  16. An extension of the immersed boundary method based on the distributed Lagrange multiplier approach

    Science.gov (United States)

    Feldman, Yuri; Gulberg, Yosef

    2016-10-01

    An extended formulation of the immersed boundary method, which facilitates simulation of incompressible isothermal and natural convection flows around immersed bodies and which may be applied for linear stability analysis of the flows, is presented. The Lagrangian forces and heat sources are distributed on the fluid-structure interface. The method treats pressure, the Lagrangian forces, and heat sources as distributed Lagrange multipliers, thereby implicitly providing the kinematic constraints of no-slip and the corresponding thermal boundary conditions for immersed surfaces. Extensive verification of the developed method for both isothermal and natural convection 2D flows is provided. Strategies for adapting the developed approach to realistic 3D configurations are discussed.

  17. On combining Laplacian and optimization-based mesh smoothing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Freitag, L.A.

    1997-07-01

    Local mesh smoothing algorithms have been shown to be effective in repairing distorted elements in automatically generated meshes. The simplest such algorithm is Laplacian smoothing, which moves grid points to the geometric center of incident vertices. Unfortunately, this method operates heuristically and can create invalid meshes or elements of worse quality than those contained in the original mesh. In contrast, optimization-based methods are designed to maximize some measure of mesh quality and are very effective at eliminating extremal angles in the mesh. These improvements come at a higher computational cost, however. In this article the author proposes three smoothing techniques that combine a smart variant of Laplacian smoothing with an optimization-based approach. Several numerical experiments are performed that compare the mesh quality and computational cost for each of the methods in two and three dimensions. The author finds that the combined approaches are very cost effective and yield high-quality meshes.
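
A "smart" variant of Laplacian smoothing of the kind combined here accepts the centroid move only when it improves the worst incident element, which is what prevents the invalid meshes plain Laplacian smoothing can create. A 2D sketch follows; the mean-ratio-style quality measure is one common choice, not necessarily the author's:

```python
import numpy as np

def tri_quality(a, b, c):
    """Normalised triangle quality: 1 for equilateral, near 0 for slivers,
    negative if the triangle is inverted (signed area goes negative)."""
    area2 = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    l2 = np.sum((b - a) ** 2) + np.sum((c - b) ** 2) + np.sum((a - c) ** 2)
    return 2.0 * np.sqrt(3.0) * area2 / l2   # area2 is twice the signed area

def smart_laplacian(v, neighbours, incident):
    """Move v to the neighbour centroid only if the worst incident quality improves.

    incident: list of (b, c) vertex pairs completing each triangle around v.
    """
    old_worst = min(tri_quality(v, b, c) for b, c in incident)
    cand = np.mean(neighbours, axis=0)
    new_worst = min(tri_quality(cand, b, c) for b, c in incident)
    return cand if new_worst > old_worst else v
```

Plain Laplacian smoothing always accepts the centroid; the quality check is the "smart" guard, and replacing the centroid candidate with a local quality maximisation gives the combined schemes the article studies.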

  18. ONLINE GRINDING WHEEL WEAR COMPENSATION BY IMAGE BASED MEASURING TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    WAN Daping; HU Dejin; WU Qi; ZHANG Yonghong

    2006-01-01

Automatic compensation of grinding wheel wear in dry grinding is accomplished by an image-based online measurement method. A PC-based charge-coupled device image recognition system is designed, which detects topography changes of the grinding wheel surface. Profile data corresponding to the wear and the topography are measured using a digital image processing method. The grinding wheel wear is evaluated by analyzing the position deviation of the grinding wheel edge, and online wear compensation is achieved according to the measured results. The precise detection and automatic compensation system is integrated into an open-structure CNC curve grinding machine. A practical application was carried out to perform precision curve grinding. The experimental results confirm the benefits of the proposed techniques: the online detection accuracy is better than 5 μm, and the grinding machine provides higher precision owing to the in-process grinding wheel error compensation.

  19. Generalisation and extension of a web-based data collection system for clinical studies using Java and CORBA.

    Science.gov (United States)

    Eich, H P; Ohmann, C

    1999-01-01

Inadequate informatics support of multi-centre clinical trials leads to poor quality. In order to support a multi-centre clinical trial, a Java-based data collection system operating over the WWW and Internet was developed. In this study a generalisation and extension of this prototype has been performed: the prototype has been applied to another clinical trial, and a knowledge server based on C++ has been integrated via CORBA. The investigation and implementation of security aspects of web-based data collection are now under evaluation.

  20. Hash Based Least Significant Bit Technique For Video Steganography

    Directory of Open Access Journals (Sweden)

    Prof. Dr. P. R. Deshmukh ,

    2014-01-01

Full Text Available The hash-based least significant bit technique for video steganography deals with hiding a secret message or information within a video. Steganography is covered writing: it includes processes that conceal information within other data and also conceal the fact that a secret message is being sent. Steganography is the art of secret communication, or the science of invisible communication. In this paper a hash-based least significant bit (LSB) technique for video steganography is proposed, whose main goal is to embed secret information in a particular video file and then extract it using a stego key or password. LSB insertion is used so as to embed data in the cover video by changing only the lowest bit; this change is not visible. Data hiding is the process of embedding information in a video without degrading its perceptual quality. The proposed method uses two measures, the Peak Signal to Noise Ratio (PSNR) and the Mean Square Error (MSE), computed between the original video file and the steganographic video file over all video frames, where distortion is measured using PSNR. A hash function is used to select the positions for insertion of the bits of the secret message into the LSB bits.
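
The hash-selected LSB embedding can be sketched on a single frame, treated here as a flat list of 8-bit luminance values. The SHA-256-based position derivation is an illustrative stand-in for the paper's hash function:

```python
import hashlib

def positions(key, n_bits, n_pixels):
    """Derive n_bits distinct pixel indices from a secret key via SHA-256."""
    pos, seen, counter = [], set(), 0
    while len(pos) < n_bits:
        h = hashlib.sha256(f"{key}:{counter}".encode()).hexdigest()
        p = int(h, 16) % n_pixels
        if p not in seen:            # skip collisions so each bit gets its own pixel
            seen.add(p)
            pos.append(p)
        counter += 1
    return pos

def embed(frame, message, key):
    """Hide message bytes in the LSBs of key-selected pixels of a frame."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    out = list(frame)
    for p, b in zip(positions(key, len(bits), len(frame)), bits):
        out[p] = (out[p] & ~1) | b   # overwrite only the least significant bit
    return out

def extract(frame, n_bytes, key):
    """Recover the hidden bytes; only the key holder can find the positions."""
    bits = [frame[p] & 1 for p in positions(key, n_bytes * 8, len(frame))]
    return bytes(sum(bits[i * 8 + j] << j for j in range(8)) for i in range(n_bytes))
```

Since at most one bit per selected pixel changes, the PSNR between cover and stego frames stays high, which is the distortion measure the paper reports.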

  1. An Empirical Comparative Study of Checklist based and Ad Hoc Code Reading Techniques in a Distributed Groupware Environment

    CERN Document Server

    Akinola, Olalekan S

    2009-01-01

    Software inspection is a necessary and important tool for software quality assurance. Since it was introduced by Fagan at IBM in 1976, arguments exist as to which method should be adopted to carry out the exercise, whether it should be paper based or tool based, and what reading technique should be used on the inspection document. Extensive works have been done to determine the effectiveness of reviewers in paper based environment when using ad hoc and checklist reading techniques. In this work, we take the software inspection research further by examining whether there is going to be any significant difference in defect detection effectiveness of reviewers when they use either ad hoc or checklist reading techniques in a distributed groupware environment. Twenty final year undergraduate students of computer science, divided into ad hoc and checklist reviewers groups of ten members each were employed to inspect a medium sized java code synchronously on groupware deployed on the Internet. The data obtained were...

  2. Actual extension of sinkholes: Considerations about geophysical, geomorphological, and field inspection techniques in urban planning projects in the Ebro basin (NE Spain)

    Science.gov (United States)

    Pueyo Anchuela, Ó.; Pocoví Juan, A.; Casas-Sainz, A. M.; Ansón-López, D.; Gil-Garbi, H.

    2013-05-01

Aerial photographs, historical cartographies, and field inspection are useful tools in urban planning design on mantled karst because they permit a wide time interval to be analyzed. In the case of Zaragoza city, several works have confirmed the value of these approaches in shaping the urban planning code, and they therefore represent a promising technique. Nevertheless, some caveats should be taken into account when using this kind of information. A detailed analysis is presented comparing, in a case study from the surroundings of Zaragoza, geomorphological and historical analysis and field inspection with geophysical data. Field inspection in a noncultivated area permits detection of karst indicators below the geomorphological resolution of aerial photographs and shows results consistent with geophysical surveys. The studied case shows an inner zone, coinciding with the sinkhole mapped from aerial photographs, that correlates with changes in the position of the substratum and changes in thickness of alluvial sediments. The integrated analysis permits the definition of an external subsidence ring around the geomorphological sinkhole whose surface is twice the size of the inner zone. This outer ring is indicated by geometrical changes in GPR profiles, increases in the thickness of the shallower conductive unit toward the collapse, and small collapses on marginal cracks. These results indicate that the hazard zone linked to sinkholes extends well beyond their geomorphological expression, and that when geomorphological data are used for hazard analysis or urban planning in karstic zones, detailed analysis is needed to constrain the real sinkhole size, or safety radii should be applied around the surficial evidence.

  3. An interactive tutorial-based training technique for vertebral morphometry.

    Science.gov (United States)

    Gardner, J C; von Ingersleben, G; Heyano, S L; Chesnut, C H

    2001-01-01

The purpose of this work was to develop a computer-based procedure for training technologists in vertebral morphometry. The utility of the resulting interactive, tutorial-based training method was evaluated in this study. The training program was composed of four steps: (1) review of an online tutorial, (2) review of analyzed spine images, (3) practice in fiducial point placement and (4) testing. During testing, vertebral heights were measured from digital, lateral spine images containing osteoporotic fractures. Inter-observer measurement precision was compared between research technicians, and between technologists and radiologist. The technologists participating in this study had no prior experience in vertebral morphometry. Following completion of the online training program, good inter-observer measurement precision was seen between technologists, showing mean coefficients of variation of 2.33% for anterior, 2.87% for central and 2.65% for posterior vertebral heights. Comparisons between the technicians and the radiologist ranged from 2.19% to 3.18%. Slightly better precision values were seen with height measurements than with height ratios, and with unfractured than with fractured vertebral bodies. The findings of this study indicate that self-directed, tutorial-based training for spine image analysis is effective, resulting in good inter-observer measurement precision. The interactive, tutorial-based approach provides standardized training methods and assures consistency of instructional technique over time.
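
The inter-observer precision figures quoted (e.g. 2.33%) are coefficients of variation. The study does not spell out its formula, but one standard convention for paired measurements is the root-mean-square CV, which can be sketched as:

```python
import numpy as np

def rms_cv(obs_a, obs_b):
    """Root-mean-square coefficient of variation (%) over paired measurements.

    obs_a, obs_b: height measurements of the same vertebrae by two observers.
    """
    a, b = np.asarray(obs_a, float), np.asarray(obs_b, float)
    pair_mean = (a + b) / 2.0
    pair_sd = np.abs(a - b) / np.sqrt(2.0)   # sample sd of two values
    cv = pair_sd / pair_mean                 # per-vertebra coefficient of variation
    return 100.0 * np.sqrt(np.mean(cv ** 2))
```

Each vertebra contributes its own CV, and the RMS pooling keeps occasional large disagreements visible in the summary figure.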

  4. Enhancing the effectiveness of IST through risk-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  5. Testing of Large Diameter Fresnel Optics for Space Based Observations of Extensive Air Showers

    Science.gov (United States)

    Adams, James H.; Christl, Mark J.; Young, Roy M.

    2011-01-01

    The JEM-EUSO mission will detect extensive air showers (EAS) produced by extreme energy cosmic rays. It operates from the ISS, looking down on Earth's nighttime atmosphere to detect the nitrogen fluorescence and Cherenkov light produced by the charged particles in the EAS. The JEM-EUSO science objectives require a large field of view and sensitivity to energies below 50 EeV, and the instrument must fit within available ISS resources. The JEM-EUSO optic module uses three large diameter, thin plastic lenses with Fresnel surfaces to meet the instrument requirements. A bread-board model of the optic has been manufactured and has undergone preliminary tests. We report the results of optical performance tests and evaluate the present capability to manufacture these optical elements.

  6. AvoPlot: An extensible scientific plotting tool based on matplotlib

    Directory of Open Access Journals (Sweden)

    Nial Peters

    2014-02-01

    Full Text Available AvoPlot is a simple-to-use graphical plotting program written in Python and making extensive use of the matplotlib plotting library. It can be found at http://code.google.com/p/avoplot/. In addition to providing a user-friendly interface to the powerful capabilities of the matplotlib library, it also offers users the possibility of extending its functionality by creating plug-ins. These can import specific types of data into the interface and also provide new tools for manipulating them. In this respect, AvoPlot is a convenient platform for researchers to build their own data analysis tools on top of, as well as being a useful standalone program.

  7. A restrained-torque-based motion instructor: forearm flexion/extension-driving exoskeleton

    Science.gov (United States)

    Nishimura, Takuya; Nomura, Yoshihiko; Sakamoto, Ryota

    2013-01-01

    When learning complicated movements by ourselves, we encounter such problems as self-rightness: our self-judgment lacks detail and objectivity, which may cause us to miss the essences of a motion or even to distort them. Thus, we sometimes fall into the habit of performing inappropriate motions. To solve these problems, or at least to alleviate them as much as possible, we have been developing mechanical man-machine interfaces to support the learning of such motions as cultural gestures and sports forms. One of the promising interfaces is a wearable exoskeleton mechanical system. As a first try, we have made a prototype of a 2-link 1-DOF rotational elbow joint interface, applied to teaching extension-flexion operations with the forearm, and have found its potential for teaching the initiation and continuation of elbow flexion motions.

  8. An extensive survey of dayside diffuse aurora based on optical observations at Yellow River Station

    CERN Document Server

    Han, De-Sheng; Liu, Jian-Jun; Qiu, Qi; Keika, K; Hu, Ze-Jun; Liu, Jun-Ming; Hu, Hong-Qiao; Yang, Hui-Gen

    2016-01-01

    By using 7 years of optical auroral observations obtained at Yellow River Station (magnetic latitude $76.24\,^{\circ}$N) at Ny-Alesund, Svalbard, we performed the first extensive survey of dayside diffuse auroras (DDAs) and acquired the following observational results. (1) The DDAs can be classified into two broad categories, i.e., unstructured and structured DDAs. The unstructured DDAs are mainly distributed in the morning and afternoon, but the structured DDAs predominantly occur around magnetic local noon (MLN). (2) The unstructured DDAs observed in the morning and afternoon present obviously different properties. The afternoon ones are much more stable and seldom show pulsating properties. (3) The DDAs are more easily observed during geomagnetically quiet times. (4) The structured DDAs mainly show patchy, stripy, and irregular forms and are often pulsating and drifting. The drifting directions are mostly westward (with speed $\sim$5 km/s), but there are cases showing eastward or poleward drifting. (5) The ...

  9. PDMS microchannel fabrication technique based on microwire-molding

    Institute of Scientific and Technical Information of China (English)

    JIA YueFei; JIANG JiaHuan; MA XiaoDong; LI Yuan; HUANG HeMing; CAI KunBao; CAI ShaoXi; WU YunPeng

    2008-01-01

    Micro-flow channels are basic functional components of microfluidic chips, and every advance in their construction techniques has received worldwide attention. This article presents an uncomplicated but flexible method for the fabrication of micro-flow channels. The method mainly exploits the conventional molding capability of polydimethylsiloxane (PDMS), using widespread commercial microwires as templates. We have fabricated several conventional types of microchannels with different topological shapes as demonstrations of this flexible fabrication route, which does not depend on the stringent demands of photolithographic or microelectromechanical system (MEMS) techniques. The smooth surface, high strength, and high flexibility of the wires make it possible to create many types of topological structures of two-dimensional or three-dimensional microchannels or channel arrays. The geometric shape of the cross-section of the resulting microchannel in PDMS is the negative of that of the embedded microwire, with high fidelity if suitable measures are taken. Moreover, such a microchannel fabrication process can easily exploit the conductivity and low resistivity of the metal wire to create micro-flow devices suitable for electromagnetic control of liquid or temperature regulation in the microchannel. Furthermore, some preliminary optical analysis is provided for the observation of the resulting rounded microchannel. Based on this molding strategy, we have even made some prototypes for functional microflow applications, such as microsolenoid chips and temperature control gadgets. An experiment forming a droplet in the cross channel further confirmed the feasibility and applicability of this flexible microchannel forming technique.

  10. Detecting Molecular Properties by Various Laser-Based Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hsin, Tse-Ming [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of the dimer geometry. Direct observation of vibrational peaks and evolution of coumarin 153 in the electronic excited state was demonstrated by using the fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibration peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with the biomimetic containers-liposomes, allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interferences from glass surfaces. The result showed a wide distribution of the enzyme reactivity. Protein structural variation is one of the major reasons that are responsible for this highly heterogeneous behavior.

  11. Investigations on landmine detection by neutron-based techniques.

    Science.gov (United States)

    Csikai, J; Dóczi, R; Király, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify the antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m(2)/1.5 min; the reflection cross section of thermal neutrons rendered the determination of equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and through it the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires the investigations on the angular distribution of the 6.13 MeV gamma-ray emitted in the (16)O(n,n'gamma) reaction. These results, in addition to the identification of landmines, render the improvement of the non-intrusive neutron methods possible.

  12. Investigations on landmine detection by neutron-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Csikai, J. E-mail: csikai@delfin.klte.hu; Doczi, R.; Kiraly, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify the antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m{sup 2}/1.5 min; the reflection cross section of thermal neutrons rendered the determination of equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and through it the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires the investigations on the angular distribution of the 6.13 MeV gamma-ray emitted in the {sup 16}O(n,n'{gamma}) reaction. These results, in addition to the identification of landmines, render the improvement of the non-intrusive neutron methods possible.

  13. A New Particle Swarm Optimization Based Stock Market Prediction Technique

    Directory of Open Access Journals (Sweden)

    Essam El. Seidy

    2016-04-01

    Full Text Available Over the last years, the average person's interest in the stock market has grown dramatically. This demand has doubled with the advancement of technology that has opened up the international stock market, so that nowadays anybody can own stocks and use many types of software to pursue the desired profit with minimum risk. Consequently, the analysis and prediction of future values and trends of the financial markets have received more attention, and, due to its large application in different business transactions, stock market prediction has become a critical topic of research. In this paper, our earlier presented particle swarm optimization with center of mass technique (PSOCoM) is applied to the task of training an adaptive linear combiner to form a new stock market prediction model. This prediction model is used with some common indicators to maximize the return and minimize the risk for the stock market. The experimental results show that the proposed technique is superior to the other PSO-based models in terms of prediction accuracy.
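The particle swarm step the abstract relies on can be sketched as follows. This is a generic PSO minimizer, not the paper's PSOCoM variant with its center-of-mass term, and the inertia and acceleration weights are common illustrative choices:

```python
import random

def pso(f, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Minimize f with a basic particle swarm (generic sketch)."""
    lo, hi = bounds
    rnd = random.Random(0)                        # fixed seed for repeatability
    pos = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia / acceleration weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function: global minimum 0 at the origin.
best, best_val = pso(lambda p: sum(v * v for v in p), dim=2)
```

In the paper's setting, the position vector would hold the weights of the adaptive linear combiner and f would be the prediction error on past prices.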

  14. NVC Based Model for Selecting Effective Requirement Elicitation Technique

    Directory of Open Access Journals (Sweden)

    Md. Rizwan Beg

    2012-10-01

    Full Text Available The requirements engineering process starts from the gathering of requirements, i.e., requirements elicitation. Requirements elicitation (RE) is the basic building block for a software project and has a very high impact on the subsequent design and build phases as well. Failure to accurately capture system requirements is a major factor in the failure of most software projects. Due to the criticality and impact of this phase, it is very important to perform requirements elicitation in no less than a perfect manner. One of the most difficult jobs for the elicitor is to select an appropriate technique for eliciting the requirements. Interviewing and interacting with stakeholders during elicitation is a communication-intensive activity involving verbal and non-verbal communication (NVC). The elicitor should give emphasis to non-verbal communication along with verbal communication so that requirements are recorded more efficiently and effectively. In this paper we propose a model in which stakeholders are classified by observing non-verbal communication, and this classification is used as a base for elicitation technique selection. We also propose an efficient plan for requirements elicitation which intends to overcome the constraints faced by the elicitor.

  15. Filling-Based Techniques Applied to Object Projection Feature Estimation

    CERN Document Server

    Quesada, Luis

    2012-01-01

    3D motion tracking is a critical task in many computer vision applications. Unsupervised markerless 3D motion tracking systems determine the most relevant object in the screen and then track it by continuously estimating its projection features (center and area) from the edge image and a point inside the relevant object projection (namely, inner point), until the tracking fails. Existing object projection feature estimation techniques are based on ray-casting from the inner point. These techniques present three main drawbacks: when the inner point is surrounded by edges, rays may not reach other relevant areas; as a consequence of that issue, the estimated features may greatly vary depending on the position of the inner point relative to the object projection; and finally, increasing the number of rays being casted and the ray-casting iterations (which would make the results more accurate and stable) increases the processing time to the point the tracking cannot be performed on the fly. In this paper, we anal...
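A filling-based alternative to ray-casting can be illustrated with a plain flood fill: starting from the inner point, pixels are visited until edge pixels stop the fill, and the projection's area and center fall out of the visited set. The grid and inner point below are made-up data, not taken from the paper:

```python
from collections import deque

def fill_features(edges, inner):
    """Flood fill from `inner`, bounded by edge pixels (1s); return
    the filled area and its centroid (row, col)."""
    h, w = len(edges), len(edges[0])
    seen = [[False] * w for _ in range(h)]
    q = deque([inner])
    seen[inner[0]][inner[1]] = True
    area, sr, sc = 0, 0, 0
    while q:
        r, c = q.popleft()
        area += 1
        sr += r
        sc += c
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # 4-connectivity
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not seen[nr][nc] and edges[nr][nc] == 0:
                seen[nr][nc] = True
                q.append((nr, nc))
    return area, (sr / area, sc / area)

# A closed square edge contour with a 3x3 interior.
grid = [
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
]
area, center = fill_features(grid, (2, 2))
```

Unlike ray-casting, the result does not depend on where inside the region the inner point sits, which is one of the drawbacks the abstract identifies.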

  16. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.

  17. Astronomical Image Compression Techniques Based on ACC and KLT Coder

    Directory of Open Access Journals (Sweden)

    J. Schindler

    2011-01-01

    Full Text Available This paper deals with a compression of image data in applications in astronomy. Astronomical images have typical specific properties — high grayscale bit depth, size, noise occurrence and special processing algorithms. They belong to the class of scientific images. Their processing and compression is quite different from the classical approach of multimedia image processing. The database of images from BOOTES (Burst Observer and Optical Transient Exploring System has been chosen as a source of the testing signal. BOOTES is a Czech-Spanish robotic telescope for observing AGN (active galactic nuclei and the optical transient of GRB (gamma ray bursts searching. This paper discusses an approach based on an analysis of statistical properties of image data. A comparison of two irrelevancy reduction methods is presented from a scientific (astrometric and photometric point of view. The first method is based on a statistical approach, using the Karhunen-Loeve transform (KLT with uniform quantization in the spectral domain. The second technique is derived from wavelet decomposition with adaptive selection of used prediction coefficients. Finally, the comparison of three redundancy reduction methods is discussed. Multimedia format JPEG2000 and HCOMPRESS, designed especially for astronomical images, are compared with the new Astronomical Context Coder (ACC coder based on adaptive median regression.

  18. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    Directory of Open Access Journals (Sweden)

    Om Parkash

    2015-10-01

    Full Text Available Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed.

  19. Crop Yield Forecasted Model Based on Time Series Techniques

    Institute of Scientific and Technical Information of China (English)

    Li Hong-ying; Hou Yan-lin; Zhou Yong-juan; Zhao Hui-ming

    2012-01-01

    Traditional studies on potential yield mainly referred to attainable yield: the maximum yield which could be reached by a crop in a given environment. A new concept of crop yield under average climate conditions, which is affected by the advancement of science and technology, was defined in this paper. Based on this new concept of crop yield, time series techniques relying on past yield data were employed to set up a forecasting model. The model was tested using average grain yields of Liaoning Province in China from 1949 to 2005. The testing combined dynamic n-choosing and micro tendency rectification, and the average forecasting error was 1.24%. A turning point may occur in the trend line of yield change, in which case an inflexion model was used to handle the yield turning point.
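As a minimal illustration of forecasting from past yield data alone, the sketch below fits a trend line to the most recent observations and extrapolates one step. It is not the paper's model, which adds dynamic n-choosing, micro tendency rectification and inflexion handling:

```python
def trend_forecast(series, window=5):
    """One-step-ahead forecast from a least-squares trend line fitted
    to the last `window` observations."""
    data = series[-window:]
    n = len(data)
    xs = range(n)
    mx, my = sum(xs) / n, sum(data) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, data))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a + b * n  # extrapolate one step past the window

# Steadily rising yields (made-up t/ha figures): the fitted trend
# continues the increase.
nxt = trend_forecast([3.0, 3.2, 3.4, 3.6, 3.8])
```

The "dynamic n-choosing" of the paper would correspond to picking the window length adaptively rather than fixing it at five.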

  20. A polarization-based Thomson scattering technique for burning plasmas

    CERN Document Server

    Parke, E; Hartog, D J Den

    2013-01-01

    The traditional Thomson scattering diagnostic is based on measurement of the wavelength spectrum of scattered light, where electron temperature measurements are inferred from thermal broadening of the scattered laser light. At sufficiently high temperatures, especially those predicted for ITER and other burning plasmas, relativistic effects cause a change in the polarization state of the scattered photons. The resulting depolarization of the scattered light is temperature dependent and has been proposed elsewhere as a potential alternative to the traditional spectral decomposition technique. Following similar work, we analytically calculate the degree of polarization for incoherent Thomson scattering. For the first time, we obtain exact results valid for the full range of incident laser polarization states and electron temperatures. While previous work focused only on linear polarization, we show that circularly polarized incident light optimizes the degree of depolarization for a wide range of temperatures r...

  1. A 3-Level Secure Histogram Based Image Steganography Technique

    Directory of Open Access Journals (Sweden)

    G V Chaitanya

    2013-04-01

    Full Text Available Steganography is an art that involves communication of secret data in an appropriate carrier, e.g., images, audio or video, with a goal to hide the very existence of the embedded data so as not to arouse an eavesdropper's suspicion. In this paper, a steganographic technique with a high level of security and a data hiding capacity close to 20% of the cover image data has been developed. An adaptive and matched bit replacement method is used based on the sensitivity of the Human Visual System (HVS) at different intensities. The proposed algorithm ensures that the generated stego image has a PSNR greater than 38.5 and is also resistant to visual attack. A three level security is infused into the algorithm which makes data retrieval from the stego image possible only in case of having all the right keys.
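The bit-replacement step can be illustrated with a plain one-bit-per-pixel LSB sketch. The paper's method goes further, adapting the number of replaced bits to HVS sensitivity per intensity band and adding three key levels, all of which are omitted here:

```python
def embed(pixels, bits):
    """Replace each pixel's least significant bit with one message bit."""
    out = pixels[:]
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | int(b)
    return out

def extract(pixels, n):
    """Read the first n hidden bits back out of the pixel LSBs."""
    return ''.join(str(p & 1) for p in pixels[:n])

# Made-up 8-bit grayscale cover pixels and an 8-bit secret message.
cover = [120, 121, 122, 123, 124, 125, 126, 127]
stego = embed(cover, '10110010')
```

Each pixel changes by at most 1 gray level, which is why LSB-style replacement is visually imperceptible at reasonable capacities.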

  2. Proposed Arabic Text Steganography Method Based on New Coding Technique

    Directory of Open Access Journals (Sweden)

    Assist. prof. Dr. Suhad M. Kadhem

    2016-09-01

    Full Text Available Steganography is one of the important fields of information security that depends on hiding secret information in a cover medium (video, image, audio, text) such that an unauthorized person fails to realize its existence. Run-length encoding (RLE) is a lossless data compression technique used for files that contain much redundant data. Sometimes the RLE output will be expanded rather than compressed, and this is the main problem of RLE. In this paper we use a new coding method whose output contains sequences of ones with few zeros, so that the modified RLE proposed in this paper is suitable for compression; finally, we employ the modified RLE output for steganography purposes, based on Unicode and non-printed characters, to hide the secret information in an Arabic text.
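For reference, classic RLE, the step the paper modifies, can be sketched as follows; the proposed recoding that reshapes the input into long runs of ones, and the Unicode-based hiding step, are not reproduced here:

```python
def rle_encode(text):
    """Encode a string as (symbol, run length) pairs."""
    runs = []
    for ch in text:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((ch, 1))              # start a new run
    return runs

def rle_decode(runs):
    """Invert rle_encode."""
    return ''.join(ch * count for ch, count in runs)

encoded = rle_encode('1111100011111111')  # long runs compress well
```

On input with short runs (e.g. '010101') the pair list is longer than the input, which is exactly the expansion problem the abstract describes.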

  3. Whitelists Based Multiple Filtering Techniques in SCADA Sensor Networks

    Directory of Open Access Journals (Sweden)

    DongHo Kang

    2014-01-01

    Full Text Available The Internet of Things (IoT) consists of several tiny devices connected together to form a collaborative computing environment. Recently, IoT technologies have begun to merge with supervisory control and data acquisition (SCADA) sensor networks to more efficiently gather and analyze real-time data from sensors in industrial environments. But SCADA sensor networks are becoming more and more vulnerable to cyber-attacks due to increased connectivity. To safely adopt IoT technologies in SCADA environments, it is important to improve the security of SCADA sensor networks. In this paper we propose a multiple filtering technique based on whitelists to detect illegitimate packets. Our proposed system detects the traffic of network and application protocol attacks with a set of whitelists collected from normal traffic.
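The multiple-filtering idea can be sketched as two chained whitelist checks, one on the network tuple and one on the application-level command. The field names and whitelist contents below are illustrative, not taken from the paper:

```python
def make_filter(net_whitelist, cmd_whitelist):
    """Build a two-stage packet check: network tuple first, then the
    application-level command; both must be whitelisted."""
    def is_legitimate(pkt):
        if (pkt['src'], pkt['dport']) not in net_whitelist:
            return False                     # stage 1: unknown source/port pair
        return pkt['cmd'] in cmd_whitelist   # stage 2: unknown command
    return is_legitimate

check = make_filter(
    net_whitelist={('10.0.0.5', 502)},       # a known master on the Modbus port
    cmd_whitelist={'READ_COILS', 'READ_REGISTERS'},
)
ok = check({'src': '10.0.0.5', 'dport': 502, 'cmd': 'READ_COILS'})
bad_src = check({'src': '10.0.0.9', 'dport': 502, 'cmd': 'READ_COILS'})
bad_cmd = check({'src': '10.0.0.5', 'dport': 502, 'cmd': 'WRITE_COIL'})
```

Because the whitelists are learned from normal traffic, both an unknown host and a known host issuing an unusual command are rejected.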

  4. Clustering economies based on multiple criteria decision making techniques

    Directory of Open Access Journals (Sweden)

    Mansour Momeni

    2011-10-01

    Full Text Available One of the primary concerns in many countries is to determine the important factors affecting economic growth. In this paper, we study factors such as unemployment rate, inflation ratio, population growth, average annual income, etc. to cluster different countries. The proposed model of this paper uses the analytical hierarchy process (AHP) to prioritize the criteria and then uses a K-means technique to cluster 59 countries based on the ranked criteria into four groups. The first group includes countries with high standards such as Germany and Japan. In the second cluster, there are some developing countries with relatively good economic growth such as Saudi Arabia and Iran. The third cluster belongs to countries with faster rates of growth compared with the countries located in the second group, such as China, India and Mexico. Finally, the fourth cluster includes countries with relatively very low rates of growth such as Jordan, Mali and Niger.
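The clustering stage can be sketched with plain k-means on made-up two-indicator data; the paper additionally weights the indicators by AHP-derived priorities before clustering, which is omitted here:

```python
def kmeans(points, k, iters=20):
    """Plain k-means with deterministic seeding (first k points)."""
    centers = [list(p) for p in points[:k]]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared distance).
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # Recompute centers as group means; keep old center if a group empties.
        centers = [[sum(v) / len(g) for v in zip(*g)] if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

# Made-up (growth, inflation) indicators for four economies: two
# low-growth/high-inflation and two high-growth/low-inflation.
economies = [(1.0, 9.0), (1.2, 8.5), (8.0, 2.0), (8.5, 1.5)]
centers, groups = kmeans(economies, k=2)
```

With AHP weighting, each coordinate would first be multiplied by its priority so that more important criteria dominate the distance.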

  5. An RSS based location estimation technique for cognitive relay networks

    KAUST Repository

    Qaraqe, Khalid A.

    2010-11-01

    In this paper, a received signal strength (RSS) based location estimation method is proposed for a cooperative wireless relay network where the relay is a cognitive radio. We propose a method for the considered cognitive relay network to determine the location of the source using the direct and the relayed signal at the destination. We derive the Cramer-Rao lower bound (CRLB) expressions separately for x and y coordinates of the location estimate. We analyze the effects of cognitive behaviour of the relay on the performance of the proposed method. We also discuss and quantify the reliability of the location estimate using the proposed technique if the source is not stationary. The overall performance of the proposed method is presented through simulations. ©2010 IEEE.
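The RSS-ranging idea underlying such methods can be illustrated with the textbook log-distance path-loss model. The reference power P0 and path-loss exponent n below are assumed values; the paper's actual method combines the direct and relayed signals at the destination and derives CRLBs rather than inverting a single link:

```python
def rss_to_distance(rss_dbm, p0_dbm=-40.0, n=2.0):
    """Invert the log-distance path-loss model
    RSS(d) = P0 - 10*n*log10(d) to estimate distance in metres,
    with P0 the received power at the 1 m reference distance."""
    return 10 ** ((p0_dbm - rss_dbm) / (10.0 * n))

# 20 dB below the 1 m reference with exponent n = 2 corresponds to 10 m.
d = rss_to_distance(-60.0)
```

Several such range estimates from different anchors (here, the relay and the destination) are what get combined into an (x, y) location estimate.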

  6. Hierarchical Spread Spectrum Fingerprinting Scheme Based on the CDMA Technique

    Directory of Open Access Journals (Sweden)

    Kuribayashi Minoru

    2011-01-01

    Full Text Available Digital fingerprinting is a method to insert a user's own ID into digital contents in order to identify illegal users who distribute unauthorized copies. One of the serious problems in a fingerprinting system is the collusion attack, in which several users combine their copies of the same content to modify or delete the embedded fingerprints. In this paper, we propose a collusion-resistant fingerprinting scheme based on the CDMA technique. Our fingerprint sequences are orthogonal sequences of DCT basis vectors modulated by a PN sequence. In order to increase the number of users, a hierarchical structure is produced by assigning a pair of the fingerprint sequences to a user. Under the assumption that the frequency components of detected sequences modulated by the PN sequence follow a Gaussian distribution, the design of thresholds and the weighting of parameters are studied to improve the performance. The robustness against collusion attack and the computational costs required for the detection are estimated in our simulation.
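The CDMA mechanism behind the scheme can be sketched with direct-sequence spreading and correlation despreading; the short Walsh-like codes below are a simple stand-in for the paper's PN-modulated DCT basis sequences:

```python
def spread(bits, code):
    """Spread ±1 data bits: each bit is multiplied by the chip code."""
    return [b * c for b in bits for c in code]

def despread(signal, code):
    """Correlate each code-length chunk with the code and take the sign."""
    L = len(code)
    out = []
    for i in range(0, len(signal), L):
        corr = sum(s * c for s, c in zip(signal[i:i + L], code))
        out.append(1 if corr >= 0 else -1)
    return out

code_a = [1, 1, 1, 1]
code_b = [1, -1, 1, -1]  # orthogonal to code_a
# Two users' fingerprints superimposed into one signal, as happens when
# colluders average or add their marked copies:
mixed = [x + y for x, y in zip(spread([1, -1], code_a),
                               spread([-1, -1], code_b))]
```

Because the codes are orthogonal, correlating the mixed signal with either user's code still recovers that user's bits, which is what makes the detector collusion-resistant.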

  7. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the instantaneous rockburst process in granite. Based on the PIV (Particle Image Velocimetry) technique, quantitative analysis of a rockburst is possible: images of the tracer particles, displacement and strain fields can be obtained, and the debris trajectory described. According to observations from on-site tests, the dynamic rockburst is actually a gas-solid high speed flow process, caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high speed video and PIV images, the granite rockburst failure process is composed of six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as: an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for our further understanding of the rockburst mechanism.

  8. Demand Management Based on Model Predictive Control Techniques

    Directory of Open Access Journals (Sweden)

    Yasser A. Davizón

    2014-01-01

    Full Text Available Demand management (DM) is the process that helps companies to sell the right product to the right customer, at the right time, and for the right price. Therefore the challenge for any company is to determine how much to sell, at what price, and to which market segment, while maximizing its profits. DM also helps managers efficiently allocate undifferentiated units of capacity to the available demand with the goal of maximizing revenue. This paper introduces a control system approach to demand management with dynamic pricing (DP) using the model predictive control (MPC) technique. In addition, we present a proper dynamical system analogy based on active suspension, and a stability analysis is provided via the Lyapunov direct method.

  9. SAR IMAGE ENHANCEMENT BASED ON BEAM SHARPENING TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    LI Yong; ZHANG Kun-hui; ZHU Dai-yin; ZHU Zhao-da

    2004-01-01

    A major problem encountered in enhancing SAR image is the total loss of phase information and the unknown parameters of imaging system. The beam sharpening technique, combined with synthetic aperture radiation pattern estimation provides an approach to process this kind of data to achieve higher apparent resolution. Based on the criterion of minimizing the expected quadratic estimation error, an optimum FIR filter with a symmetrical structure is designed whose coefficients depend on the azimuth response of local isolated prominent points because this response can be approximately regarded as the synthetic aperture radiation pattern of the imaging system. The point target simulation shows that the angular resolution is improved by a ratio of almost two to one. The processing results of a live SAR image demonstrate the validity of the method.
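The symmetric FIR filtering step can be sketched as follows. The 3-tap sharpening kernel here is an assumed stand-in for the paper's optimal coefficients, which are instead derived from the measured azimuth response of isolated prominent points:

```python
def fir_filter(signal, coeffs):
    """Convolve a 1-D azimuth line with a symmetric FIR kernel,
    truncating the kernel at the image borders."""
    half = len(coeffs) // 2
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for j, c in enumerate(coeffs):
            k = i + j - half
            if 0 <= k < n:
                acc += c * signal[k]
        out.append(acc)
    return out

# A blurred point target: sharpening raises the peak relative to its
# neighbours, i.e., improved apparent azimuth resolution.
line = [0.0, 0.2, 0.6, 1.0, 0.6, 0.2, 0.0]
sharpened = fir_filter(line, [-0.25, 1.5, -0.25])
```

In the paper's setting, the kernel coefficients would be chosen to minimize the expected quadratic estimation error against the estimated radiation pattern, rather than fixed by hand.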

  10. A Novel Technique Based on Node Registration in MANETs

    Directory of Open Access Journals (Sweden)

    Rashid Jalal Qureshi

    2012-09-01

    Full Text Available In an ad hoc network, the communication links between nodes are wireless; each node acts as a router for the others, and packets are forwarded from one node to the next. This type of network helps solve challenges and problems that may arise in everyday communication. Mobile ad hoc networks (MANETs) are a new field of research, particularly useful in situations where a network infrastructure is costly. Protecting MANETs from security threats is a challenging task because of their dynamic topology. Every node in a MANET is independent and free to move in any direction, and therefore changes its connections to other nodes frequently. Due to this decentralized nature, different types of attacks can occur. The aim of this research paper is to investigate different MANET security attacks and to propose a node registration based technique using cryptographic functions.
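A node-registration scheme of this kind can be sketched with standard cryptographic functions. The use of HMAC-SHA256 and a pre-distributed network key are assumptions for illustration; the paper's exact construction may differ:

```python
import hashlib
import hmac
import os

NETWORK_KEY = os.urandom(32)  # assumption: pre-distributed to legitimate nodes

def register_node(node_id: str) -> str:
    """Issue a registration token binding the node id to the network key."""
    return hmac.new(NETWORK_KEY, node_id.encode(), hashlib.sha256).hexdigest()

def verify_node(node_id: str, token: str) -> bool:
    """Forwarding nodes accept packets only from peers with a valid token."""
    expected = hmac.new(NETWORK_KEY, node_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)
```

A token stolen from one node is useless for impersonating another, since the HMAC binds the token to the node identity.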

  11. Knee extension isometric torque production differences based on verbal motivation given to introverted and extroverted female children.

    Science.gov (United States)

    McWhorter, J Wesley; Landers, Merrill; Young, Daniel; Puentedura, E Louie; Hickman, Robbin A; Brooksby, Candi; Liveratti, Marc; Taylor, Lisa

    2011-08-01

    To date, little research has been conducted to test the efficacy of different forms of motivation based on a female child's personality type. The purpose of this study was to evaluate the ability of female children to perform a maximal knee extension isometric torque test with varying forms of motivation, based on the child's personality type (introvert vs. extrovert). The subjects were asked to perform a maximal isometric knee extension test under three different conditions: 1) with no verbal motivation, 2) with verbal motivation from the evaluator only, and 3) with verbal motivation from a group of their peers and the evaluator combined. A 2×3 mixed ANOVA showed a significant interaction (F2,62 = 17.530). The introverted group showed that scores without verbal motivation were significantly higher than with verbal motivation from the evaluator or the evaluator plus the peers. The extroverted group revealed that scores with verbal motivation from the evaluator or the evaluator plus the peers were significantly higher than without verbal motivation. Results suggest that verbal motivation has a varying effect on isometric knee extension torque production in female children with different personality types. Extroverted girls perform better with motivation, whereas introverted girls perform better without motivation from others.

  12. Streaming Media over a Color Overlay Based on Forward Error Correction Technique

    Institute of Scientific and Technical Information of China (English)

    张晓瑜; 沈国斌; 李世鹏; 钟玉琢

    2004-01-01

    The number of clients that receive high-quality streaming video from a source is greatly limited by the application requirements, such as the high bandwidth and reliability. In this work, a method was developed to construct a color overlay, which enables clients to receive data across multiple paths, based on the forward error correction technique. The color overlay enlarges system capacity by reducing the bottlenecks and extending the bandwidth, improves reliability against node failure, and is more resilient to fluctuations of network metrics. A light-weight protocol for building the overlay is also presented. Extensive simulations were conducted and the results clearly support the claimed advantages.
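The forward-error-correction idea of receiving data across multiple paths can be illustrated with a minimal XOR parity scheme (a toy stand-in for whatever FEC code the overlay actually uses): any single failed path can be recovered from the surviving ones.

```python
def encode_paths(chunks):
    """k equal-length data chunks plus one XOR parity chunk,
    each sent down a different overlay path."""
    parity = bytes(len(chunks[0]))
    for c in chunks:
        parity = bytes(a ^ b for a, b in zip(parity, c))
    return list(chunks) + [parity]

def recover(received):
    """received: chunks in order, with exactly one None for the failed path."""
    missing = received.index(None)
    size = len(next(c for c in received if c is not None))
    rec = bytes(size)                       # start from all zeros
    for i, c in enumerate(received):
        if i != missing:
            rec = bytes(a ^ b for a, b in zip(rec, c))  # XOR the survivors
    return rec
```

With k data paths and one parity path, the receiver tolerates the loss of any single path at a bandwidth overhead of 1/k.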

  13. Energy-Efficient Network Transmission between Satellite Swarms and Earth Stations Based on Lyapunov Optimization Techniques

    Directory of Open Access Journals (Sweden)

    Weiwei Fang

    2014-01-01

    Full Text Available The recent advent of satellite swarm technologies has enabled space exploration with a massive number of picoclass, low-power, and low-weight spacecraft. However, developing swarm-based satellite systems, from conceptualization to validation, is a complex multidisciplinary activity. One of the primary challenges is how to achieve energy-efficient data transmission between the satellite swarm and terrestrial terminal stations. Employing Lyapunov optimization techniques, we present an online control algorithm to optimally dispatch traffic load among different satellite-ground links for minimizing overall energy consumption over time. Our algorithm is able to independently and simultaneously make control decisions on traffic dispatching over intersatellite-links and up-down-links so as to offer provable energy and delay guarantees, without requiring any statistical information of traffic arrivals and link condition. Rigorous analysis and extensive simulations have demonstrated the performance and robustness of the proposed new algorithm.
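A hedged sketch of the drift-plus-penalty rule that underlies such Lyapunov-based traffic dispatching: in each slot, the link option maximizing queue backlog times service rate minus V times energy cost is chosen, with V trading energy against delay. The link model below is an assumption for illustration:

```python
def dispatch(queue, links, V=10.0):
    """Drift-plus-penalty link choice for one time slot.

    queue: current backlog (packets). links: available (rate, energy)
    options for this slot, including (0, 0.0) for staying idle.
    V: control knob trading energy (large V) against delay (small V).
    """
    return max(links, key=lambda rl: queue * rl[0] - V * rl[1])
```

Applied slot by slot, this greedy rule requires no traffic or channel statistics: short queues let the node idle and save energy, while long queues force transmission on the fastest link.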

  14. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    Science.gov (United States)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways of drift measurement is instrumentation-based, e.g. using an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even Europe, is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to an ionosonde, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed by using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real electron density profile is obtained on which ray tracing can be performed. These profiles can be constructed periodically, with a period as low as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift measurements over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.
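The Doppler relation underlying both approaches ties the frequency shift to the rate of change of path length, f_D = -(f/c) dL/dt. A minimal sketch (sign convention assumed: a lengthening path gives a negative shift):

```python
C = 3.0e8  # speed of light, m/s

def doppler_shift(path_len_t0, path_len_t1, dt, freq_hz):
    """Doppler shift from a changing propagation path length:
    f_D = -(f / c) * dL/dt  (a lengthening path lowers the frequency)."""
    dldt = (path_len_t1 - path_len_t0) / dt  # m/s
    return -(freq_hz / C) * dldt             # Hz
```

With two ray-traced snapshots 30 s apart, `dt=30.0` and the two computed path lengths give the drift-induced shift directly.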

  15. Advanced Multipath Mitigation Techniques for Satellite-Based Positioning Applications

    Directory of Open Access Journals (Sweden)

    Mohammad Zahidul H. Bhuiyan

    2010-01-01

    Full Text Available Multipath remains a dominant source of ranging errors in Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS) or the future European satellite navigation system Galileo. Multipath is generally considered undesirable in the context of GNSS, since the reception of multipath can significantly distort the shape of the correlation function used for time delay estimation. Some wireless communications techniques exploit multipath to provide signal diversity; in GNSS, however, the major challenge is to effectively mitigate the multipath, since we are interested only in the satellite-receiver transit time offset of the Line-Of-Sight (LOS) signal for the receiver's position estimate. Therefore, the multipath problem has been approached from several directions in order to mitigate the impact of multipath on navigation receivers, including the development of novel signal processing techniques. In this paper, we propose a maximum likelihood-based technique, namely the Reduced Search Space Maximum Likelihood (RSSML) delay estimator, which is capable of mitigating the multipath effects reasonably well at the expense of increased complexity. The proposed RSSML attempts to compensate for the multipath error contribution by performing a nonlinear curve fit on the input correlation function, which finds a perfect match from a set of ideal reference correlation functions with certain amplitude(s), phase(s) and delay(s) of the multipath signal. It also incorporates a threshold-based peak detection method, which eventually reduces the code-delay search space significantly. The main drawback of RSSML, however, is the memory required to store the reference correlation functions.
The multipath performance of other delay-tracking methods previously studied for Binary Phase Shift Keying-(BPSK- and Sine Binary Offset Carrier- (SinBOC- modulated signals is also analyzed in closed loop model with the new Composite
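The two-stage structure described above — threshold-based search-space reduction, then matching against stored reference correlation functions — can be sketched as follows (the templates and the squared-error criterion are illustrative assumptions):

```python
def rssml_delay(corr, refs, threshold=0.5):
    """corr: measured correlation samples; refs: {candidate_delay: template}.

    Step 1: threshold-based peak detection keeps only delays whose
            correlation is close to the peak (the reduced search space).
    Step 2: the candidate whose stored reference template best fits the
            measurement (least squared error) gives the delay estimate.
    """
    peak = max(corr)
    candidates = [d for d in refs if corr[d] >= threshold * peak]

    def sse(d):
        return sum((c - r) ** 2 for c, r in zip(corr, refs[d]))

    return min(candidates, key=sse)
```

The memory cost noted in the abstract shows up here as the `refs` dictionary: one stored template per candidate delay (and, in the full method, per multipath amplitude/phase combination).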

  16. CANDU in-reactor quantitative visual-based inspection techniques

    Science.gov (United States)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with restrictive access, which in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example, a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water of the primary heat transport system flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between an FC and other in-core horizontal components. The technique involves delivering a module with a high-radiation-resistant camera and lighting vertically into the core of a shut-down but fuelled reactor. The measurement is done using a line-of-sight technique between the components, with compensation for image perspective and viewing elevation. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area.
The FC calandria tube (the outer shell of the FC) is

  17. Assessment of Urban Ecosystem Health Based on Entropy Weight Extension Decision Model in Urban Agglomeration

    Directory of Open Access Journals (Sweden)

    Qian Yang

    2016-08-01

    Full Text Available Urban ecosystem health evaluation can assist in sustainable ecological management at a regional level. This study examined urban agglomeration ecosystem health in the middle reaches of the Yangtze River with entropy weight and extension theories. The model overcomes information omissions and subjectivity problems in the evaluation process of urban ecosystem health. Results showed that human capital and education, economic development level as well as urban infrastructure have a significant effect on the health states of urban agglomerations. The health status of the urban agglomeration’s ecosystem was not optimistic in 2013. The majority of the cities were unhealthy or verging on unhealthy, accounting for 64.52% of the total number of cities in the urban agglomeration. The regional differences of the 31 cities’ ecosystem health are significant. The cause originated from an imbalance in economic development and the policy guidance of city development. It is necessary to speed up the integration process to promote coordinated regional development. The present study will aid us in understanding and advancing the health situation of the urban ecosystem in the middle reaches of the Yangtze River and will provide an efficient urban ecosystem health evaluation method that can be used in other areas.
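The entropy weight step of such an evaluation model can be sketched in a few lines: criteria whose values vary more across cities carry more information and therefore receive larger weights. The toy matrix in the test is illustrative only:

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: rows = cities, columns = health criteria.

    A criterion with near-uniform values across cities has entropy close
    to 1 and gets weight close to 0; a highly varying criterion gets more.
    """
    m = len(matrix)
    raw = []
    for col in zip(*matrix):
        total = sum(col)
        probs = [v / total for v in col]
        entropy = -sum(p * math.log(p) for p in probs if p > 0) / math.log(m)
        raw.append(1.0 - entropy)          # divergence degree of the criterion
    s = sum(raw)
    return [w / s for w in raw]            # normalized weights, sum to 1
```

This is the objective half of the model; the extension-theory part then scores each city against graded health intervals using these weights.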

  18. A Dynamic XML-NS View Based Approach for the Extensible Integration of Web Data Sources

    Institute of Scientific and Technical Information of China (English)

    WU Wei; LU Zheng-ding; LI Rui-xuan; WANG Zhi-gang

    2004-01-01

    We propose a three-step technique to achieve this purpose. First, we utilize a collection of XML namespaces organized into a hierarchical structure as a medium for expressing data semantics. Second, we define the format of the resource descriptor for the information source discovery scheme so that we can dynamically register and/or deregister the Web data sources on the fly. Third, we employ an inverted-index mechanism to identify the subset of information sources that are relevant to a particular user query. We describe the design, architecture, and implementation of our approach--IWDS, and illustrate its use through case examples.
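The third step, an inverted-index mechanism for identifying relevant sources, might look like this minimal sketch (the descriptor format and the any-term matching rule are assumptions):

```python
from collections import defaultdict

def build_index(descriptors):
    """descriptors: {source_name: keyword terms from its resource descriptor}.
    Inverts the mapping: term -> set of sources mentioning it."""
    index = defaultdict(set)
    for source, terms in descriptors.items():
        for term in terms:
            index[term.lower()].add(source)
    return index

def relevant_sources(index, query_terms):
    """Return the subset of sources matching at least one query term."""
    hits = set()
    for term in query_terms:
        hits |= index.get(term.lower(), set())
    return hits
```

Registering or deregistering a source on the fly amounts to adding or removing its terms from the index, which keeps query routing cheap as the source set changes.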

  19. Parameter tuning of PVD process based on artificial intelligence technique

    Science.gov (United States)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Due to its previous adaptation to similar optimization problems, a genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from the GA's optimization result is expected to produce the desired zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations. The results from the computational experiment were then compared with the actual results from the laboratory experiment. Based on the comparison, GA proved reliable for optimizing the parameter combination before the parameter tuning is done on the RF magnetron sputtering machine. In order to verify the result of GA, the algorithm was also compared to other well-known optimization algorithms, namely particle swarm optimization (PSO) and gravitational search algorithm (GSA). The results showed that GA was reliable in solving this RF magnetron sputtering process parameter tuning problem, and showed better accuracy in the optimization based on the fitness evaluation.
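A minimal real-coded GA of the kind used for such parameter tuning might look like the sketch below; the operators and the toy fitness in the test are assumptions, since the paper's fitness model for film quality is not reproduced here:

```python
import random

def tune(fitness, bounds, pop=20, gens=40, seed=1):
    """Minimal real-coded GA: elitism, blend crossover, gaussian mutation.

    bounds: one (lo, hi) pair per parameter (e.g. RF power, deposition
    time, substrate temperature). fitness: higher is better.
    """
    rng = random.Random(seed)
    P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(P, key=fitness, reverse=True)
        nxt = scored[:2]                                 # keep the two elites
        while len(nxt) < pop:
            a, b = rng.sample(scored[:pop // 2], 2)      # mate within better half
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            i = rng.randrange(len(bounds))               # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1 * (hi - lo))))
            nxt.append(child)
        P = nxt
    return max(P, key=fitness)
```

In practice the fitness would score predicted film properties against the desired ZnO specification; here any callable over the parameter vector works.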

  20. Damage detection technique by measuring laser-based mechanical impedance

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyeonseok; Sohn, Hoon [Department of Civil and Environmental Engineering, Korea Advanced Institute of Science and Technology (Daehak-ro 291, Yuseong-gu, Daejeon 305-701) (Korea, Republic of)

    2014-02-18

    This study proposes a method for the measurement of mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small-sized structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, which show deteriorated performance induced by the effects of a) Curie temperature limitations, b) electromagnetic interference, and c) bonding layers. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by shooting a pulse laser beam at the target structure, and is acquired by measuring the out-of-plane velocity using a laser vibrometer. The formation of the LMI response is observed through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen in a high temperature environment.

  1. Electron tomography based on a total variation minimization reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Goris, B., E-mail: bart.goris@ua.ac.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van den Broek, W. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Batenburg, K.J. [Centrum Wiskunde and Informatica, Science Park 123, NL-1098XG Amsterdam (Netherlands); Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Heidari Mezerji, H.; Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2012-02-15

    The 3D reconstruction of a tilt series for electron tomography is mostly carried out using the weighted backprojection (WBP) algorithm or using one of the iterative algorithms such as the simultaneous iterative reconstruction technique (SIRT). However, it is known that these reconstruction algorithms cannot compensate for the missing wedge. Here, we apply a new reconstruction algorithm for electron tomography, which is based on compressive sensing. This is a field in image processing specialized in finding a sparse solution or a solution with a sparse gradient to a set of ill-posed linear equations. Therefore, it can be applied to electron tomography, where the reconstructed objects often have a sparse gradient at the nanoscale. Using a combination of different simulated and experimental datasets, it is shown that missing wedge artefacts are reduced in the final reconstruction. Moreover, it seems that the reconstructed datasets have a higher fidelity and are easier to segment in comparison to reconstructions obtained by more conventional iterative algorithms. Highlights: (1) A reconstruction algorithm for electron tomography based on total variation minimization is investigated. (2) Missing wedge artefacts are reduced by this algorithm. (3) The reconstruction is easier to segment. (4) More reliable quantitative information can be obtained.
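A 1-D analogue of total variation minimization illustrates why a sparse-gradient prior flattens noise while preserving edges; the smoothed absolute value and plain gradient descent below are simplifying assumptions (real electron tomography solvers minimize the same kind of objective through the full projection operator):

```python
import math

def tv_denoise(y, lam=0.2, step=0.02, iters=5000, eps=1e-3):
    """Gradient descent on 0.5*||x - y||^2 + lam * sum_i |x[i+1] - x[i]|,
    with |.| smoothed as sqrt(d^2 + eps) so it is differentiable."""
    x = list(y)
    for _ in range(iters):
        g = [xi - yi for xi, yi in zip(x, y)]      # data-fidelity gradient
        for i in range(len(x) - 1):
            d = x[i + 1] - x[i]
            s = lam * d / math.sqrt(d * d + eps)   # gradient of smoothed |d|
            g[i] -= s
            g[i + 1] += s
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x
```

On a noisy step signal, the small oscillations within each flat region are suppressed while the large jump (the "edge") survives — the behaviour that makes TV reconstructions easier to segment.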

  2. The effects of processing techniques on magnesium-based composite

    Science.gov (United States)

    Rodzi, Siti Nur Hazwani Mohamad; Zuhailawati, Hussain

    2016-12-01

    The aim of this study is to investigate the effect of processing techniques on the densification, hardness and compressive strength of an Mg alloy and an Mg-based composite for biomaterial application. The control sample (pure Mg) and the Mg-based composite (Mg-Zn/HAp) were fabricated through a mechanical alloying process using a high energy planetary mill, whilst another Mg-Zn/HAp composite was fabricated through double step processing (the matrix Mg-Zn alloy was fabricated by planetary mill, and subsequently HAp was dispersed by roll mill). The as-milled powder was then consolidated by cold pressing into 10 mm diameter pellets under 400 MPa compaction pressure before being sintered at 300 °C for 1 hour under a flow of argon. The densification of the sintered pellets was then determined by the Archimedes principle. Mechanical properties of the sintered pellets were characterized by microhardness and compression tests. The results show that the density of the pellets was significantly increased by the addition of HAp, but the most optimum density was observed when the sample was fabricated through double step processing (1.8046 g/cm3). Slight increments in hardness and ultimate compressive strength were observed for the Mg-Zn/HAp composite fabricated through double step processing (58.09 HV, 132.19 MPa), as compared to Mg-Zn/HAp produced through single step processing (47.18 HV, 122.49 MPa).
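The Archimedes density measurement mentioned above reduces to one formula: the sample's mass in air divided by the volume inferred from buoyancy. The fluid density value below is an assumed measurement condition:

```python
def archimedes_density(mass_air_g, mass_submerged_g, rho_fluid=0.9982):
    """Density via Archimedes' principle: buoyancy gives the volume.

    rho_fluid is the immersion fluid density in g/cm^3 (0.9982 for
    water near 20 C -- an assumed measurement condition).
    """
    volume = (mass_air_g - mass_submerged_g) / rho_fluid  # displaced fluid, cm^3
    return mass_air_g / volume                            # g/cm^3
```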

  3. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    Science.gov (United States)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course away from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation). The revised course is a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  4. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI’s ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than a traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  5. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than a traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  6. Ground-based intercomparison of two isoprene measurement techniques

    Directory of Open Access Journals (Sweden)

    E. Leibrock

    2003-01-01

    Full Text Available An informal intercomparison of two isoprene (C5H8 measurement techniques was carried out during Fall of 1998 at a field site located approximately 3 km west of Boulder, Colorado, USA. A new chemical ionization mass spectrometric technique (CIMS was compared to a well-established gas chromatographic technique (GC. The CIMS technique utilized benzene cation chemistry to ionize isoprene. The isoprene levels measured by the CIMS were often larger than those obtained with the GC. The results indicate that the CIMS technique suffered from an anthropogenic interference associated with air masses from the Denver, CO metropolitan area as well as an additional interference occurring in clean conditions. However, the CIMS technique is also demonstrated to be sensitive and fast. Especially after introduction of a tandem mass spectrometric technique, it is therefore a candidate for isoprene measurements in remote environments near isoprene sources.

  7. Whole Genome Sequencing Based Characterization of Extensively Drug-Resistant Mycobacterium tuberculosis Isolates from Pakistan

    KAUST Repository

    Ali, Asho

    2015-02-26

    Improved molecular diagnostic methods for the detection of drug resistance in Mycobacterium tuberculosis (MTB) strains are required. Resistance to first- and second-line anti-tuberculous drugs has been associated with single nucleotide polymorphisms (SNPs) in particular genes. However, these SNPs can vary between MTB lineages; therefore, local data are required to describe different strain populations. We used whole genome sequencing (WGS) to characterize 37 extensively drug-resistant (XDR) MTB isolates from Pakistan and investigated 40 genes associated with drug resistance. Rifampicin resistance was attributable to SNPs in the rpoB hot-spot region. Isoniazid resistance was most commonly associated with the katG codon 315 (92%) mutation, followed by inhA S94A (8%); however, one strain did not have SNPs in katG, inhA or oxyR-ahpC. All strains were pyrazinamide resistant but only 43% had pncA SNPs. Ethambutol resistant strains predominantly had embB codon 306 (62%) mutations, but additional SNPs at embB codons 406, 378 and 328 were also present. Fluoroquinolone resistance was associated with gyrA codons 91-94 in 81% of strains; four strains had only gyrB mutations, while others did not have SNPs in either gyrA or gyrB. Streptomycin resistant strains had mutations in ribosomal RNA genes: rpsL codon 43 (42%), the rrs 500 region (16%), and gidB (34%), while six strains did not have mutations in any of these genes. Amikacin/kanamycin/capreomycin resistance was associated with SNPs in rrs at nt1401 (78%) and nt1484 (3%), except in seven (19%) strains. We estimate that if only the common hot-spot region targets of current commercial assays were used, the concordance between phenotypic and genotypic testing for these XDR strains would vary between rifampicin (100%), isoniazid (92%), fluoroquinolones (81%), aminoglycosides (78%) and ethambutol (62%); while pncA sequencing would provide genotypic resistance in less than half the isolates. This work highlights the importance of expanded
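Genotype-to-phenotype concordance checks of the kind estimated above amount to looking up observed SNPs in a resistance catalog; the subset below is illustrative, built only from loci named in the abstract:

```python
# Illustrative catalog: (gene, site) markers per drug, from the text above.
RESISTANCE_CATALOG = {
    "rifampicin": [("rpoB", "hot-spot")],
    "isoniazid": [("katG", 315), ("inhA", "S94A")],
    "ethambutol": [("embB", 306), ("embB", 406), ("embB", 378), ("embB", 328)],
    "fluoroquinolones": [("gyrA", "91-94")],
}

def genotypic_call(drug, observed_mutations):
    """observed_mutations: set of (gene, site) pairs found by WGS.
    Returns True if any catalogued marker for the drug is present."""
    return any(marker in observed_mutations
               for marker in RESISTANCE_CATALOG.get(drug, []))
```

The abstract's point is exactly the failure mode of such a lookup: a strain can be phenotypically resistant while carrying no catalogued marker, so a narrow hot-spot catalog under-calls resistance.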

  8. Image-based Virtual Exhibit and Its Extension to 3D

    Institute of Scientific and Technical Information of China (English)

    Ming-Min Zhang; Zhi-Geng Pan; Li-Feng Ren; Peng Wang

    2007-01-01

    In this paper we introduce an image-based virtual exhibition system, especially for clothing products. It provides a powerful material substitution function, which is very useful for clothing customization. A novel color substitution algorithm and two texture morphing methods are designed to ensure realistic substitution results. To extend the system to 3D, we need to do model reconstruction based on photos. Thus we present an improved method for modeling the human body. It deforms a generic model with shape details extracted from pictures to generate a new model. Our method begins with model image generation, followed by silhouette extraction and segmentation. Then it builds a mapping between pixels inside every pair of silhouette segments in the model image and in the picture. Our mapping algorithm is based on a slice space representation that conforms to the natural features of the human body.

  9. Extensions to Regret-based Decision Curve Analysis: An application to hospice referral for terminal patients

    Directory of Open Access Journals (Sweden)

    Tsalatsanis Athanasios

    2011-12-01

    Full Text Available Abstract Background Despite the well-documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or belatedly. Decision systems to improve the hospice referral process are sorely needed. Methods We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret-based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. In order to enable patients and physicians to make these complex decisions in real time, we developed an easily accessible web-based decision support system available at the point of care. Results The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of the threshold probability at which he/she is indifferent between continuation of treatment and hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. Conclusions We present a theoretical
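The threshold probability at which a patient is indifferent follows the standard decision curve analysis relation p_t/(1 - p_t) = harm/benefit. A minimal sketch (the harm and benefit values in the test are illustrative, not clinically calibrated):

```python
def threshold_probability(harm_of_referral, benefit_of_referral):
    """Standard DCA relation: p_t / (1 - p_t) = harm / benefit."""
    odds = harm_of_referral / benefit_of_referral
    return odds / (1.0 + odds)

def recommend(p_death, p_t):
    """Compare the model's predicted death probability with the threshold."""
    return "refer to hospice" if p_death >= p_t else "continue treatment"
```

In the three-step workflow above, the interview fixes `p_t` from the patient's elicited preferences, and the SUPPORT model supplies `p_death`.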

  10. Security extension for the Canetti-Krawczyk model in identity-based systems

    Institute of Scientific and Technical Information of China (English)

    LI Xinghua; MA Jianfeng; SangJae Moon

    2005-01-01

    The Canetti-Krawczyk (CK) model is a formalism for the analysis of key-exchange protocols, which can guarantee many security properties for the protocols proved secure in this model. But we find this model lacks the ability to guarantee key generation center (KGC) forward secrecy, which is an important security property for identity-based key-agreement protocols. The essential reason for this weakness is that the model does not fully consider the attacker's capabilities. In this paper, the CK model is accordingly extended with a new additional attacker capability, KGC corruption in identity-based systems, which enables it to support KGC forward secrecy.

  11. Extensive simulation studies on the reconstructed image resolution of a position sensitive detector based on pixelated CdTe crystals

    CERN Document Server

    Zachariadou, K; Kaissas, I; Seferlis, S; Lambropoulos, C; Loukas, D; Potiriadis, C

    2011-01-01

    We present results on the reconstructed image resolution of a position sensitive radiation instrument (COCAE) based on extensive simulation studies. The reconstructed image resolution has been investigated in a wide range of incident photon energies emitted by point-like sources located at different source-to-detector distances on and off the detector's symmetry axis. The ability of the detector to distinguish multiple radioactive sources observed simultaneously is investigated by simulating point-like sources of different energies located on and off the detector's symmetry axis and at different positions.

  12. Signal Processing Techniques for Silicon Drift Detector Based X-Ray Spectrometer for Planetary Instruments

    Science.gov (United States)

    Patel, A.; Shanmugam, M.; Ladiya, T.

    2016-10-01

    We are developing an SDD-based X-ray spectrometer using various pulse-height analysis techniques. This study will help to identify the proper processing technique based on instrument specifications, which can be used for future scientific missions.

  13. Water-based oligochitosan and nanowhisker chitosan as potential food preservatives for shelf-life extension of minced pork.

    Science.gov (United States)

    Chantarasataporn, Patomporn; Tepkasikul, Preenapha; Kingcha, Yutthana; Yoksan, Rangrong; Pichyangkura, Rath; Visessanguan, Wonnop; Chirachanchai, Suwabun

    2014-09-15

    Water-based chitosans in the forms of oligochitosan (OligoCS) and nanowhisker chitosan (CSWK) are proposed as a novel food preservative based on a minced pork model study. The high surface area with a positive charge over the neutral pH range (pH 5-8) of OligoCS and CSWK leads to inhibition of Gram-positive (Staphylococcus aureus, Listeria monocytogenes, and Bacillus cereus) and Gram-negative microbes (Salmonella enteritidis and Escherichia coli O157:H7). In the minced pork model, OligoCS performs effectively as a food preservative for shelf-life extension, as clarified by the retardation of microbial growth, biogenic amine formation and lipid oxidation during storage. OligoCS largely preserves the myosin heavy chain protein, as observed by electrophoresis. The present work points out that water-based chitosan with its unique morphology not only shows significant antimicrobial activity but also maintains the meat quality with an extension of shelf-life, and thus has the potential to be used as a food preservative.

  14. Depth-based coding of MVD data for 3D video extension of H.264/AVC

    Science.gov (United States)

    Rusanovskyy, Dmytro; Hannuksela, Miska M.; Su, Wenyi

    2013-06-01

    This paper describes a novel approach of using depth information for advanced coding of associated video data in Multiview Video plus Depth (MVD)-based 3D video systems. As a possible implementation of this concept, we describe two coding tools that have been developed for an H.264/AVC-based 3D video codec in response to the Moving Picture Experts Group (MPEG) Call for Proposals (CfP). These tools are Depth-based Motion Vector Prediction (DMVP) and Backward View Synthesis Prediction (BVSP). Simulation results conducted under the JCT-3V/MPEG 3DV Common Test Conditions show that the tools proposed in this paper reduce the bit rate of coded video data by 15% average delta bit rate reduction, which results in 13% total bit rate savings for the MVD data over the state-of-the-art MVC+D coding. Moreover, the concept of depth-based video coding presented in this paper has been further developed by MPEG 3DV and JCT-3V, and this work resulted in even higher compression efficiency, bringing about 20% total delta bit rate reduction for coded MVD data over the reference MVC+D coding. Considering these significant gains, the proposed coding approach can be beneficial for the development of new 3D video coding standards.

  15. Extension of information entropy-based measures in incomplete information systems

    Institute of Scientific and Technical Information of China (English)

    LI Ren-pu; HUANG Dao; GAO Mao-ting

    2005-01-01

    Studying the concepts and operations of rough set theory from its information view helps people understand the essence of the theory. In this paper we address knowledge expression and knowledge reduction in incomplete information systems from the information view of rough set theory. First, by extending information entropy-based measures in complete information systems, two new measures of incomplete entropy and incomplete conditional entropy are presented for incomplete information systems. Then, based on these measures, the problem of knowledge reduction in incomplete information systems is analyzed, and reduct definitions for incomplete information systems and incomplete decision tables are proposed respectively. Finally, the reduct definitions based on incomplete entropy and the reduct definitions based on the similarity relation are compared: two equivalence relationships between them are proved by theorems, and an inequivalence relationship is illustrated by an example. The work of this paper extends the research of rough set theory from the information view to incomplete information systems and establishes the theoretical basis for seeking efficient algorithms of knowledge acquisition in incomplete information systems.
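
    The tolerance-relation machinery behind such measures can be sketched in a few lines. The toy system, the `*` wildcard convention, and the particular class-size entropy below are illustrative assumptions only; the paper's own incomplete entropy and conditional entropy may be defined differently.

    ```python
    from math import log2

    # Toy incomplete information system: '*' marks a missing attribute value.
    U = {
        "x1": ("a", "1"),
        "x2": ("a", "*"),
        "x3": ("b", "2"),
        "x4": ("*", "2"),
    }

    def tolerant(u, v):
        """Tolerance relation: agree on every attribute, treating '*' as a wildcard."""
        return all(a == b or a == "*" or b == "*" for a, b in zip(u, v))

    def tolerance_class(x):
        """Set of objects indiscernible from x under the tolerance relation."""
        return {y for y in U if tolerant(U[x], U[y])}

    def incomplete_entropy():
        """A class-size-based entropy (hypothetical form; the paper's measure may differ)."""
        n = len(U)
        return -sum(log2(len(tolerance_class(x)) / n) for x in U) / n
    ```

    Because `x2` and `x4` contain wildcards, their tolerance classes are larger than an equivalence class would be, which is exactly what the incomplete-entropy measures must account for.
    
    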

  16. 78 FR 7654 - Extension of Exemptions for Security-Based Swaps

    Science.gov (United States)

    2013-02-04

    ... participants may be uncertain as to how to comply with the registration requirements of the Securities Act... VII effective date. \\15\\ A security-based swap execution facility is a trading system or platform in... and offers made by multiple participants in the facility or system, through any means of...

  17. An extensible agent architecture for a competitive market-based allocation of consumer attention space

    NARCIS (Netherlands)

    Hoen, P.J. 't; Bohte, S.M.; Gerding, E.H.; La Poutré, J.A.

    2002-01-01

    A competitive distributed recommendation mechanism is introduced based on adaptive software agents for efficiently allocating the ``customer attention space'', or banners. In the example of an electronic shopping mall, the task of correctly profiling and analyzing the customers is delegated to the

  18. Truth-value semantics and functional extensions for classical logic of partial terms based on equality

    CERN Document Server

    Parlamento, Franco

    2011-01-01

    We develop a bottom-up approach to truth-value semantics for classical logic of partial terms based on equality and apply it to prove the conservativity of the addition of partial description and partial selection functions, independently of any strictness assumption.

  19. Orientation of student entrepreneurial practices based on administrative techniques

    Directory of Open Access Journals (Sweden)

    Héctor Horacio Murcia Cabra

    2005-07-01

    Full Text Available As part of the second phase of the research project «Application of a creativity model to update the teaching of the administration in Colombian agricultural entrepreneurial systems» it was decided to reinforce the planning and execution skills of students of the Agricultural Business Administration Faculty of La Salle University. Those finishing their studies were given special attention. The plan of action was initiated in the second semester of 2003. It was initially defined as a model of entrepreneurial strengthening based on a coherent methodology that included the most recent administration and management techniques. Later, the applicability of this model was tested in some organizations of the agricultural sector that had asked for support in their planning processes. Through an action-research process the methodology was redefined in order to arrive at a final model that could be used by faculty students and graduates. The results obtained were applied to the teaching of the Entrepreneurial Laboratory of ninth-semester students with the hope of improving administrative support to agricultural enterprises. Following this procedure, more than 100 students and 200 agricultural producers applied it between June 2003 and July 2005. The methodology used and the results obtained are presented in this article.

  20. Image content authentication technique based on Laplacian Pyramid

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper proposes a technique of image content authentication based on the Laplacian Pyramid to verify the authenticity of image content. First, the image is decomposed into a Laplacian Pyramid before the transformation. Next, the smooth and detail properties of the original image are analyzed according to the Laplacian Pyramid, and the properties are classified and encoded to get the corresponding characteristic values. Then, the signature derived from the encrypted characteristic values is embedded in the original image as a watermark. After reception, the characteristic values of the received image are compared with the watermark extracted from the image. The algorithm automatically identifies whether the content has been tampered with by means of morphologic filtration. The information on the tampered location is presented at the same time. Experimental results show that the proposed authentication algorithm can effectively detect the event and location when the original image content is tampered with. Moreover, it can tolerate some distortions produced by compression, filtration and noise degradation.
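
    The pyramid decomposition that the scheme builds on can be sketched as follows. The 2x2 mean-pool and nearest-neighbour expansion below stand in for the usual Gaussian blur/decimation pair, and the paper's characteristic-value encoding and watermark embedding are not reproduced.

    ```python
    import numpy as np

    def down(img):
        """2x2 mean-pool (a stand-in for Gaussian blur + decimation)."""
        h, w = img.shape
        return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    def up(img, shape):
        """Nearest-neighbour expansion back to `shape`."""
        out = img.repeat(2, axis=0).repeat(2, axis=1)
        return out[:shape[0], :shape[1]]

    def laplacian_pyramid(img, levels):
        """Detail (band-pass) layers plus one coarsest smooth layer."""
        pyr, cur = [], img.astype(float)
        for _ in range(levels):
            nxt = down(cur)
            pyr.append(cur - up(nxt, cur.shape))  # detail residual at this scale
            cur = nxt
        pyr.append(cur)                           # coarsest smooth layer
        return pyr

    def reconstruct(pyr):
        """Invert the decomposition by adding residuals back, coarse to fine."""
        cur = pyr[-1]
        for lap in reversed(pyr[:-1]):
            cur = lap + up(cur, lap.shape)
        return cur

    img = np.arange(64, dtype=float).reshape(8, 8)
    pyr = laplacian_pyramid(img, 2)
    ```

    Because each residual stores exactly what the downsampling discarded, the reconstruction is exact, which is what makes per-level "smooth" and "detail" properties a stable basis for characteristic values.
    
    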

  1. Research on technique of wavefront retrieval based on Foucault test

    Science.gov (United States)

    Yuan, Lvjun; Wu, Zhonghua

    2010-05-01

    During fine grinding of the best-fit sphere and the initial stage of polishing, the surface error of large-aperture aspheric mirrors is too big to test with a common interferometer. The Foucault test is widely used in fabricating large-aperture mirrors. However, the optical path is seriously disturbed by air turbulence, and changes of light and dark zones cannot be identified, which often lowers people's judging ability and results in mistakes when diagnosing the surface error of the whole mirror. To solve the problem, this research presents wavefront retrieval based on the Foucault test through digital image processing and quantitative calculation. Firstly, a real Foucault image can be gained by collecting a variety of images by CCD and then averaging these images to eliminate air turbulence. Secondly, gray values are converted into surface error values through principle derivation, mathematical modeling, and software programming. Thirdly, the linear deviation brought by defocus is removed by the least-squares method to get the real surface error. At last, according to the real surface error, the wavefront map, gray contour map and corresponding pseudo-color contour map are plotted. The experimental results indicate that the three-dimensional wavefront map and two-dimensional contour map are able to accurately and intuitively show the surface error of the whole mirror under test, and they are beneficial for grasping the surface error as a whole. The technique can be used to guide the fabrication of large-aperture and long-focal-length mirrors during grinding and the initial stage of polishing the aspheric surface, which greatly improves fabricating efficiency and precision.
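
    The least-squares removal of the linear defocus term can be illustrated on a synthetic 1-D radial profile (the paper operates on full 2-D maps; the profile and coefficients below are made-up placeholders).

    ```python
    import numpy as np

    def remove_linear_term(radius, profile):
        """Fit a straight line to the measured profile by least squares and
        subtract it, leaving the residual figure error."""
        coeffs = np.polyfit(radius, profile, 1)
        return profile - np.polyval(coeffs, radius)

    r = np.linspace(0.0, 1.0, 50)
    true_error = 0.05 * np.sin(6 * r)        # hypothetical residual figure error
    measured = true_error + 0.8 * r + 0.1    # plus a linear defocus/offset term
    residual = remove_linear_term(r, measured)
    ```

    By construction the residual is orthogonal to the fitted line, so refitting a line to it yields (numerically) zero slope and intercept, confirming that the defocus component has been removed.
    
    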

  2. A formal model for integrity protection based on DTE technique

    Institute of Scientific and Technical Information of China (English)

    JI Qingguang; QING Sihan; HE Yeping

    2006-01-01

    In order to provide integrity protection for a secure operating system that satisfies the structured protection class requirements, a DTE-technique-based integrity protection formalization model is proposed after the implications and structures of the integrity policy have been analyzed in detail. This model consists of some basic rules for configuring DTE and a state transition model, which are used respectively to instruct how the domains and types are set, and how security invariants obtained from the initial configuration are maintained in the process of system transition. In this model, ten invariants are introduced; in particular, some new invariants dealing with information flow are proposed, and their relations with corresponding invariants described in the literature are also discussed. Thirteen transition rules with well-formed atomicity are presented in a well-operational manner. The basic security theorems corresponding to these invariants and transition rules are proved. The rationale for proposing the invariants is further annotated by analyzing the differences between this model and the ones described in the literature. Finally, future work is outlined; in particular, it is pointed out that it is possible to use this model to analyze SE-Linux security.

  3. Biofunctionalization of Si nanowires using a solution based technique

    Science.gov (United States)

    Williams, Elissa H.; Davydov, Albert V.; Oleshko, Vladimir P.; Lin, Nancy J.; Steffens, Kristen L.; Manocchi, Amy K.; Krylyuk, Sergiy; Rao, Mulpuri V.; Schreifels, John A.

    2012-10-01

    Here we present a solution based functionalization technique for streptavidin (SA) protein conjugation to silicon nanowires (Si NWs). Si NWs, with a diameter of 110 nm to 130 nm and a length of 5 μm to 10 μm, were functionalized with 3-aminopropyltriethoxysilane (APTES) followed by biotin for the selective attachment of SA. High-resolution transmission electron microscopy (HRTEM) and atomic force microscopy (AFM) showed that the Si NWs were conformally coated with 20 nm to 30 nm thick APTES, biotin, and SA layers upon functionalization. Successful attachment of each bio/organic layer was confirmed by X-ray photoelectron spectroscopy (XPS) and fluorescence microscopy. Fluorescence microscopy also demonstrated that there was an undesirable non-specific binding of the SA protein as well as a control protein, bovine serum albumin (BSA), to the APTES-coated Si NWs. However, inhibition of BSA binding and enhancement of SA binding were achieved following the biotinylation step. The biofunctionalized Si NWs show potential as label-free biosensing platforms for the specific and selective detection of biomolecules.

  4. Channel Based Adaptive Rate Control Technique for MANET

    Directory of Open Access Journals (Sweden)

    R. Bharathiraja

    2014-04-01

    Full Text Available In Mobile Ad hoc Networks (MANET), most existing works do not consider energy efficiency when selecting the appropriate route. In MANET, selecting the appropriate route while maintaining energy efficiency is very important. In order to overcome these issues, in this study we propose a Channel Based Adaptive Rate Control technique for MANET. Here the most appropriate links are selected so that nodes transmit with efficient power consumption. A node broadcasts the information of its outgoing and incoming links in NSET instead of waiting for feedback information from the receiver. The number of packets transmitted in a channel access time is maximized by implementing the benefit ratio in the rate selection algorithm. This study also introduces node cooperation: a node determines the feasibility of the new rate setting produced by the rate selection algorithm and carries out the new setting if it is feasible, following the help, ack, reject and accept method. By simulation results we show that the proposed approach is power efficient and also increases the transmission rate.

  5. Formal Verification Techniques Based on Boolean Satisfiability Problem

    Institute of Scientific and Technical Information of China (English)

    Xiao-Wei Li; Guang-Hui Li; Ming Shao

    2005-01-01

    This paper exploits the Boolean satisfiability problem in equivalence checking and model checking respectively. A combinational equivalence checking method based on incremental satisfiability is presented. This method chooses the candidate equivalent pairs with some new techniques, and uses an incremental satisfiability algorithm to improve its performance. By substituting the internal equivalent pairs and converting the equivalence relations into conjunctive normal form (CNF) formulas, this approach can avoid false negatives and reduce the search space of the SAT procedure. Experimental results on ISCAS'85 benchmark circuits show that the presented approach is faster and more robust than existing approaches in the literature. This paper also presents an algorithm for extraction of an unsatisfiable core, which has an important application in abstraction and refinement for model checking to alleviate the state-space explosion bottleneck. The error of approximate extraction is analyzed by means of simulation. The analysis reveals an interesting phenomenon: as the density of the formula increases, the average error of the extraction decreases. An exact extraction approach for MU subformulas, referred to as the pre-assignment algorithm, is proposed. Both theoretical analysis and experimental results show that it is more efficient.
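
    The core idea of combinational equivalence checking, building a miter that has a satisfying assignment exactly when the two circuits differ, can be sketched without a SAT solver by exhaustive search. The two toy circuits below are illustrative; the paper's incremental-SAT and CNF machinery is not reproduced.

    ```python
    from itertools import product

    # Two gate-level descriptions of (a AND b) OR c, the second De Morgan-rewritten.
    def circuit_a(a, b, c):
        return (a and b) or c

    def circuit_b(a, b, c):
        return not ((not (a and b)) and (not c))

    def miter_counterexample(f, g, n_inputs):
        """Exhaustively search the miter f XOR g for a distinguishing input.
        Returns the input tuple if found, None if the circuits are equivalent.
        (A SAT solver explores this same space implicitly on the CNF miter.)"""
        for bits in product([False, True], repeat=n_inputs):
            if f(*bits) != g(*bits):
                return bits
        return None

    cex = miter_counterexample(circuit_a, circuit_b, 3)
    ```

    The exhaustive loop is exponential in the number of inputs, which is precisely why the paper's candidate-pair substitution and incremental SAT techniques matter on real circuits.
    
    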

  6. WORMHOLE ATTACK MITIGATION IN MANET: A CLUSTER BASED AVOIDANCE TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Subhashis Banerjee

    2014-01-01

    Full Text Available A Mobile Ad-Hoc Network (MANET) is a self-configuring, infrastructure-less network of mobile devices connected by wireless links. Loopholes such as the wireless medium, lack of a fixed infrastructure, dynamic topology, rapid deployment practices, and the hostile environments in which they may be deployed make MANETs vulnerable to a wide range of security attacks, and the wormhole attack is one of them. During this attack a malicious node captures packets from one location in the network and tunnels them to another colluding malicious node at a distant point, which replays them locally. This paper presents a cluster-based wormhole attack avoidance technique. The concept of hierarchical clustering with a novel hierarchical 32-bit node addressing scheme is used for avoiding the attacking path during the route discovery phase of the DSR protocol, which is considered as the underlying routing protocol. The method also pinpoints the location of the wormhole nodes in the case of an exposed attack.
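
    One way such a hierarchical 32-bit addressing scheme could work is sketched below. The 4-level, 8-bits-per-level layout and the prefix-based ancestor test are hypothetical choices for illustration, not necessarily the paper's actual scheme.

    ```python
    def make_address(*levels):
        """Pack up to four cluster-hierarchy levels (each 1..255) into one
        32-bit address, most significant level first; unused levels stay 0."""
        assert len(levels) <= 4 and all(0 < lv < 256 for lv in levels)
        addr = 0
        for i, lv in enumerate(levels):
            addr |= lv << (8 * (3 - i))
        return addr

    def is_ancestor(parent, child):
        """True when every non-zero level of `parent` matches `child`,
        i.e. `child` lies in `parent`'s cluster subtree."""
        for shift in (24, 16, 8, 0):
            p = (parent >> shift) & 0xFF
            if p == 0:
                return True
            if p != (child >> shift) & 0xFF:
                return False
        return True
    ```

    With such an encoding, a route request whose next hop is not inside an expected cluster subtree can be flagged cheaply with bit operations during route discovery.
    
    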

  7. Structuring Task-based Interaction through Collaborative Learning Techniques (2)

    Institute of Scientific and Technical Information of China (English)

    William Littlewood

    2004-01-01

    Techniques for collaborative learning In this section the focus will move from broad strategies to specific techniques (often also called "structures") through which the strategies can be realized. It gives a selection of techniques which have proved (in my own experience as well as that of others) particularly useful in providing contexts for practice, exploration and/or interaction in the second language classroom.

  8. Extension VIKOR for Priority Orders Based on Three Parameters Interval Fuzzy Number

    Directory of Open Access Journals (Sweden)

    Qian Zhang

    2013-05-01

    Full Text Available In this study, an improved VIKOR method is presented to deal with multi-attribute decision-making based on three-parameter interval fuzzy numbers. The attribute weights were unknown, but an alternative priority of object preference was given. A new non-linear rewards-and-punishment method on a positive interval was proposed to normalize the attributes; two methods, information-coverage reliability and relative superiority degree, were used to compare and sort the Three Parameters Interval Fuzzy Numbers (TPIFN); and a quadratic program based on contribution was constructed to obtain the attribute weights. The information entropy distance between TPIFNs was then defined, and the optimum object ordering was obtained by VIKOR. A numerical example is provided to demonstrate the feasibility and validity of the approach.
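
    The classic crisp VIKOR procedure that the paper extends can be sketched as follows; crisp numbers replace the TPIFNs, and the decision matrix and weights are made up for illustration.

    ```python
    import numpy as np

    def vikor(X, weights, v=0.5):
        """Classic crisp VIKOR ranking (the paper extends this to
        three-parameter interval fuzzy numbers). Rows are alternatives,
        columns are benefit criteria; lower Q is better."""
        f_best = X.max(axis=0)
        f_worst = X.min(axis=0)
        norm = weights * (f_best - X) / (f_best - f_worst)
        S = norm.sum(axis=1)   # group utility
        R = norm.max(axis=1)   # individual regret
        Q = (v * (S - S.min()) / (S.max() - S.min())
             + (1 - v) * (R - R.min()) / (R.max() - R.min()))
        return S, R, Q

    X = np.array([[7.0, 8.0, 6.0],
                  [8.0, 7.0, 9.0],
                  [6.0, 9.0, 7.0]])
    w = np.array([0.4, 0.3, 0.3])
    S, R, Q = vikor(X, w)
    best = int(np.argmin(Q))
    ```

    The parameter `v` weights group utility against individual regret; the interval-fuzzy extension replaces the crisp distances in `norm` with the entropy distance between TPIFNs.
    
    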

  9. Extension of a Kolmogorov Atmospheric Turbulence Model for Time-Based Simulation Implementation

    Science.gov (United States)

    McMinn, John D.

    1997-01-01

    The development of any super/hypersonic aircraft requires the interaction of a wide variety of technical disciplines to maximize vehicle performance. For flight and engine control system design and development on this class of vehicle, realistic mathematical simulation models of atmospheric turbulence, including winds and the varying thermodynamic properties of the atmosphere, are needed. A model which has been tentatively selected by a government/industry group of flight and engine/inlet controls representatives working on the High Speed Civil Transport is one based on the Kolmogorov spectrum function. This report compares the Dryden and Kolmogorov turbulence forms, and describes enhancements that add functionality to the selected Kolmogorov model. These added features are: an altitude variation of the eddy dissipation rate based on Dryden data, the mapping of the eddy dissipation rate database onto a regular latitude and longitude grid, a method to account for flight at large vehicle attitude angles, and a procedure for transitioning smoothly across turbulence segments.
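
    The inertial-range form of the Kolmogorov spectrum underlying such models can be written down directly. The prefactor and dissipation rate below are placeholder values, and the report's altitude-varying eddy-dissipation database and transition logic are not reproduced.

    ```python
    import numpy as np

    def kolmogorov_spectrum(k, eps, alpha=1.5):
        """Inertial-range energy spectrum E(k) = alpha * eps^(2/3) * k^(-5/3).
        `eps` is the eddy dissipation rate; `alpha` is an order-one constant."""
        return alpha * eps ** (2.0 / 3.0) * k ** (-5.0 / 3.0)

    k = np.logspace(0, 3, 50)                 # wavenumbers
    E = kolmogorov_spectrum(k, eps=1e-4)
    # The signature of the model: a -5/3 slope on a log-log plot.
    slope = np.polyfit(np.log(k), np.log(E), 1)[0]
    ```

    In a time-based simulation, this spectrum is sampled and shaped into gust time histories, with `eps` varied by altitude as the report describes.
    
    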

  10. Weighted graph based ordering techniques for preconditioned conjugate gradient methods

    Science.gov (United States)

    Clift, Simon S.; Tang, Wei-Pai

    1994-01-01

    We describe the basis of a matrix-ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDEs. Several new matrix-ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDEs, and compared with other matrix-ordering techniques. A variation of RCM is shown to generally improve the quality of incomplete factorization preconditioners.
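
    A minimal illustration of bandwidth-reducing ordering, here using SciPy's reverse Cuthill-McKee routine rather than the paper's weighted-graph variants; the test matrix is a made-up sparsity pattern.

    ```python
    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.csgraph import reverse_cuthill_mckee

    def bandwidth(A):
        """Maximum |i - j| over the nonzeros of a dense matrix."""
        rows, cols = np.nonzero(A)
        return int(np.abs(rows - cols).max())

    # A small symmetric pattern with far-off-diagonal couplings.
    A = np.eye(6)
    for i, j in [(0, 5), (1, 3), (2, 4)]:
        A[i, j] = A[j, i] = 1.0

    # RCM permutes rows/columns so connected nodes sit close together.
    perm = reverse_cuthill_mckee(csr_matrix(A), symmetric_mode=True)
    A_rcm = A[np.ix_(perm, perm)]
    ```

    A smaller bandwidth clusters fill-in near the diagonal, which is why such orderings tend to improve the quality of incomplete factorization preconditioners.
    
    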

  11. Cartilage repair: surgical techniques and tissue engineering using polysaccharide- and collagen-based biomaterials.

    Science.gov (United States)

    Galois, L; Freyria, A M; Grossin, L; Hubert, P; Mainard, D; Herbage, D; Stoltz, J F; Netter, P; Dellacherie, E; Payan, E

    2004-01-01

    Lesions of articular cartilage have a large variety of causes among which traumatic damage, osteoarthritis and osteochondritis dissecans are the most frequent. Replacement of articular defects in joints has assumed greater importance in recent years. This interest results in large part because cartilage defects cannot adequately heal themselves. Many techniques have been suggested over the last 30 years, but none allows the regeneration of the damaged cartilage, i.e. its replacement by a strictly identical tissue. In the first generation of techniques, relief of pain was the main concern, which could be provided by techniques in which cartilage was replaced by fibrocartilage. Disappointing results led investigators to focus on more appropriate bioregenerative approaches using transplantation of autologous cells into the lesion. Unfortunately, none of these approaches has provided a perfect final solution to the problem. The latest generation of techniques, currently in the developmental or preclinical stages, involve biomaterials for the repair of chondral or osteochondral lesions. Many of these scaffolds are designed to be seeded with chondrocytes or progenitor cells. Among natural and synthetic polymers, collagen- and polysaccharide-based biomaterials have been extensively used. For both these supports, studies have shown that chondrocytes maintain their phenotype when cultured in three dimensions. In both types of culture, a glycosaminoglycan-rich deposit is formed on the surface and in the inner region of the cultured cartilage, and type II collagen synthesis is also observed. Dynamic conditions can also improve the composition of such three-dimensional constructs. Many improvements are still required, however, in a number of key aspects that so far have received only scant attention. These aspects include: adhesion/integration of the graft with the adjacent native cartilage, cell-seeding with genetically-modified cell populations, biomaterials that can be

  12. Simplifying Hill-based muscle models through generalized extensible fuzzy heuristic implementation

    Science.gov (United States)

    O'Brien, Amy J.

    2006-04-01

    Traditional dynamic muscle models based on work initially published by A. V. Hill in 1938 often rely on high-order systems of differential equations. While such models are very accurate and effective, they do not typically lend themselves to modification by clinicians who are unfamiliar with biomedical engineering and advanced mathematics. However, it is possible to develop a fuzzy heuristic implementation of a Hill-based model, the Fuzzy Logic Implemented Hill-based (FLIHI) muscle model, which offers several advantages over conventional state-equation approaches. Because a fuzzy system is oriented by design to describe a model in linguistics rather than ordinary differential equation-based mathematics, the resulting fuzzy model can be more readily modified and extended by medical practitioners. It also stands to reason that a well-designed fuzzy inference system can be implemented with a degree of generalizability not often encountered in traditional state-space models. Taking electromyogram (EMG) as one input to muscle, FLIHI is tantamount to a fuzzy EMG-to-muscle-force estimator that captures dynamic muscle properties while providing robustness to partial or noisy data. One goal behind this approach is to encourage clinicians to rely on the model rather than assuming that muscle force as an output maps directly to smoothed EMG as an input. FLIHI's force estimate is more accurate than assuming force equal to smoothed EMG because FLIHI provides a transfer function that accounts for muscle's inherent nonlinearity. Furthermore, employing fuzzy logic should provide FLIHI with improved robustness over traditional mathematical approaches.
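
    A toy Mamdani-style EMG-to-force estimator conveys the flavor of such a fuzzy implementation. The membership functions and three rules below are invented for illustration and are not FLIHI's actual rule base.

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def emg_to_force(emg):
        """Three linguistic rules mapping normalized EMG (0..1) to normalized
        force, with max aggregation and centroid defuzzification."""
        f = np.linspace(0.0, 1.0, 101)     # output (force) universe
        # Rule firing strengths: EMG is low / medium / high.
        w_low = tri(emg, -0.5, 0.0, 0.5)
        w_med = tri(emg, 0.0, 0.5, 1.0)
        w_high = tri(emg, 0.5, 1.0, 1.5)
        # Clip each rule's consequent fuzzy set, aggregate by max.
        agg = np.maximum.reduce([
            np.minimum(w_low, tri(f, -0.4, 0.0, 0.4)),
            np.minimum(w_med, tri(f, 0.1, 0.5, 0.9)),
            np.minimum(w_high, tri(f, 0.6, 1.0, 1.4)),
        ])
        return float((f * agg).sum() / agg.sum())  # centroid defuzzification
    ```

    Each rule is a linguistic statement ("if EMG is medium then force is medium"), which is what makes the model editable by clinicians; a realistic estimator would add rules capturing muscle's nonlinear force-length and force-velocity behavior.
    
    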

  13. Adoption of farm-based irrigation water-saving techniques in the Guanzhong Plain, China

    NARCIS (Netherlands)

    Tang, Jianjun; Folmer, Henk; Xue, Jianhong

    2016-01-01

    This article analyses adoption of farm-based irrigation water saving techniques, based on a cross-sectional data set of 357 farmers in the Guanzhong Plain, China. Approximately 83% of the farmers use at least one farm-based water-saving technique. However, the traditional, inefficient techniques bor

  14. Application of Condition-Based Monitoring Techniques for Remote Monitoring of a Simulated Gas Centrifuge Enrichment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Hooper, David A [ORNL; Henkel, James J [ORNL; Whitaker, Michael [ORNL

    2012-01-01

    This paper presents research into the adaptation of monitoring techniques from maintainability and reliability (M&R) engineering for remote unattended monitoring of gas centrifuge enrichment plants (GCEPs) for international safeguards. Two categories of techniques are discussed: the sequential probability ratio test (SPRT) for diagnostic monitoring, and sequential Monte Carlo (SMC or, more commonly, particle filtering) for prognostic monitoring. Development and testing of the application of condition-based monitoring (CBM) techniques was performed on the Oak Ridge Mock Feed and Withdrawal (F&W) facility as a proof of principle. CBM techniques have been extensively developed for M&R assessment of physical processes, such as manufacturing and power plants. These techniques are normally used to locate and diagnose the effects of mechanical degradation of equipment to aid in planning of maintenance and repair cycles. In a safeguards environment, however, the goal is not to identify mechanical deterioration, but to detect and diagnose (and potentially predict) attempts to circumvent normal, declared facility operations, such as through protracted diversion of enriched material. The CBM techniques are first explained from the traditional perspective of maintenance and reliability engineering. The adaptation of CBM techniques to inspector monitoring is then discussed, focusing on the unique challenges of decision-based effects rather than equipment degradation effects. These techniques are then applied to the Oak Ridge Mock F&W facility, a water-based physical simulation of a material feed-and-withdrawal process used at enrichment plants, which is used to develop and test online monitoring techniques for fully information-driven safeguards of GCEPs. Advantages and limitations of the CBM approach to online monitoring are discussed, as well as the potential challenges of adapting CBM concepts to safeguards applications.
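
    The SPRT side of the diagnostic monitoring can be sketched for the textbook case of a Gaussian mean shift; the facility's actual process models and thresholds are not reproduced here.

    ```python
    from math import log

    def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
        """Wald's sequential probability ratio test for N(mu0, sigma) vs
        N(mu1, sigma). Returns ('H0'|'H1'|'continue', samples consumed)."""
        upper = log((1 - beta) / alpha)   # accept H1 (shift) above this
        lower = log(beta / (1 - alpha))   # accept H0 (normal) below this
        llr = 0.0
        for n, x in enumerate(samples, start=1):
            # Log-likelihood ratio increment for one observation.
            llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
            if llr >= upper:
                return "H1", n
            if llr <= lower:
                return "H0", n
        return "continue", len(samples)
    ```

    The appeal for unattended monitoring is that the test stops as soon as the evidence is decisive, rather than after a fixed batch of observations.
    
    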

  15. Evidence-based surgical techniques for caesarean section

    DEFF Research Database (Denmark)

    Aabakke, Anna J M; Secher, Niels Jørgen; Krebs, Lone

    2014-01-01

    Caesarean section (CS) is a common surgical procedure, and in Denmark 21% of deliveries are by CS. There is an increasing amount of scientific evidence to support the different surgical techniques used at CS. This article reviews the literature regarding CS techniques. There is still a lack

  16. A Technique for Volumetric CSG Based on Morphology

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Christensen, Niels Jørgen

    2001-01-01

    In this paper, a new technique for volumetric CSG is presented. The technique requires the input volumes to correspond to solids which fulfill a voxelization suitability criterion. Assume the CSG operation is union. The volumetric union of two such volumes is defined in terms of the voxelization...
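
    For volumes stored as signed-distance fields, the standard min/max formulation of volumetric CSG is a useful reference point; the paper's morphology-based definition and voxelization suitability criterion are not reproduced in this sketch.

    ```python
    import numpy as np

    # Signed-distance voxel volumes for two overlapping spheres on a 32^3 grid
    # (negative inside the solid). CSG then reduces to per-voxel min/max.
    n = 32
    ax = np.linspace(-1.0, 1.0, n)
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")

    def sphere(cx, cy, cz, r):
        return np.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) - r

    a = sphere(-0.3, 0.0, 0.0, 0.5)
    b = sphere(0.3, 0.0, 0.0, 0.5)

    union = np.minimum(a, b)
    intersection = np.maximum(a, b)
    difference = np.maximum(a, -b)   # a minus b

    def volume(sdf):
        """Count voxels inside the solid (sdf < 0)."""
        return int((sdf < 0).sum())
    ```

    The voxel counts obey inclusion-exclusion exactly, since the min/max operations act independently per voxel.
    
    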

  17. An Architecture for Intrusion Detection Based on an Extension of the Method of Remaining Elements

    Directory of Open Access Journals (Sweden)

    P. Velarde-Alvarado

    2010-08-01

    Full Text Available This paper introduces an Anomaly-based Intrusion Detection architecture based on behavioral traffic profiles created by using our enhanced version of the Method of Remaining Elements (MRE). This enhanced version includes: a redefinition of the exposure threshold through the entropy and cardinality of residual sequences, a dual characterization for two types of traffic slots, the introduction of the Anomaly Level Exposure (ALE) that gives a better quantification of anomalies for a given traffic slot and r-feature, an alternative support that extends its detection capabilities, and a new procedure to obtain the exposure threshold through an analysis of outliers on the training dataset. Regarding the original MRE, we incorporate the refinements outlined, resulting in a reliable method which gives improved sensitivity to the detection of a broader range of attacks. The experiments were conducted on the MIT-DARPA dataset and also on an academic LAN by implementing real attacks. The results show that the proposed architecture is effective in early detection of intrusions, as well as some kinds of attacks designed to bypass detection measures.
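
    The entropy side of the method can be illustrated with a toy r-feature: the Shannon entropy of a traffic slot's destination ports jumps sharply under a port scan. The fixed threshold below is a crude stand-in for the MRE's exposure threshold and ALE machinery.

    ```python
    from collections import Counter
    from math import log2

    def slot_entropy(values):
        """Shannon entropy (bits) of one traffic feature within a slot."""
        counts = Counter(values)
        n = len(values)
        return -sum(c / n * log2(c / n) for c in counts.values())

    def is_anomalous(values, threshold_bits):
        """Flag a slot whose feature entropy exceeds an exposure threshold."""
        return slot_entropy(values) > threshold_bits

    # Hypothetical destination-port r-feature for two traffic slots.
    normal_slot = [80, 443, 80, 22, 443, 80, 8080, 443]
    scan_slot = list(range(1000, 1064))   # port scan: 64 distinct ports
    ```

    In the actual architecture the threshold is derived from outlier analysis on a training dataset, and both high- and low-entropy deviations can be informative depending on the r-feature.
    
    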

  18. Consumer Decision-Making Styles Extension to Trust-Based Product Comparison Site Usage Model

    Directory of Open Access Journals (Sweden)

    Radoslaw Macik

    2016-09-01

    Full Text Available The paper describes an implementation of the extended consumer decision-making styles concept in explaining consumer choices made in a product comparison site environment, in the context of a trust-based information technology acceptance model. Previous research proved that the trust-based acceptance model is useful in explaining purchase intention and anticipated satisfaction in the product comparison site environment, an example of online decision shopping aids. Trust in such aids is important in explaining their usage by consumers. The connections between consumer decision-making styles, usage of product and seller opinions, cognitive and affective trust toward the online product comparison site, and choice outcomes (purchase intention and brand choice) are explored through structural equation models using the PLS-SEM approach, on a sample of 461 young consumers. The research confirmed the validity of the research model in explaining product comparison site usage, and some consumer decision-making styles influenced consumers' choices and purchase intention. Usage of product and seller reviews partially mediated the mentioned relationships.

  19. New Extensions of Pairing-based Signatures into Universal (Multi) Designated Verifier Signatures

    CERN Document Server

    Vergnaud, Damien

    2008-01-01

    The concept of universal designated verifier signatures was introduced by Steinfeld, Bull, Wang and Pieprzyk at Asiacrypt 2003. These signatures can be used as standard publicly verifiable digital signatures but have an additional functionality which allows any holder of a signature to designate the signature to any desired verifier. This designated verifier can check that the message was indeed signed, but is unable to convince anyone else of this fact. We propose new efficient constructions for pairing-based short signatures. Our first scheme is based on Boneh-Boyen signatures and its security can be analyzed in the standard security model. We prove its resistance to forgery assuming the hardness of the so-called strong Diffie-Hellman problem, under the knowledge-of-exponent assumption. The second scheme is compatible with the Boneh-Lynn-Shacham signatures and is proven unforgeable, in the random oracle model, under the assumption that the computational bilinear Diffie-Hellman problem is intractable. Both s...

  20. The Numerical Simulation of the Crack Elastoplastic Extension Based on the Extended Finite Element Method

    Directory of Open Access Journals (Sweden)

    Xia Xiaozhou

    2013-01-01

    Full Text Available In the frame of the extended finite element method, the exponent disconnected function is introduced to reflect the discontinuous characteristic of the crack, and a crack tip enrichment function made of triangular basis functions and a linear polar radius function is adopted to describe the displacement field distribution at the elastoplastic crack tip. The linear polar radius function form is chosen to decrease the singularity induced by the plastic yield zone at the crack tip, and the triangular basis function form is adopted to describe how the displacement varies with the polar angle at the crack tip. Based on the displacement model containing the above enrichment functions, the incremental iterative form of the elastoplastic extended finite element method is deduced from the virtual work principle. For non-uniformly hardening materials such as concrete, in order to avoid the asymmetry of the stiffness matrix induced by the non-associated flow of plastic strain, a plastic flow rule containing a cross item based on the least energy dissipation principle is adopted. Finally, some numerical examples show that the elastoplastic X-FEM constructed in this paper is valid.

  1. A Lossless Data Hiding Technique based on AES-DWT

    Directory of Open Access Journals (Sweden)

    Gustavo Fernández Torres

    2012-09-01

    Full Text Available In this paper we propose a new data hiding technique. The new technique uses steganography and cryptography on images with a size of 256x256 pixels and an 8-bit grayscale format. There are design restrictions such as a fixed-size cover image, and reconstruction without error of the hidden image. The steganography technique uses a Haar DWT (Discrete Wavelet Transform) with hard thresholding and the LSB (Least Significant Bit) technique on the cover image. The algorithms used for compressing and ciphering the secret image are lossless JPG and AES, respectively. The proposed technique is used to generate a stego image which provides a double layer of security and is robust against attacks. Results are reported for different threshold levels in terms of PSNR.
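    As a rough sketch of the two building blocks named above, one-level Haar DWT and LSB substitution, on a small integer-valued cover image; the AES and lossless-JPG stages, the thresholding step, and the exact embedding location used in the paper are omitted:

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar DWT: sums/differences along rows, then columns."""
    a = img[0::2, :] + img[1::2, :]
    d = img[0::2, :] - img[1::2, :]
    ll = (a[:, 0::2] + a[:, 1::2]) / 4.0   # approximation band
    lh = (a[:, 0::2] - a[:, 1::2]) / 4.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 4.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 4.0   # diagonal detail
    return ll, lh, hl, hh

def embed_lsb(cover, bits):
    """Replace the least significant bit of the first len(bits) pixels."""
    flat = cover.flatten()                 # flatten() returns a copy
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.array(bits, dtype=flat.dtype)
    return flat.reshape(cover.shape)

def extract_lsb(stego, n):
    return [int(b) for b in stego.flatten()[:n] & 1]

cover = np.arange(64, dtype=np.uint8).reshape(8, 8)
ll, lh, hl, hh = haar2d(cover.astype(float))   # detail bands would be thresholded here
secret = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, secret)
print(extract_lsb(stego, 8))               # → [1, 0, 1, 1, 0, 0, 1, 0]
```

    Each embedded bit perturbs a pixel by at most one intensity level, which is why LSB schemes are evaluated with PSNR.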

  2. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2016-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.
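    MBIR in general poses reconstruction as minimization of a data-fit cost plus a prior. The sketch below uses a generic linear forward model and plain gradient descent purely to illustrate the idea; the paper's acoustic forward model and prior are far more elaborate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model A (standing in for a discretized acoustic
# propagation model), true image x_true, and noisy measurements y.
# MBIR-style cost: C(x) = ||y - A x||^2 + lam * ||x||^2  (data fit + simple prior)
A = rng.normal(size=(80, 40))
x_true = rng.normal(size=40)
y = A @ x_true + 0.01 * rng.normal(size=80)

def mbir(A, y, lam=0.1, step=None, iters=500):
    """Gradient-descent sketch of model-based iterative reconstruction."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # safe step from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y) + lam * x      # gradient of the cost C(x)
        x -= step * grad
    return x

x_hat = mbir(A, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

    The relative reconstruction error stays small because the quadratic prior only mildly biases the solution; real MBIR systems replace both the forward model and the prior with physics- and image-appropriate choices.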

  3. Development of acoustic model-based iterative reconstruction technique for thick-concrete imaging

    Science.gov (United States)

    Almansouri, Hani; Clayton, Dwight; Kisner, Roger; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2016-02-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.

  4. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2015-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.

  5. An Agent-based Extensible Climate Control System for Sustainable Greenhouse Production

    DEFF Research Database (Denmark)

    Sørensen, Jan Corfixen; Jørgensen, Bo Nørregaard; Klein, Mark

    2011-01-01

    The slow adoption pace of new control strategies for sustainable greenhouse climate control by industrial growers is mainly due to the complexity of identifying and resolving potentially conflicting climate control requirements. In this paper, we present a multi-agent-based climate control system ... that allows new requirements to be added without any need to identify or resolve conflicts beforehand. This is achieved by representing the climate control requirements as separate agents. Identifying and solving conflicts now become a negotiation problem among agents sharing the same controlled environment. ... Negotiation is done using a novel multi-issue negotiation protocol that uses a generic algorithm to find an optimized solution within the search space. The multi-agent control system has been empirically evaluated in an ornamental floriculture research facility in Denmark. The evaluation showed ...

  6. Size measurement of gold and silver nanostructures based on their extinction spectrum: limitations and extensions

    Directory of Open Access Journals (Sweden)

    A A Ashkarran

    2013-09-01

    Full Text Available This paper reports on the physical principles and the relations between the extinction cross section and geometrical properties of silver and gold nanostructures. We introduce some simple relations for determining geometrical properties of silver and gold nanospheres based on the position of their plasmonic peak. We also applied, investigated and compared the accuracy of these relations using other published works, in order to clarify the effects of shape, size distribution and refractive index of the particles' embedding medium. Finally, we extended the equations to non-spherical particles and investigated their accuracy. We found that modified forms of the equations may lead to more exact results for non-spherical metal particles, but for better results, the modified equations should depend on the shape and size distribution of the particles. It seems that these equations are not applicable to particles with corners sharper than cubes' corners, i.e., nanostructures with solid angles less than π/2 sr.

  7. Research Extension and Education Programs on Bio-based Energy Technologies and Products

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Sam [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station; Harper, David [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station; Womac, Al [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station

    2010-03-02

    The overall objectives of this project were to provide enhanced educational resources for the general public, educational and development opportunities for University faculty in the Southeast region, and enhanced research knowledge concerning biomass preprocessing and deconstruction. All of these efforts combine to create a research and education program that enhances the biomass-based industries of the United States. This work was broken into five primary objective areas: • Task A - Technical research in the area of biomass preprocessing, analysis, and evaluation. • Tasks B&C - Technical research in the areas of Fluidized Beds for the Chemical Modification of Lignocellulosic Biomass and Biomass Deconstruction and Evaluation. • Task D - Analyses for the non-scientific community that provide a comprehensive analysis of the current state of biomass supply, demand, technologies, markets and policies; identify a set of feasible alternative paths for biomass industry development; and quantify the impacts associated with each alternative path. • Task E - Efforts to build research capacity and develop partnerships through faculty fellowships with DOE national labs. The research and education programs conducted through this grant have led to three primary results. They include: • A better knowledge base related to and understanding of biomass deconstruction, through both mechanical size reduction and chemical processing. • A better source of information related to biomass, bioenergy, and bioproducts for researchers and general public users through the BioWeb system. • Stronger research ties between land-grant universities and DOE National Labs through the faculty fellowship program. In addition to the scientific knowledge and resources developed, funding through this program produced a minimum of eleven (11) scientific publications and contributed to the research behind at least one patent.

  8. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Hamsapriya

    2011-12-01

    Full Text Available E-mail is one of the most popular and frequently used ways of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today's Internet, bringing financial damage to companies and annoying individual users. Spam emails invade users without their consent and fill their mail boxes. They consume more network capacity as well as time in checking and deleting spam mails. The vast majority of Internet users are outspoken in their disdain for spam, although enough of them respond to commercial offers that spam remains a viable source of income to spammers. While most users want to do the right thing to avoid and get rid of spam, they need clear and simple guidelines on how to behave. In spite of all the measures taken to eliminate spam, it is not yet eradicated. Also, when the countermeasures are oversensitive, even legitimate emails will be eliminated. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has been centered on the more sophisticated classifier-related issues. In recent years, machine learning for spam classification has become an important research issue. This work explores and identifies the effectiveness of different learning algorithms for classifying spam messages from e-mail. A comparative analysis among the algorithms has also been presented.
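    A minimal instance of the classifier-based filtering discussed above is a multinomial naive Bayes spam filter, one of the standard learning algorithms in such comparisons; the tiny training set here is invented for illustration:

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label) pairs; returns word counts and doc totals per class."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for text, label in docs:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Multinomial naive Bayes with Laplace (add-one) smoothing."""
    vocab = set(counts["spam"]) | set(counts["ham"])
    best_label, best_score = None, float("-inf")
    for label in ("spam", "ham"):
        score = math.log(totals[label] / sum(totals.values()))  # class prior
        denom = sum(counts[label].values()) + len(vocab)
        for w in text.lower().split():
            score += math.log((counts[label][w] + 1) / denom)   # word likelihood
        if score > best_score:
            best_label, best_score = label, score
    return best_label

docs = [
    ("win free money now", "spam"),
    ("cheap pills free offer", "spam"),
    ("meeting agenda for monday", "ham"),
    ("lunch with the project team", "ham"),
]
counts, totals = train(docs)
print(classify("free money offer", counts, totals))  # → spam
```

    Laplace smoothing keeps unseen words from zeroing out a class probability, which is the usual failure mode of an unsmoothed word-count classifier.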

  9. Drying techniques for the visualisation of agarose-based chromatography media by scanning electron microscopy.

    Science.gov (United States)

    Nweke, Mauryn C; Turmaine, Mark; McCartney, R Graham; Bracewell, Daniel G

    2017-03-01

    The drying of chromatography resins prior to scanning electron microscopy is critical to image resolution and hence understanding of the bead structure at sub-micron level. Achieving suitable drying conditions is especially important with agarose-based chromatography resins, as over-drying may cause artefact formation, bead damage and alterations to ultrastructural properties; and under-drying does not provide sufficient resolution for visualization under SEM. This paper compares and contrasts the effects of two drying techniques, critical point drying and freeze drying, on the morphology of two agarose based resins (MabSelect™/dw ≈85 µm and Capto™ Adhere/dw ≈75 µm) and provides a complete method for both. The results show that critical point drying provides better drying and subsequently clearer ultrastructural visualization of both resins under SEM. Under this protocol both the polymer fibers (thickness ≈20 nm) and the pore sizes (diameter ≈100 nm) are clearly visible. Freeze drying is shown to cause bead damage to both resins, but to different extents. MabSelect resin encounters extensive bead fragmentation, whilst Capto Adhere resin undergoes partial bead disintegration, corresponding with the greater extent of agarose crosslinking and strength of this resin. While freeze drying appears to be the less favorable option for ultrastructural visualization of chromatography resin, it should be noted that the extent of fracturing caused by the freeze drying process may provide some insight into the mechanical properties of agarose-based chromatography media.

  10. Evaluation of a school-based diabetes education intervention, an extension of Program ENERGY

    Science.gov (United States)

    Conner, Matthew David

    Background: The prevalence of both obesity and type 2 diabetes in the United States has increased over the past two decades and rates remain high. The latest data from the National Center for Health Statistics estimates that 36% of adults and 17% of children and adolescents in the US are obese (CDC Adult Obesity, CDC Childhood Obesity). Being overweight or obese greatly increases one's risk of developing several chronic diseases, such as type 2 diabetes. Approximately 8% of adults in the US have diabetes; type 2 diabetes accounts for 90-95% of these cases. Type 2 diabetes in children and adolescents is still rare; however, clinical reports suggest an increase in the frequency of diagnosis (CDC Diabetes Fact Sheet, 2011). Results from the Diabetes Prevention Program show that the incidence of type 2 diabetes can be reduced through the adoption of a healthier lifestyle among high-risk individuals (DPP, 2002). Objectives: This classroom-based intervention included scientific coverage of energy balance, diabetes, diabetes prevention strategies, and diabetes management. Coverage of diabetes management topics was included in lesson content to further the students' understanding of the disease. Measurable short-term goals of the intervention included increases in: general diabetes knowledge, diabetes management knowledge, and awareness of type 2 diabetes prevention strategies. Methods: A total of 66 sixth grade students at Tavelli Elementary School in Fort Collins, CO completed the intervention. The program consisted of nine classroom-based lessons; students participated in one lesson every two weeks. The lessons were delivered from November of 2005 to May of 2006. Each bi-weekly lesson included a presentation and interactive group activities. Participants completed two diabetes knowledge questionnaires at baseline and post-intervention. A diabetes survey developed by Program ENERGY measured general diabetes knowledge and awareness of type 2 diabetes prevention strategies.

  11. Viable yet Protected for Future Generations? An Examination of the Extensive Forest-Based Tourism Market

    Directory of Open Access Journals (Sweden)

    Hana Sakata

    2012-12-01

    Full Text Available This article focuses on forest tourism, and rainforests in particular, and explores their potential to contribute to the global tourism industry. The specific objectives of the study were to develop a profile, including motivations, of tourists visiting the Wet Tropics rainforest of Australia and to identify previous patterns of forest visitation in both Australia and other global destinations. A survey of 1,408 visitors conducted at a number of Wet Tropics rainforest sites in the tropical north region of Australia found that over 37% of the sample had previously visited forests while on holidays, indicating that forest-based tourism is a major component of the nature-based market. Countries and forested sites in South-East Asia were the most popular holiday attractions, with over 13% of respondents having visited these sites. This was followed by countries of the South Pacific, North America, South America, Central America, Africa, South Asia and China, the Caribbean and Europe. While overall this is a promising result, forest-based tourism faces a number of pressures including urban settlement, extractive industries and, in the near future, climate change. Keywords: forests; rainforests; nature-based tourism; Tropical North Queensland; Wet Tropics rainforest.

  12. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings.

    Science.gov (United States)

    Ghayvat, Hemant; Mukhopadhyay, Subhas; Gui, Xiang; Suryadevara, Nagender

    2015-05-04

    Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home environment and performs his or her daily routine in order to live a long and healthy life. We initiate the research with the development of the smart home approach and implement it in different home conditions (different houses) to monitor the activity of an inhabitant for wellness detection. Additionally, our research extends the smart home system to smart buildings and models the design issues related to the smart building environment; these design issues are linked with system performance and reliability. This research paper also discusses and illustrates the possible mitigation to handle the ISM band interference and attenuation losses without compromising optimum system performance.

  13. Bio-inspired computational techniques based on advanced condition monitoring

    Institute of Scientific and Technical Information of China (English)

    Su Liangcheng; He Shan; Li Xiaoli; Li Xinglin

    2011-01-01

    The application of bio-inspired computational techniques to the field of condition monitoring is addressed. First, bio-inspired computational techniques are briefly introduced, and the advantages and disadvantages of these computational methods are made clear. Then, the roles of condition monitoring in predictive maintenance and failure prediction, and the development trends of condition monitoring, are discussed. Finally, a case study on the condition monitoring of a grinding machine is described, which shows the application of a bio-inspired computational technique to a practical condition monitoring system.

  14. CORE: Common Region Extension Based Multiple Protein Structure Alignment for Producing Multiple Solution

    Institute of Scientific and Technical Information of China (English)

    Woo-Cheol Kim; Sanghyun Park; Jung-Im Won

    2013-01-01

    Over the past several decades, biologists have conducted numerous studies examining both general and specific functions of proteins. Generally, if similarities in either the structure or sequence of amino acids exist for two proteins, then a common biological function is expected. Protein function is determined primarily based on the structure rather than the sequence of amino acids. An algorithm for protein structure alignment is an essential tool for this research. The quality of the algorithm depends on the quality of the similarity measure that is used, and the similarity measure is an objective function used to determine the best alignment. However, none of the existing similarity measures has become a gold standard because of their individual strengths and weaknesses. They require excessive filtering to find a single alignment. In this paper, we introduce a new strategy that finds not a single alignment, but multiple alignments with different lengths. This method has the obvious benefit of high-quality alignment. However, this novel method leads to a new problem: its running time is considerably longer than that of methods that find only a single alignment. To address this problem, we propose algorithms that can locate a common region (CORE) of multiple alignment candidates, and can then extend the CORE into multiple alignments. Because the CORE can be defined from a final alignment, we introduce CORE*, which is similar to CORE, and propose an algorithm to identify the CORE*. By adopting CORE* and dynamic programming, our proposed method produces multiple alignments of various lengths with higher accuracy than previous methods. In the experiments, the alignments identified by our algorithm are longer than those obtained by TM-align by 17% and 15.48%, on average, when the comparison is conducted at the level of super-family and fold, respectively.

  15. A New Image Steganography Based On First Component Alteration Technique

    CERN Document Server

    Kaur, Amanpreet; Sikka, Geeta

    2010-01-01

    In this paper, a new image steganography scheme is proposed which is a kind of spatial domain technique. In order to hide secret data in a cover image, the first component alteration technique is used. Techniques used so far focus only on two or four bits of a pixel in an image (at most five bits at the edge of an image), which results in a lower peak signal-to-noise ratio and a high root mean square error. In this technique, the 8 bits of the blue components of pixels are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To prove this scheme, several experiments were performed, and the experimental results are compared with related previous works.

  16. A New Image Steganography Based On First Component Alteration Technique

    Directory of Open Access Journals (Sweden)

    Amanpreet Kaur

    2009-12-01

    Full Text Available In this paper, a new image steganography scheme is proposed which is a kind of spatial domain technique. In order to hide secret data in a cover image, the first component alteration technique is used. Techniques used so far focus only on two or four bits of a pixel in an image (at most five bits at the edge of an image), which results in a lower peak signal-to-noise ratio and a high root mean square error. In this technique, the 8 bits of the blue components of pixels are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To prove this scheme, several experiments were performed, and the experimental results are compared with related previous works. Keywords: image; mean square error; peak signal-to-noise ratio; steganography.
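    A minimal sketch of the core idea above, writing secret bytes into the blue component of successive pixels; the array shapes and helper names are illustrative, not taken from the paper:

```python
import numpy as np

def embed_blue(cover_rgb, secret_bytes):
    """Write secret bytes directly into the blue component of successive pixels.
    All 8 blue bits carry payload, as in the first-component-alteration idea."""
    stego = cover_rgb.copy()
    flat_blue = stego[:, :, 2].reshape(-1)          # copy of the blue plane
    data = np.frombuffer(bytes(secret_bytes), dtype=np.uint8)
    flat_blue[:len(data)] = data
    stego[:, :, 2] = flat_blue.reshape(stego.shape[:2])
    return stego

def extract_blue(stego_rgb, n):
    return bytes(stego_rgb[:, :, 2].reshape(-1)[:n])

cover = np.zeros((4, 4, 3), dtype=np.uint8)         # tiny RGB cover image
secret = b"hi!"
stego = embed_blue(cover, secret)
print(extract_blue(stego, len(secret)))  # → b'hi!'
```

    Since the whole blue byte carries payload, capacity is 8 bits per pixel, at the cost of visible blue-channel distortion on natural images; that trade-off is what the PSNR comparisons in the paper quantify.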

  17. SNMP Based Network Optimization Technique Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    M. Mohamed Surputheen

    2012-03-01

    Full Text Available Genetic Algorithms (GAs) have innumerable applications through optimization techniques, and network optimization is one of them. SNMP (Simple Network Management Protocol) is used as the basic network protocol for monitoring network activity and the health of systems. This paper deals with adding intelligence to various aspects of SNMP by adding optimization techniques derived from genetic algorithms, which enhances the performance of SNMP processes like routing.
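    As an illustration of GA-based optimization applied to a network metric, the sketch below evolves a low-latency node visiting order over a hypothetical latency matrix; the SNMP polling that would supply the latencies is not shown:

```python
import random

random.seed(1)

# Hypothetical inter-node link latencies (e.g., gathered by SNMP polling of
# managed devices); the GA searches for a low-latency visiting order.
N = 6
lat = [[0 if i == j else random.randint(1, 20) for j in range(N)] for i in range(N)]

def cost(route):
    return sum(lat[route[i]][route[i + 1]] for i in range(N - 1))

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in b's order."""
    i, j = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[i:j] = a[i:j]
    rest = [g for g in b if g not in child]
    for idx in range(N):
        if child[idx] is None:
            child[idx] = rest.pop(0)
    return child

def mutate(route, rate=0.2):
    if random.random() < rate:
        i, j = random.sample(range(N), 2)
        route[i], route[j] = route[j], route[i]
    return route

population = [random.sample(range(N), N) for _ in range(30)]
initial_best = min(cost(r) for r in population)
for _ in range(100):
    population.sort(key=cost)
    elite = population[:10]                         # elitism keeps the best routes
    population = elite + [mutate(crossover(*random.sample(elite, 2)))
                          for _ in range(20)]
best = min(population, key=cost)
print(best, cost(best))
```

    Elitism guarantees the best route never worsens between generations, so the final cost is at most the best cost in the initial random population.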

  18. EDM COLLABORATIVE MANUFACTURING SYSTEM BASED ON MULTI-AGENT TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    Zhao Wansheng; Zhao Jinzhi; Song Yinghui; Yang Xiaodong

    2003-01-01

    A framework for building an EDM collaborative manufacturing system using multi-agent technology is proposed, to support organizations characterized by physically distributed, enterprise-wide, heterogeneous intelligent manufacturing systems over the Internet. Expert system theory is introduced. Design, manufacturing and technological knowledge are shared using artificial intelligence and web techniques by the EDM-CADagent, EDM-CAMagent and EDM-CAPPagent. The system structure, design process, network conditions, realization methods and other key techniques are discussed. Instances are also introduced to verify feasibility.

  19. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    Science.gov (United States)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy must change, because the current imitational gravitational frontal isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water - a global deficit of up to 4-15 times - and adverse effects on soils and landscapes. Current methods of irrigation do not control the spread of water throughout the soil continuum. Preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost into the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for Sustainable Natural Resources Management. The new paradigm of irrigation is based on the intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. An individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder position in the soil is at a depth from 10 to 30 cm. The diameter of the cylinder is 1-2 cm. Within 5-10 min after injection, the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The soil watering cylinder position in the soil profile is at a depth of 5-50 cm, and the diameter of the cylinder is 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection, the structure of the soil in the cylinder restores quickly because there is no compression from the stable adjoining volume of soil, and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential

  20. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    Science.gov (United States)

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the retention of denture bases fabricated by conventional, anchorized, and injection-molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate, with a retention apparatus, the force required to dislodge them. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection-molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection-molding polymerization technique exhibited maximum retention, followed by the anchorized technique; the least retention was seen with the conventional molding technique. PMID:27382542

  1. An Empirical Comparative Study of Checklist-based and Ad Hoc Code Reading Techniques in a Distributed Groupware Environment

    Directory of Open Access Journals (Sweden)

    Adenike O. Osofisan

    2009-09-01

    Full Text Available Software inspection is a necessary and important tool for software quality assurance. Since it was introduced by Fagan at IBM in 1976, arguments have existed as to which method should be adopted to carry out the exercise, whether it should be paper-based or tool-based, and what reading technique should be used on the inspection document. Extensive work has been done to determine the effectiveness of reviewers in a paper-based environment when using ad hoc and checklist reading techniques. In this work, we take software inspection research further by examining whether there is any significant difference in the defect detection effectiveness of reviewers when they use either ad hoc or checklist reading techniques in a distributed groupware environment. Twenty final-year undergraduate students of computer science, divided into ad hoc and checklist reviewer groups of ten members each, were employed to inspect a medium-sized Java code synchronously on groupware deployed on the Internet. The data obtained were subjected to tests of hypotheses using independent t-tests and correlation coefficients. Results from the study indicate that there are no significant differences in the defect detection effectiveness, effort in terms of time taken in minutes, and false positives reported by the reviewers using either ad hoc or checklist-based reading techniques in the distributed groupware environment studied. Key words: Software Inspection, Ad hoc, Checklist, Groupware.

  2. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Alexandra [Instituto Portugues de Oncologia Francisco Gentil, Centro de Lisboa, Servico de Radiologia, Rua Professor Lima Basto, 1093 Lisboa Codex (Portugal)], E-mail: borgesalexandra@clix.pt

    2008-06-15

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have contributed largely to improving the prognosis and lessening the morbidity and mortality of patients with skull base tumours, and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging has become indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review will focus on advances in imaging technique, their contribution to patient management, and the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division taking into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed.

  3. Galois Field Based Very Fast and Compact Error Correcting Technique

    Directory of Open Access Journals (Sweden)

    Alin Sindhu.A,

    2014-01-01

    Full Text Available As technology improves, memory devices become larger and more powerful error correction codes are needed. Error correction codes are commonly used to protect memories from soft errors, which change the logical value of memory cells without damaging the circuit. These codes can correct a large number of errors, but generally require complex decoders. To avoid this decoding complexity, this work uses Euclidean geometry LDPC codes with a one-step majority-logic decoding technique. The method detects words having errors in the first iteration of the majority-logic decoding process and reduces the decoding time by stopping the decoding process when no errors are detected, which also reduces the memory access time. The results obtained with this technique also prove that it is an effective and compact error correcting technique.

  4. A Monte-Carlo based extension of the Meteor Orbit and Trajectory Software (MOTS) for computations of orbital elements

    Science.gov (United States)

    Albin, T.; Koschny, D.; Soja, R.; Srama, R.; Poppe, B.

    2016-01-01

    The Canary Islands Long-Baseline Observatory (CILBO) is a double-station meteor camera system (Koschny et al., 2013; Koschny et al., 2014) that consists of 5 cameras. The two cameras considered in this report, ICC7 and ICC9, are installed on Tenerife and La Palma. They point to the same atmospheric volume between both islands, allowing stereoscopic observation of meteors. Since its installation in 2011 and the start of operation in 2012, CILBO has detected over 15000 simultaneously observed meteors. Koschny and Diaz (2002) developed the Meteor Orbit and Trajectory Software (MOTS) to compute the trajectory of such meteors. The software uses the astrometric data from the detection software MetRec (Molau, 1998) and determines the trajectory in geodetic coordinates. This work presents a Monte-Carlo based extension of the MOTS code to compute the orbital elements of simultaneously detected meteors by CILBO.
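The core Monte-Carlo idea is to propagate the noise of the astrometric measurements into the derived quantities by repeated resampling. The sketch below illustrates the principle with a deliberately toy derived quantity (speed = distance/time) and made-up measurement uncertainties; it is not the MOTS pipeline, only the resampling pattern it extends MOTS with.

```python
import math
import random

def monte_carlo_speed(dist_km, dist_sigma, time_s, time_sigma,
                      n=20000, seed=1):
    """Propagate Gaussian measurement noise through a derived quantity
    (here simply speed = distance / time) by repeated resampling.
    Returns the sample mean and sample standard deviation."""
    rng = random.Random(seed)
    samples = [rng.gauss(dist_km, dist_sigma) / rng.gauss(time_s, time_sigma)
               for _ in range(n)]
    mean = sum(samples) / n
    std = math.sqrt(sum((s - mean) ** 2 for s in samples) / (n - 1))
    return mean, std

# Hypothetical inputs: 100 +/- 1 km travelled in 2.00 +/- 0.02 s
mean, std = monte_carlo_speed(100.0, 1.0, 2.0, 0.02)
```

For orbital elements the derived quantity is the output of the full trajectory and orbit computation, but the surrounding loop is the same: perturb the inputs, recompute, and read the element distributions off the samples.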

  5. Lepton mass and mixing in a simple extension of the Standard Model based on T7 flavor symmetry

    CERN Document Server

    Vien, V V

    2016-01-01

    A simple Standard Model extension based on $T_7$ flavor symmetry which accommodates lepton mass and mixing with non-zero $\\theta_{13}$ and CP violation phase is proposed. At tree level, the realistic lepton mass and mixing pattern is derived through spontaneous symmetry breaking by just one vacuum expectation value ($v$), the same as in the Standard Model. Neutrinos get small masses from one $SU(2)_L$ doublet and two $SU(2)_L$ singlets, with one being in $\\underline{1}$ and the other two in $\\underline{3}$ and $\\underline{3}^*$ under $T_7$, respectively. The model also gives a remarkable prediction of the Dirac CP violation phase $\\delta_{CP}=172.598^\\circ$ in both the normal and inverted hierarchies, which is still missing in the neutrino mixing matrix.

  6. The Assessment of Comprehensive Vulnerability of Chemical Industrial Park Based on Entropy Method and Matter-element Extension Model

    Directory of Open Access Journals (Sweden)

    Yan Jingyi

    2016-01-01

    Full Text Available Based on an in-depth analysis of relevant research results in China and abroad, this paper studies the connotative meaning, evaluation methods and models of vulnerability for chemical industrial parks. It summarizes the features of threat-related and structural vulnerability and proposes detailed influencing factors such as personnel vulnerability, infrastructural vulnerability, environmental vulnerability and the vulnerability arising from safety management failures. Using a vulnerability scoping diagram, 21 evaluation indexes and an index system for the vulnerability evaluation of a chemical industrial park are established. The comprehensive weights are calculated with the entropy method and combined with a matter-element extension model to make the quantitative evaluation, which is then successfully applied to evaluate a chemical industrial park. This method provides new ideas and ways for enhancing the overall safety of chemical industrial parks.
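The entropy-weighting step mentioned above can be sketched directly. This is the standard entropy weight method only (the matter-element extension part is not shown), and the decision-matrix values in the usage example are hypothetical.

```python
import math

def entropy_weights(matrix):
    """Entropy weight method. Rows are alternatives (e.g. parks),
    columns are indicators; values are assumed positive.
    Indicators whose values vary more across alternatives carry
    more information (lower entropy) and receive larger weights."""
    n = len(matrix)                       # number of alternatives
    m = len(matrix[0])                    # number of indicators
    divergences = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        # proportion of alternative i under indicator j
        p = [v / total for v in col]
        # normalized entropy of indicator j (0*ln 0 treated as 0)
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        divergences.append(1.0 - e)       # degree of divergence
    total_d = sum(divergences)
    return [d / total_d for d in divergences]

# Hypothetical 2-alternative, 2-indicator matrix: the first indicator is
# identical across alternatives, so all weight goes to the second.
w = entropy_weights([[1, 2], [1, 4]])
```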

  7. Chain extension and branching of poly(L-lactic acid) produced by reaction with a DGEBA-based epoxy resin

    Directory of Open Access Journals (Sweden)

    2007-11-01

    Full Text Available Dicarboxylated poly(L-lactic acid) (PLLA) was synthesized by reacting succinic anhydride with an L-lactic acid prepolymer prepared by melt polycondensation. Copolymers of PLLA and an epoxy resin based on the diglycidyl ether of bisphenol A (DGEBA) were prepared by chain extension of dicarboxylated PLLA with DGEBA. Infrared spectra confirmed the formation of the dicarboxylated PLLA and the PLLA/DGEBA copolymer. The influences of reaction temperature, reaction time, and the amount of DGEBA on the molecular weight and gel content of the PLLA/DGEBA copolymer were studied. The viscosity-average molecular weight of the PLLA/DGEBA copolymer reached 87 900 when the reaction temperature, reaction time, and molar ratio of dicarboxylated PLLA to DGEBA were 150°C, 30 min, and 1:1, respectively, while the gel content of the copolymer was almost zero.

  8. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used to estimate the top and base heights of clouds. In this article we present a novel method that combines ground-based thermal imaging with a sounded wind profile in order to derive the cloud base height. The method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, so upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as the LCL, this method is not limited to boundary layer clouds and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds would then not be observed).
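The LCL cross-check mentioned above has a well-known back-of-the-envelope form, Espy's rule of roughly 125 m of lifting per degree Celsius of dewpoint depression. The one-liner below is only this simplified approximation, not the authors' validation procedure.

```python
def lcl_height_m(temp_c, dewpoint_c):
    """Espy's approximation of the lifted condensation level, in metres
    above ground: about 125 m per degC of dewpoint depression.
    Valid only as a rough estimate for boundary-layer cloud bases."""
    return 125.0 * (temp_c - dewpoint_c)

# Surface air at 25 degC with a 15 degC dewpoint gives a ~1250 m cloud base.
base = lcl_height_m(25.0, 15.0)
```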

  9. RP-based Abrading Technique for Graphite EDM Electrode

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Traditional processes for machining mold cavities are lengthy and costly. EDM (electro-discharge machining) is the most commonly used technique to obtain complex mold cavities. However, some electrodes are difficult to fabricate because of the complexity. Applying RP (rapid prototyping) technology to fabricate an abrading tool which is used to abrade graphite EDM electrodes, the cost and cycle time can greatly be reduced. The paper describes the work being conducted in this area by the authors. This technique will find widespread application in rapid steel mold manufacturing.

  10. Review and Extension of CO₂-Based Methods to Determine Ventilation Rates with Application to School Classrooms.

    Science.gov (United States)

    Batterman, Stuart

    2017-02-04

    The ventilation rate (VR) is a key parameter affecting indoor environmental quality (IEQ) and the energy consumption of buildings. This paper reviews the use of CO₂ as a "natural" tracer gas for estimating VRs, focusing on applications in school classrooms. It provides details and guidance for the steady-state, build-up, decay and transient mass balance methods. An extension to the build-up method and an analysis of the post-exercise recovery period that can increase CO₂ generation rates are presented. Measurements in four mechanically-ventilated school buildings demonstrate the methods and highlight issues affecting their applicability. VRs during the school day fell below recommended minimum levels, and VRs during evening and early morning were on the order of 0.1 h⁻¹, reflecting shutdown of the ventilation systems. The transient mass balance method was the most flexible and advantageous method given the low air change rates and dynamic occupancy patterns observed in the classrooms. While the extension to the build-up method improved stability and consistency, the accuracy of this and the steady-state method may be limited. Decay-based methods did not reflect the VR during the school day due to heating, ventilation and air conditioning (HVAC) system shutdown. Since the number of occupants in classrooms changes over the day, the VR expressed on a per person basis (e.g., L·s⁻¹·person⁻¹) depends on the occupancy metric. If occupancy measurements can be obtained, then the transient mass balance method likely will provide the most consistent and accurate results among the CO₂-based methods. Improved VR measurements can benefit many applications, including research examining the linkage between ventilation and health.
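Of the methods listed, the decay method is the simplest to state: after occupants leave, the excess CO₂ above the outdoor level decays exponentially at the air change rate. A minimal sketch, assuming a well-mixed zone and first-order decay (the readings in the example are hypothetical):

```python
import math

def air_change_rate(c0_ppm, c1_ppm, c_out_ppm, hours):
    """Decay-method air change rate (h^-1) from two CO2 readings taken
    `hours` apart after the CO2 source (occupants) is removed.
    Model: C(t) - C_out = (C0 - C_out) * exp(-lambda * t)."""
    return math.log((c0_ppm - c_out_ppm) / (c1_ppm - c_out_ppm)) / hours

# Hypothetical evening measurements: 1400 ppm falling toward a
# 400 ppm outdoor background over 2 h.
lam = air_change_rate(1400.0, 768.0, 400.0, 2.0)
```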

  11. Lagrangian study of surface transport in the Kuroshio Extension area based on simulation of propagation of Fukushima-derived radionuclides

    CERN Document Server

    Prants, S V; Uleysky, M Yu

    2013-01-01

    A Lagrangian approach is applied to study near-surface large-scale transport in the Kuroshio Extension area using a simulation with synthetic particles advected by the AVISO altimetric velocity field. A material-line technique is applied to find the origin of water masses in cold-core cyclonic rings pinched off from the jet in summer 2011. Tracking and Lagrangian maps provide evidence of cross-jet transport. Fukushima-derived caesium isotopes are used as Lagrangian tracers to study transport and mixing in the area a few months after the March 2011 tsunami that caused heavy damage to the Fukushima nuclear power plant (FNPP). Tracking maps are computed to trace the origin of water parcels with measured levels of Cs-134 and Cs-137 concentrations collected in two R/V cruises in June and July 2011 in a large area of the Northwest Pacific. It is shown that Lagrangian simulation is useful for finding surface areas that are potentially dangerous due to the risk of radioactive contamination. The results of sim...

  12. Response Time Comparisons among Four Base Running Starting Techniques in Slow Pitch Softball.

    Science.gov (United States)

    Israel, Richard G.; Brown, Rodney L.

    1981-01-01

    Response times among four starting techniques (cross-over step, jab step, standing sprinter's start, and momentum start) were compared. The results suggest that the momentum start was the fastest starting technique for optimum speed in running bases. (FG)

  13. A Predicate Based Fault Localization Technique Based On Test Case Reduction

    Directory of Open Access Journals (Sweden)

    Rohit Mishra

    2015-08-01

    Full Text Available ABSTRACT In today's world, software testing with statistical fault localization techniques is one of the most tedious, expensive and time-consuming activities. In a faulty program, contrasting the dynamic spectra of program elements estimates the location of the fault. These techniques may be negatively impacted by coincidental correctness: the fault can also be triggered in a non-failed run, and if so it disturbs the assessment of the fault location, so such confounding runs must be recognized to preserve accuracy. In this paper, coincidental correctness is treated as an interference that affects the success of fault localization. We identify fault-relevant predicates by the distribution overlap of dynamic spectra in failed and non-failed runs, and narrow the search area by referencing the inter-class distances of spectra to prune the less suspicious candidates. We then apply a coverage-matrix-based reduction approach to reduce the test cases of the program and locate the fault. Finally, empirical results show that our technique outperforms previously existing predicate-based fault localization techniques with test case reduction.
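Statistical fault localization of this kind ranks program elements by contrasting their coverage in failed and passed runs. As a hedged illustration only (the paper's own predicate ranking and reduction steps differ), the widely used Ochiai suspiciousness metric computed from a coverage matrix looks like this:

```python
import math

def ochiai(failed_cov, passed_cov, total_failed):
    """Ochiai suspiciousness of one program element.
    failed_cov: number of failing runs that cover the element
    passed_cov: number of passing runs that cover the element
    total_failed: total number of failing runs"""
    covered = failed_cov + passed_cov
    if failed_cov == 0 or covered == 0:
        return 0.0
    return failed_cov / math.sqrt(total_failed * covered)

# An element covered by every failing run and no passing run is
# maximally suspicious; one covered equally by both is less so.
top = ochiai(4, 0, 4)
mixed = ochiai(2, 2, 4)
```

Coincidental correctness shows up here as passing runs that cover the faulty element, inflating `passed_cov` and deflating the score, which is exactly the confounding effect the paper tries to remove.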

  14. DSPI system based on spatial carrier phase shifting technique

    Science.gov (United States)

    Wang, Yonghong; Li, Junrui; Sun, Jianfei; Yang, Lianxiang

    2013-10-01

    Digital Speckle Pattern Interferometry (DSPI) is an optical method for measuring small displacements and deformations. It allows whole-field, non-contacting measurement of micro-deformation. Traditionally, temporal phase shifting has been used for quantitative analyses in DSPI. That technique requires the recording of at least three phase-shifted interferograms, which must be taken sequentially. This can lead to disturbances from thermal and mechanical fluctuations during the required recording time, and fast object deformations cannot be detected. In this paper a DSPI system using the Spatial Carrier Phase Shifting (SCPS) technique is introduced, which can extract quantitative displacement data from only two interferograms. The sensitivity direction of the system is determined by the illumination and observation directions. The frequency of the spatial carrier relates to the angle between the reference beam and the observation direction. A Fourier transform is used in the digital evaluation to filter out the frequencies linked to the deformation of the test object. The phase is obtained from the complex matrix formed by the inverse Fourier transform, and the phase difference and deformation are calculated subsequently. Compared with conventional temporal phase shifting, the technique can measure vibration and transient deformation of the test object. Experimental set-ups and results are presented in this paper, and the experimental results show the effectiveness and advantages of the SCPS technique.
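The Fourier evaluation described above (transform, isolate the carrier sideband, inverse transform, take the angle) can be sketched in one dimension. The band-pass window around the carrier is an assumed choice for illustration; a real DSPI evaluation works on 2-D interferograms and subtracts the phases of two object states.

```python
import numpy as np

def extract_phase(intensity, carrier_freq):
    """Recover the wrapped phase of a 1-D fringe signal by
    spatial-carrier Fourier filtering: FFT, keep only the positive
    sideband near the carrier, inverse FFT, remove the carrier ramp."""
    n = len(intensity)
    spectrum = np.fft.fft(intensity)
    freqs = np.fft.fftfreq(n)          # cycles per sample
    # band-pass window around the positive carrier frequency
    keep = (freqs > carrier_freq / 2) & (freqs < 3 * carrier_freq / 2)
    analytic = np.fft.ifft(np.where(keep, spectrum, 0))
    x = np.arange(n)
    # subtract the carrier term 2*pi*f0*x before taking the angle
    return np.angle(analytic * np.exp(-2j * np.pi * carrier_freq * x))

# Synthetic fringe with carrier f0 and a constant test phase of 0.5 rad.
n = 512
f0 = 64.0 / n                          # integer cycles: no leakage
x = np.arange(n)
fringe = 1.0 + np.cos(2 * np.pi * f0 * x + 0.5)
phase = extract_phase(fringe, f0)
```

The deformation phase map is then the (wrapped) difference between the phases extracted from the interferograms of the two object states.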

  15. A novel image inpainting technique based on median diffusion

    Indian Academy of Sciences (India)

    Rajkumar L Biradar; Vinayadatt V Kohir

    2013-08-01

    Image inpainting is the technique of filling in missing regions and removing unwanted objects from an image by diffusing pixel information from the neighbouring pixels. Image inpainting techniques have long been in use for various applications such as removal of scratches, restoration of damaged/missing portions, and removal of objects from images. In this study, we present a simple, yet unexplored, (digital) image inpainting technique using the median filter, one of the most popular nonlinear (order statistics) filters. The median is the maximum likelihood estimate of location for the Laplacian distribution. Hence, the proposed algorithm diffuses the median value of pixels from the exterior area into the inner area to be inpainted. The median filter preserves edges, an important property for inpainting edges. The technique is stable. Experimental results show remarkable improvements, and the method works for homogeneous as well as heterogeneous backgrounds. PSNR (quantitative assessment) is used to compare inpainting results.
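A minimal sketch of median diffusion (not the authors' exact algorithm): repeatedly assign each unknown pixel the median of its already-known 8-neighbours, so values diffuse inward from the boundary of the masked region.

```python
from statistics import median

def median_inpaint(img, mask):
    """Fill pixels where mask is True with the median of their known
    8-neighbours, sweeping until the masked region is filled.
    img and mask are lists of lists (grayscale values / booleans)."""
    h, w = len(img), len(img[0])
    img = [row[:] for row in img]      # work on copies
    mask = [row[:] for row in mask]
    while any(any(row) for row in mask):
        progress = False
        for i in range(h):
            for j in range(w):
                if not mask[i][j]:
                    continue
                known = [img[a][b]
                         for a in range(max(0, i - 1), min(h, i + 2))
                         for b in range(max(0, j - 1), min(w, j + 2))
                         if not mask[a][b]]
                if known:                      # at least one known neighbour
                    img[i][j] = median(known)  # diffuse the median inward
                    mask[i][j] = False
                    progress = True
        if not progress:                       # fully masked image: give up
            break
    return img

# Inpaint the centre pixel of a 3x3 patch from its eight neighbours.
patch = [[1, 2, 3], [4, 0, 6], [7, 8, 9]]
hole = [[False] * 3, [False, True, False], [False] * 3]
filled = median_inpaint(patch, hole)
```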

  16. Kernel-Based Discriminant Techniques for Educational Placement

    Science.gov (United States)

    Lin, Miao-hsiang; Huang, Su-yun; Chang, Yuan-chin

    2004-01-01

    This article considers the problem of educational placement. Several discriminant techniques are applied to a data set from a survey project of science ability. A profile vector for each student consists of five science-educational indicators. The students are intended to be placed into three reference groups: advanced, regular, and remedial.…

  17. MRA Based Efficient Database Storing and Fast Querying Technique

    Directory of Open Access Journals (Sweden)

    Mitko Kostov

    2017-02-01

    Full Text Available In this paper we consider a specific way of organizing 1D signal or 2D image databases such that more efficient storage and faster querying are achieved. A multiresolution data-processing technique is used in order to save only the most significant processed data.
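A one-level Haar multiresolution scheme illustrates the storing idea: keep the coarse approximation plus only the significant detail coefficients. This is a simplified sketch; the paper does not specify its actual transform or database layout.

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar transform: pairwise
    averages (coarse approximation) and differences (details)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def compress(signal, threshold):
    """Store the coarse level plus only the significant details."""
    approx, detail = haar_step(signal)
    kept = [(i, d) for i, d in enumerate(detail) if abs(d) > threshold]
    return approx, kept

def reconstruct(approx, kept, n):
    """Invert haar_step, treating dropped details as zero."""
    detail = [0.0] * (n // 2)
    for i, d in kept:
        detail[i] = d
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

# A signal whose only nonzero detail survives the threshold is
# reconstructed exactly from half the storage plus one coefficient.
s = [4.0, 6.0, 8.0, 8.0]
approx, kept = compress(s, 0.1)
```

Queries that only need coarse content can run on `approx` alone, which is the efficiency argument behind multiresolution storage.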

  18. Evaluation of an Amino Acid−Based Formula in Infants Not Responding to Extensively Hydrolyzed Protein Formula

    Science.gov (United States)

    Vanderhoof, Jon; Moore, Nancy; de Boissieu, Delphine

    2016-01-01

    ABSTRACT Nearly 2% to 3% of infants and children younger than 3 years have confirmed cow's milk protein allergy with multiple clinical presentations including atopic dermatitis (AD), diarrhea, and vomiting/spitting up. Although most infants with cow's milk protein allergy experience clinical improvement with the use of an extensively hydrolyzed (EH) formula, highly sensitive infants may require an amino acid−based formula. In this observational, prospective study, 30 infants (1–12 months of age) with a history of weight loss and persistent allergic manifestations while on an EH formula were provided an amino acid−based formula for 12 weeks. Mean weight gain (z score change) improved +0.43 ± 0.28 (mean ± standard deviation) after the 12-week feeding period. Improvement was observed for many allergic symptoms including significant decreases in AD severity (P = 0.02). These results indicate the new amino acid–based infant formula supported healthy weight gain and improvement in allergic manifestations in infants not responding to EH formulas. PMID:27526059

  19. The Influence of an Extensive Inquiry-Based Field Experience on Pre-Service Elementary Student Teachers' Science Teaching Beliefs

    Science.gov (United States)

    Bhattacharyya, Sumita; Volk, Trudi; Lumpe, Andrew

    2009-06-01

    This study examined the effects of an extensive inquiry-based field experience on pre-service elementary teachers’ personal agency beliefs, a composite measure of context beliefs and capability beliefs related to teaching science. The research combined quantitative and qualitative approaches and included an experimental group that utilized the inquiry method and a control group that used traditional teaching methods. Pre- and post-test scores for the experimental and control groups were compared. The context beliefs of both groups showed no significant change as a result of the experience. However, the control group’s capability belief scores, lower than those of the experimental group to start with, declined significantly; the experimental group’s scores remained unchanged. Thus, the inquiry-based field experience led to an increase in personal agency beliefs. The qualitative data suggested a new hypothesis that there is a spiral relationship among teachers’ ability to establish communicative relationships with students, desire for personal growth and improvement, ability to implement multiple instructional strategies, and possession of substantive content knowledge. The study concludes that inquiry-based student teaching should be encouraged in the training of elementary school science teachers. However, the meaning and practice of the inquiry method should be clearly delineated to ensure its correct implementation in the classroom.

  20. Wavelet-Based Techniques for the Gamma-Ray Sky

    CERN Document Server

    McDermott, Samuel D; Cholis, Ilias; Lee, Samuel K

    2015-01-01

    We demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.

  1. Brain tumor segmentation based on a hybrid clustering technique

    Directory of Open Access Journals (Sweden)

    Eman Abdel-Maksoud

    2015-03-01

    This paper presents an efficient image segmentation approach using the K-means clustering technique integrated with the Fuzzy C-means algorithm. It is followed by thresholding and level-set segmentation stages to provide accurate brain tumor detection. The proposed technique benefits from the minimal computation time of K-means clustering and from the accuracy of Fuzzy C-means. The performance of the proposed image segmentation approach was evaluated by comparing it with some state-of-the-art segmentation algorithms in terms of accuracy, processing time, and performance. Accuracy was evaluated by comparing the results with the ground truth of each processed image. The experimental results clarify the effectiveness of our proposed approach in dealing with a higher number of segmentation problems via improving the segmentation quality and accuracy in minimal execution time.
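The K-means stage such a hybrid pipeline starts from can be sketched in its simplest form, 1-D Lloyd's algorithm over intensity values; the values and cluster count below are hypothetical, and the fuzzy refinement, thresholding, and level-set stages are not shown.

```python
import random

def kmeans_1d(values, k, iters=50, seed=0):
    """Plain 1-D k-means (Lloyd's algorithm) on intensity values.
    Repeatedly assign each value to its nearest centre, then move
    each centre to the mean of its assigned values."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)          # init from the data
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[idx].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two well-separated intensity groups collapse onto their means.
centers = kmeans_1d([0.9, 1.0, 1.1, 9.8, 10.0, 10.2], 2)
```

In the hybrid approach the resulting hard clusters seed the Fuzzy C-means memberships, trading a little of K-means' speed for FCM's accuracy at cluster boundaries.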

  2. Cleaning Verification Monitor Technique Based on Infrared Optical Methods

    Science.gov (United States)

    2004-10-01

    Real-time methods to provide both qualitative and quantitative assessments of surface cleanliness are needed. The infrared-based detection (VCPI) method offers a wide range of complementary capabilities in real-time surface cleanliness verification, and also has great potential to reduce or eliminate premature failures of surface coatings caused by a lack of surface cleanliness.

  3. Estimations of One Repetition Maximum and Isometric Peak Torque in Knee Extension Based on the Relationship Between Force and Velocity.

    Science.gov (United States)

    Sugiura, Yoshito; Hatanaka, Yasuhiko; Arai, Tomoaki; Sakurai, Hiroaki; Kanada, Yoshikiyo

    2016-04-01

    We aimed to investigate whether a linear regression formula based on the relationship between joint torque and angular velocity measured using a high-speed video camera and image measurement software is effective for estimating 1 repetition maximum (1RM) and isometric peak torque in knee extension. Subjects comprised 20 healthy men (mean ± SD; age, 27.4 ± 4.9 years; height, 170.3 ± 4.4 cm; and body weight, 66.1 ± 10.9 kg). The exercise load ranged from 40% to 150% 1RM. Peak angular velocity (PAV) and peak torque were used to estimate 1RM and isometric peak torque. To elucidate the relationship between force and velocity in knee extension, the relationship between the relative proportion of 1RM (% 1RM) and PAV was examined using simple regression analysis. The concordance rate between the estimated value and actual measurement of 1RM and isometric peak torque was examined using intraclass correlation coefficients (ICCs). Reliability of the regression line of PAV and % 1RM was 0.95. The concordance rate between the actual measurement and estimated value of 1RM resulted in an ICC(2,1) of 0.93 and that of isometric peak torque had an ICC(2,1) of 0.87 and 0.86 for 6 and 3 levels of load, respectively. Our method for estimating 1RM was effective for decreasing the measurement time and reducing patients' burden. Additionally, isometric peak torque can be estimated using 3 levels of load, as we obtained the same results as those reported previously. We plan to expand the range of subjects and examine the generalizability of our results.
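The estimation procedure (fit a line between %1RM and peak angular velocity, then invert it at a known submaximal load) can be sketched with hypothetical data; the coefficients and observations below are made up for illustration and are not the study's values.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def estimate_1rm(load_kg, pav, slope, intercept):
    """Given the peak angular velocity observed while lifting a known
    submaximal load, predict the relative intensity of that lift from
    the fitted %1RM-vs-PAV line, then scale the load up to 100 %1RM."""
    pct = slope * pav + intercept        # predicted %1RM for this lift
    return load_kg * 100.0 / pct

# Hypothetical calibration data: higher velocity at lower relative load.
pav_obs = [100.0, 200.0, 300.0, 400.0]   # deg/s (made up)
pct_obs = [120.0, 100.0, 80.0, 60.0]     # %1RM  (made up)
slope, intercept = fit_line(pav_obs, pct_obs)

# An 80 kg lift performed at 200 deg/s maps to 100 %1RM here,
# so the estimated 1RM is 80 kg.
one_rm = estimate_1rm(80.0, 200.0, slope, intercept)
```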

  4. Implementation of Agent Extension Based on NET-SNMP

    Institute of Scientific and Technical Information of China (English)

    黄岩涛; 阮军洲

    2014-01-01

    Aiming at the efficient management of different devices in a network, this paper briefly introduces NET-SNMP technology. NET-SNMP is an open-source software package; choosing NET-SNMP for developing a network management system makes the system easy to port and the agent easy to extend. The paper introduces the working principle of the NET-SNMP development tools and describes in detail the process of, and precautions for, extending an agent with the NET-SNMP development tools in a Linux environment. The agent software adopts a modular structure, so the supported management information base (MIB) modules can be extended as required to implement new applications. The conclusion is that NET-SNMP technology is an effective and feasible way to implement remote management functions.

  5. On-Line Hydrogen-Isotope Measurements of Organic Samples Using Elemental Chromium : An Extension for High Temperature Elemental-Analyzer Techniques

    NARCIS (Netherlands)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B.; Meijer, Harro A. J.; Brand, Willi A.; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples, with analysis by isotope-ratio mass spectrometry.

  6. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    Science.gov (United States)

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques and the remaining 22 as content-based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and interpersonal style.

  7. Time-Reversal Based Range Extension Technique for Ultra-Wideband (UWB) Sensors and Applications in Tactical Communications and Networking

    Science.gov (United States)

    2008-07-16

    N00014-07-1-0529. Prepared by Robert C. Qiu (Principal Investigator) together with contributing researchers at the Wireless Networking Systems Lab...connection between an FPGA and an ADC (or DAC) is still a bottleneck that restricts the system performance. These issues are covered in this report. The...Advanced DSP48E slices, featuring a 25-bit x 18-bit two's complement multiplier, optional pipeline stages for enhanced performance, and optional 48

  8. Time-Reversal Based Range Extension Technique for Ultra-wideband (UWB) Sensors and Applications in Tactical Communications and Networking

    Science.gov (United States)

    2009-04-16

    physical PCB routing limitation in the cable-connector adapting PCB. A high-speed ribbon cable/bus has been successfully used to connect the DAC...and the FPGA board in the transmitter. However, it is doubtful that this type of cable can meet our higher sampling rate requirement. One solution to...frequency • Local Oscillator: MITEQ LPLM15000 with 8 GHz - 15 GHz output frequency • PA: MITEQ AMF-6B-08001800-60-34P with 8 GHz - 18 GHz frequency

  9. Time-Reversal Based Range Extension Technique for Ultra-wideband (UWB) Sensors and Applications in Tactical Communications and Networking

    Science.gov (United States)

    2007-10-16

    1999. [93] H. Song, "Iterative Time Reversal in the Ocean," J. Acoust. Soc. Am., vol. 105, no. 6. [94] S. Kim, G. Edelmann, W. Kuperman, W. Hodgkiss, and...Channel Time-Reversal Acoustics," Appl. Phys. Lett., vol. 80, pp. 694-696, 2002. [97] G. Edelmann, T. Akal, W. Hodgkiss, S. Kim, K. W.A., and H. Song, "An

  10. Time-Reversal Based Range Extension Technique for Ultra-wideband (UWB) Sensors and Applications in Tactical Communications and Networking

    Science.gov (United States)

    2010-01-28

    which integrates two 3 Gsps 8-bit ADCs, clock circuitry, two banks of 1 GByte DDR2 memory each, and a Xilinx Virtex5 LX110T-3 FPGA, under the 3U format...XC5VLX110T-3 (fastest speed grade available). Two DDR2 memory banks are accessible by the FPGA in order to store data on the fly. An SHB connector is

  11. A Robust Non-Blind Watermarking Technique for Color Video Based on Combined DWT-DFT Transforms and SVD Technique

    Directory of Open Access Journals (Sweden)

    Nandeesh B

    2014-08-01

    Full Text Available The popularity of digital video has risen tremendously over the past decade, leading to malicious copying and distribution, so preserving ownership and tackling copyright infringement have become pressing issues. Digital video watermarking has emerged as a solution. This paper proposes a non-blind watermarking technique based on combined DWT-DFT transforms using the singular values of the SVD matrix in the YCbCr colour space. The technique uses the Fibonacci series for selecting frames, enhancing security while maintaining the quality of the original video. The watermark is encrypted by scrambling it with the Arnold transform. Geometric and non-geometric attacks on the watermarked video were performed to test the robustness of the proposed technique. The quality of the watermarked video is measured using PSNR, while NC measures the similarity between the extracted and original watermarks.
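The Arnold-transform scrambling step mentioned above can be sketched in a few lines; the 4x4 "watermark" and the iteration counts are illustrative, not taken from the paper:

```python
import numpy as np

def arnold(img, iterations=1):
    """Arnold cat-map scrambling of a square image: pixel (x, y) moves to
    ((x + y) mod N, (x + 2y) mod N). The map is periodic, so applying it
    enough times restores the original - which is what makes it usable
    for reversible watermark encryption."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "cat map needs a square image"
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out

w = np.arange(16).reshape(4, 4)          # toy 4x4 watermark
scrambled = arnold(w, 1)
restored = arnold(scrambled, 2)          # the map's period for N = 4 is 3
print(np.array_equal(restored, w))       # -> True
```

The number of iterations used before embedding acts as a key: without it, the extracted watermark stays scrambled.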

  12. Soft Computing Technique Based Enhancement of Transmission System Loadability Incorporating FACTS

    Directory of Open Access Journals (Sweden)

    T. Vara Prasad,

    2014-07-01

    Full Text Available Due to the growth of electricity demand and of transactions in power markets, existing power networks need to be enhanced in order to increase their loadability. The problem of determining the best locations for network reinforcement can be formulated as a mixed discrete-continuous nonlinear optimization problem (MDCP). The complexity of the problem makes extensive simulations necessary, and the computational requirement is high. This paper compares the effectiveness of Evolutionary Programming (EP) with a proposed ordinal optimization (OO) technique for solving the MDCP involving two types of flexible AC transmission system (FACTS) devices, namely the static var compensator (SVC) and the thyristor-controlled series compensator (TCSC), for system loadability enhancement. In this approach, crude models are proposed to cope with the complexity of the problem and speed up the simulations with high alignment confidence. The proposed algorithm is tested and validated on the IEEE 14-bus system and a 22-bus Indian system. Simulation results show that the proposed models permit the use of the OO-based approach for finding good-enough solutions with less computational effort.

  13. GPU-Based Techniques for Global Illumination Effects

    CERN Document Server

    Szirmay-Kalos, László; Sbert, Mateu

    2008-01-01

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. This book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make this book self-contained, the most important c

  14. Applying Knowledge-Based Techniques to Software Development.

    Science.gov (United States)

    Harandi, Mehdi T.

    1986-01-01

    Reviews overall structure and design principles of a knowledge-based programming support tool, the Knowledge-Based Programming Assistant, which is being developed at University of Illinois Urbana-Champaign. The system's major units (program design program coding, and intelligent debugging) and additional functions are described. (MBR)

  15. Satellite communication performance evaluation: Computational techniques based on moments

    Science.gov (United States)

    Omura, J. K.; Simon, M. K.

    1980-01-01

    Computational techniques that efficiently compute bit error probabilities when only moments of the various interference random variables are available are presented. The approach taken is a generalization of the well-known Gauss-Quadrature rules used for numerically evaluating single or multiple integrals. In what follows, basic algorithms are developed. Some of their properties and generalizations are shown and their many potential applications are described. Some typical interference scenarios for which the results are particularly applicable include: intentional jamming; adjacent and cochannel interference; radar pulses (RFI); multipath; and intersymbol interference. While the examples presented stress evaluation of bit error probabilities in uncoded digital communication systems, the moment techniques can also be applied to the evaluation of other parameters, such as computational cutoff rate under both normal and mismatched receiver cases in coded systems. Another important application is the determination of the probability distributions of the output of a discrete time dynamical system. This type of model occurs widely in control systems, queueing systems, and synchronization systems (e.g., discrete phase locked loops).
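The moment technique generalizes classical Gauss quadrature: from the moments of the interference one can build nodes and weights and then average any decision statistic over them. A minimal sketch (Golub-Welsch via a Cholesky factor of the Hankel moment matrix) follows; the uniform-distribution moments at the end are just a check case, not from the report:

```python
import numpy as np

def gauss_from_moments(moments, n):
    """Build an n-point Gauss quadrature rule from the moments
    m_k = E[X^k], k = 0..2n, of the interference distribution
    (Golub-Welsch via a Cholesky factor of the Hankel moment matrix)."""
    m = np.asarray(moments, dtype=float)
    H = np.array([[m[i + j] for j in range(n + 1)] for i in range(n + 1)])
    R = np.linalg.cholesky(H).T          # H = R^T R, R upper triangular
    alpha, beta = np.zeros(n), np.zeros(n)
    for j in range(n):
        alpha[j] = R[j, j + 1] / R[j, j]
        if j > 0:
            alpha[j] -= R[j - 1, j] / R[j - 1, j - 1]
            beta[j] = (R[j, j] / R[j - 1, j - 1]) ** 2
    # Jacobi matrix: its eigenvalues are the nodes; m_0 times the squared
    # first components of its eigenvectors are the weights.
    J = (np.diag(alpha) + np.diag(np.sqrt(beta[1:]), 1)
         + np.diag(np.sqrt(beta[1:]), -1))
    nodes, vecs = np.linalg.eigh(J)
    return nodes, m[0] * vecs[0, :] ** 2

# Check case: moments of the uniform distribution on [0, 1]
nodes, weights = gauss_from_moments([1, 1/2, 1/3, 1/4, 1/5], 2)
print(nodes, weights)    # two-point Gauss-Legendre rule shifted to [0, 1]
```

An n-point rule built this way integrates polynomials up to degree 2n-1 exactly, so an error probability E[Q(X)] is approximated as the weighted sum of Q over the nodes when only moments of X are known.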

  16. A novel fast full inversion based breast ultrasound elastography technique.

    Science.gov (United States)

    Karimi, Hirad; Fenster, Aaron; Samani, Abbas

    2013-04-07

    Cancer detection and classification have been the focus of many imaging and therapeutic research studies. Elastography is a non-invasive technique to visualize suspicious soft tissue areas where tissue stiffness is used as image contrast mechanism. In this study, a breast ultrasound elastography system including software and hardware is proposed. Unlike current elastography systems that image the tissue strain and present it as an approximation to relative tissue stiffness, this system is capable of imaging the breast absolute Young's modulus in fast fashion. To improve the quality of elastography images, a novel system consisting of two load cells has been attached to the ultrasound probe. The load cells measure the breast surface forces to be used for calculating the tissue stress distribution throughout the breast. To facilitate fast imaging, this stress calculation is conducted by an accelerated finite element method. Acquired tissue displacements and surface force data are used as input to the proposed Young's modulus reconstruction technique. Numerical and tissue mimicking phantom studies were conducted for validating the proposed system. These studies indicated that fast imaging of breast tissue absolute Young's modulus using the proposed ultrasound elastography system is feasible. The tissue mimicking phantom study indicated that the system is capable of providing reliable absolute Young's modulus values for both normal tissue and tumour as the maximum Young's modulus reconstruction error was less than 6%. This demonstrates that the proposed system has a good potential to be used for clinical breast cancer assessment.

  17. A novel fast full inversion based breast ultrasound elastography technique

    Science.gov (United States)

    Karimi, Hirad; Fenster, Aaron; Samani, Abbas

    2013-04-01

    Cancer detection and classification have been the focus of many imaging and therapeutic research studies. Elastography is a non-invasive technique to visualize suspicious soft tissue areas where tissue stiffness is used as image contrast mechanism. In this study, a breast ultrasound elastography system including software and hardware is proposed. Unlike current elastography systems that image the tissue strain and present it as an approximation to relative tissue stiffness, this system is capable of imaging the breast absolute Young’s modulus in fast fashion. To improve the quality of elastography images, a novel system consisting of two load cells has been attached to the ultrasound probe. The load cells measure the breast surface forces to be used for calculating the tissue stress distribution throughout the breast. To facilitate fast imaging, this stress calculation is conducted by an accelerated finite element method. Acquired tissue displacements and surface force data are used as input to the proposed Young’s modulus reconstruction technique. Numerical and tissue mimicking phantom studies were conducted for validating the proposed system. These studies indicated that fast imaging of breast tissue absolute Young’s modulus using the proposed ultrasound elastography system is feasible. The tissue mimicking phantom study indicated that the system is capable of providing reliable absolute Young’s modulus values for both normal tissue and tumour as the maximum Young’s modulus reconstruction error was less than 6%. This demonstrates that the proposed system has a good potential to be used for clinical breast cancer assessment.

  18. Study of systems and techniques for data base management

    Science.gov (United States)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, a low-cost computer system for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  19. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi

    2016-06-13

    Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widen their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.
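The record's idea of adding memory to a PCA chart can be sketched as follows. Everything here is a synthetic illustration: a made-up three-variable process, a univariate EWMA of the Q (squared prediction error) statistic standing in for the full MEWMA/MCUSUM machinery, and a fault injected along the direction PCA discards:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical in-control process: 3 correlated variables (mixing assumed)
A = np.array([[1.0, 0.8, 0.5],
              [0.0, 0.6, 0.3],
              [0.0, 0.0, 0.4]])
train = rng.standard_normal((500, 3)) @ A
mu, sd = train.mean(0), train.std(0)

# PCA model on standardized data; keep the 2 leading components
_, S, Vt = np.linalg.svd((train - mu) / sd, full_matrices=False)
P = Vt[:2].T                         # retained loadings (3 x 2)
resid_dir = Vt[2]                    # direction the model discards

def spe(X):
    """Q statistic: squared norm of what the PCA model cannot explain."""
    Z = (X - mu) / sd
    R = Z - (Z @ P) @ P.T
    return np.sum(R * R, axis=1)

def ewma(stat, lam=0.2):
    """Exponentially weighted moving average: the memory that makes
    small, persistent shifts stand out on the chart."""
    out, s = np.empty_like(stat), stat[0]
    for i, q in enumerate(stat):
        s = lam * q + (1 - lam) * s
        out[i] = s
    return out

in_control = ewma(spe(rng.standard_normal((100, 3)) @ A))
# Persistent shift of 2 units (standardized scale) along the discarded
# direction: modest per observation, obvious once the EWMA accumulates it.
faulty = ewma(spe(rng.standard_normal((100, 3)) @ A + 2.0 * resid_dir * sd))
print(in_control[-1], faulty[-1])
```

A control limit would normally be set from the in-control EWMA distribution; here the point is simply that the smoothed statistic separates the faulty stretch from the in-control one far more cleanly than individual Q values would.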

  20. Quartile Clustering: A quartile based technique for Generating Meaningful Clusters

    CERN Document Server

    Goswami, Saptarsi

    2012-01-01

    Clustering is one of the main tasks in exploratory data analysis and descriptive statistics, where the main objective is partitioning observations into groups. Clustering has a broad range of application in varied domains like climate, business, information retrieval, biology, and psychology, to name a few. A variety of methods and algorithms have been developed for clustering tasks in the last few decades. We observe that most of these algorithms define a cluster in terms of attribute values, density, distance, etc. However, these definitions fail to attach a clear meaning/semantics to the generated clusters. We argue that clusters having understandable and distinct semantics defined in terms of quartiles/halves are more appealing to business analysts than clusters defined by data boundaries or prototypes. On the same premise, we propose our new algorithm, named the quartile clustering technique. Through a series of experiments we establish the efficacy of this algorithm. We demonstrate that the quartile clusteri...
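The core idea of quartile-defined clusters can be sketched directly; this is a reading of the abstract's premise, not the paper's exact algorithm, and the toy data are invented:

```python
import numpy as np

def quartile_clusters(X):
    """Label each observation by the quartile its attributes fall in.
    A cluster is the tuple of per-attribute labels, so every cluster has
    a direct reading such as ('Q1', 'Q4'): low on attribute 0, high on
    attribute 1 - semantics an analyst can state without inspecting
    centroids or boundaries."""
    X = np.asarray(X, dtype=float)
    labels = []
    for col in X.T:
        q1, q2, q3 = np.percentile(col, [25, 50, 75])
        labels.append(np.digitize(col, [q1, q2, q3]))   # 0..3 -> Q1..Q4
    codes = np.stack(labels, axis=1)
    return [tuple(f"Q{c + 1}" for c in row) for row in codes]

data = [[1, 10], [2, 20], [3, 30], [100, 1]]
print(quartile_clusters(data))
```

The last observation lands in ('Q4', 'Q1'): top quartile on the first attribute, bottom quartile on the second, which is exactly the kind of self-describing cluster the abstract argues for.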

  1. Feature based sliding window technique for face recognition

    Science.gov (United States)

    Javed, Muhammad Younus; Mohsin, Syed Maajid; Anjum, Muhammad Almas

    2010-02-01

    Human beings are commonly identified by biometric schemes, which are concerned with identifying individuals by their unique physical characteristics. Passwords and personal identification numbers have been used to identify people for years. The disadvantages of these schemes are that they may be used by someone else or easily forgotten. In view of these problems, biometric approaches such as face recognition, fingerprint, iris/retina, and voice recognition have been developed, which provide a far better solution for identifying individuals. A number of methods have been developed for face recognition. This paper illustrates the employment of Gabor filters for extracting facial features by constructing a sliding window frame. Classification is done by assigning to the unknown image the class label of the class whose stored database image shares the most similar features. The proposed system gives a recognition rate of 96%, which is better than many similar techniques being used for face recognition.

  2. A fast Stokes inversion technique based on quadratic regression

    Science.gov (United States)

    Teng, Fei; Deng, Yuan-Yong

    2016-05-01

    Stokes inversion calculation is a key process in resolving polarization information on radiation from the Sun and obtaining the associated vector magnetic fields. Even in the cases of simple local thermodynamic equilibrium (LTE) and where the Milne-Eddington approximation is valid, the inversion problem may not be easy to solve. The initial values for the iterations are important in handling the case with multiple minima. In this paper, we develop a fast inversion technique without iterations. The time taken for computation is only 1/100 the time that the iterative algorithm takes. In addition, it can provide available initial values even in cases with lower spectral resolutions. This strategy is useful for a filter-type Stokes spectrograph, such as SDO/HMI and the developed two-dimensional real-time spectrograph (2DS).

  3. Study of hydrogen in coals, polymers, oxides, and muscle water by nuclear magnetic resonance; extension of solid-state high-resolution techniques. [Hydrogen molybdenum bronze

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, L.M.

    1981-10-01

    Nuclear magnetic resonance (NMR) spectroscopy has been an important analytical and physical research tool for several decades. One area of NMR which has undergone considerable development in recent years is high resolution NMR of solids. In particular, high resolution solid state ¹³C NMR spectra exhibiting features similar to those observed in liquids are currently achievable using sophisticated pulse techniques. The work described in this thesis develops analogous methods for high resolution ¹H NMR of rigid solids. Applications include characterization of hydrogen aromaticities in fossil fuels, and studies of hydrogen in oxides and bound water in muscle.

  4. Cost-optimal power system extension under flow-based market coupling and high shares of photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Hagspiel, Simeon; Jaegemann, Cosima; Lindenberger, Dietmar [Koeln Univ. (Germany). Inst. of Energy Economics; Cherevatskiy, Stanislav; Troester, Eckehard; Brown, Tom [Energynautics GmbH, Langen (Germany)

    2012-07-01

    Electricity market models, implemented as dynamic programming problems, have been applied widely to identify possible pathways towards a cost-optimal and low carbon electricity system. However, the joint optimization of generation and transmission remains challenging, mainly due to the fact that different characteristics and rules apply to commercial and physical exchanges of electricity in meshed networks. This paper presents a methodology that allows power generation and transmission infrastructures to be optimized jointly through an iterative approach based on power transfer distribution factors (PTDFs). As PTDFs are linear representations of the physical load flow equations, they can be implemented in a linear programming environment suitable for large scale problems such as the European power system. The algorithm iteratively updates PTDFs when grid infrastructures are modified due to cost-optimal extension and thus yields an optimal solution with a consistent representation of physical load flows. The method is demonstrated on a simplified three-node model where it is found to be stable and convergent. It is then scaled to the European level in order to find the optimal power system infrastructure development under the prescription of strongly decreasing CO₂ emissions in Europe until 2050 with a specific focus on photovoltaic (PV) power. (orig.)
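The PTDF mechanics can be illustrated on a three-node network like the paper's toy setting; the topology and line susceptances below are invented for the sketch, and a DC (linearized) power flow is assumed:

```python
import numpy as np

# Three-node DC network: lines (from, to) with susceptance b (assumed values)
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 5.0)]
n, slack = 3, 0

# Nodal susceptance matrix B and line flow matrix Bf
B = np.zeros((n, n))
Bf = np.zeros((len(lines), n))
for k, (i, j, b) in enumerate(lines):
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b
    Bf[k, i] = b; Bf[k, j] = -b

# PTDF: sensitivity of each line flow to a 1 MW injection at each bus,
# withdrawn at the slack. Remove the slack row/column before inverting.
keep = [i for i in range(n) if i != slack]
Binv = np.zeros((n, n))
Binv[np.ix_(keep, keep)] = np.linalg.inv(B[np.ix_(keep, keep)])
PTDF = Bf @ Binv

# 1 MW injected at bus 2, withdrawn at the slack bus 0
flows = PTDF @ np.array([0.0, 0.0, 1.0])
print(flows)    # per-line flows in the (from -> to) orientation
```

Because the PTDF matrix is linear in the injections, it slots directly into a linear program; when a line is added or reinforced, B and Bf change and the PTDF matrix is recomputed, which is exactly the iterative update the paper describes.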

  5. Wavelet packet transform-based robust video watermarking technique

    Indian Academy of Sciences (India)

    Gaurav Bhatnagar; Balasubrmanian Raman

    2012-06-01

    In this paper, a wavelet packet transform (WPT)-based robust video watermarking algorithm is proposed. A visible meaningful binary image is used as the watermark. First, sequent frames are extracted from the video clip. Then, WPT is applied on each frame and from each orientation one sub-band is selected based on block mean intensity value called robust sub-band. Watermark is embedded in the robust sub-bands based on the relationship between wavelet packet coefficient and its 8-neighbour $(D_8)$ coefficients considering the robustness and invisibility. Experimental results and comparison with existing algorithms show the robustness and the better performance of the proposed algorithm.

  6. Practical Network-Based Techniques for Mobile Positioning in UMTS

    Directory of Open Access Journals (Sweden)

    Borkowski Jakub

    2006-01-01

    Full Text Available This paper presents results of research on network-based positioning for UMTS (universal mobile telecommunication system. Two new applicable network-based cellular location methods are proposed and assessed by field measurements and simulations. The obtained results indicate that estimating the position at sufficient accuracy for most location-based services does not have to involve significant changes in the terminals and in the network infrastructure. In particular, regular UMTS terminals can be used in the presented PCM (pilot correlation method), while the other proposed method, ECID+RTT (cell identification + round trip time), requires only minor software updates in the network and user equipment. The performed field measurements of the PCM reveal that in an urban network, of users can be located with an accuracy of m. In turn, simulations of the ECID+RTT report accuracy of m– m for of the location estimates in an urban scenario.

  7. Estimating monthly temperature using point based interpolation techniques

    Science.gov (United States)

    Saaban, Azizan; Mah Hashim, Noridayu; Murat, Rusdi Indra Zuhdi

    2013-04-01

    This paper discusses the use of point-based interpolation to estimate the temperature at unallocated meteorology stations in Peninsular Malaysia, using data for the year 2010 collected from the Malaysian Meteorology Department. Two point-based interpolation methods are considered: Inverse Distance Weighted (IDW) and Radial Basis Function (RBF). The accuracy of the methods is evaluated using the Root Mean Square Error (RMSE). The results show that RBF with the thin plate spline model is suitable as a temperature estimator for the months of January and December, while RBF with the multiquadric model is suitable for estimating the temperature for the rest of the months.
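The IDW estimator used in the study can be sketched in a few lines; the station coordinates and temperatures below are made up for illustration:

```python
import numpy as np

def idw(known_xy, known_t, query_xy, power=2.0):
    """Inverse Distance Weighted estimate: each known station contributes
    in proportion to 1 / distance**power, so nearby stations dominate."""
    known_xy = np.asarray(known_xy, float)
    known_t = np.asarray(known_t, float)
    out = []
    for q in np.atleast_2d(np.asarray(query_xy, float)):
        d = np.linalg.norm(known_xy - q, axis=1)
        if np.any(d == 0):                  # query sits on a station
            out.append(known_t[d == 0][0])
            continue
        w = 1.0 / d ** power
        out.append(np.sum(w * known_t) / np.sum(w))
    return np.array(out)

stations = [[0, 0], [1, 0], [0, 1]]         # hypothetical station grid
temps = [30.0, 32.0, 28.0]                  # monthly mean temperatures
print(idw(stations, temps, [[0.5, 0.5]]))   # estimate between all three
```

Accuracy is then assessed exactly as in the paper: hold one station out, predict it from the rest, and accumulate the squared errors into an RMSE.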

  8. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600-750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis and a fluorometric discrimination technique for determining phytoplankton population was developed. For laboratory simulatively mixed samples, the samples mixed from 43 algal species (the algae of one division accounted for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the level of division were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for the samples mixed from 32 red tide algal species (the dominant species accounted for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the level of genus were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the level of division and genus, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the developing of an in situ algae fluorescence auto-analyzer for

  9. The Visual Memory-Based Memorization Techniques in Piano Education

    Science.gov (United States)

    Yucetoker, Izzet

    2016-01-01

    Problem Statement: Johann Sebastian Bach is one of the leading composers of the baroque period. In addition to his huge contributions in the artistic dimension, he also served greatly in the field of education. This study has been done for determining the impact of visual memory-based memorization practices in the piano education on the visual…

  10. Customer requirements based ERP customization using AHP technique

    NARCIS (Netherlands)

    Parthasarathy, S.; Daneva, Maya

    2014-01-01

    Purpose– Customization is a difficult task for many organizations implementing enterprise resource planning (ERP) systems. The purpose of this paper is to develop a new framework based on customers’ requirements to examine the ERP customization choices for the enterprise. The analytical hierarchy pr

  11. Orientation precision of TEM-based orientation mapping techniques

    Energy Technology Data Exchange (ETDEWEB)

    Morawiec, A., E-mail: nmmorawi@cyf-kr.edu.pl [Institute of Metallurgy and Materials Science, Polish Academy of Sciences, Kraków (Poland); Bouzy, E. [Laboratoire d' Etude des Microstructures et de Mécanique des Matériaux, Université de Metz, Metz (France); Paul, H. [Institute of Metallurgy and Materials Science, Polish Academy of Sciences, Kraków (Poland); Fundenberger, J.J. [Laboratoire d' Etude des Microstructures et de Mécanique des Matériaux, Université de Metz, Metz (France)

    2014-01-15

    Automatic orientation mapping is an important addition to standard capabilities of conventional transmission electron microscopy (TEM) as it facilitates investigation of crystalline materials. A number of different such mapping systems have been implemented. One of their crucial characteristics is the orientation resolution. The precision in determination of orientations and misorientations reached in practice by TEM-based automatic mapping systems is the main subject of the paper. The analysis is focused on two methods: first, using spot diffraction patterns and ‘template matching’, and second, using Kikuchi patterns and detection of reflections. In simple terms, for typical mapping conditions, their precisions in orientation determination with the confidence of 95% are, respectively, 1.1° and 0.3°. The results are illustrated by example maps of cellular structure in deformed Al, the case for which high orientation sensitivity matters. For more direct comparison, a novel approach to mapping is used: the same patterns are solved by each of the two methods. Proceeding from a classification of the mapping systems, the obtained results may serve as indicators of precisions of other TEM-based orientation mapping methods. The findings are of significance for selection of methods adequate to investigated materials. - Highlights: • Classification of the existing TEM-based orientation mapping systems. • Reliable data on orientation precision in TEM-based orientation maps. • Orientation precisions in spot and Kikuchi based maps estimated to be 1.1° and 0.3°. • New method of mapping by using spot and Kikuchi components of the same patterns.

  12. Kernel-based machine learning techniques for infrasound signal classification

    Science.gov (United States)

    Tuma, Matthias; Igel, Christian; Mialle, Pierrick

    2014-05-01

    Infrasound monitoring is one of four remote sensing technologies continuously employed by the CTBTO Preparatory Commission. The CTBTO's infrasound network is designed to monitor the Earth for potential evidence of atmospheric or shallow underground nuclear explosions. Upon completion, it will comprise 60 infrasound array stations distributed around the globe, of which 47 were certified in January 2014. Three stages can be identified in CTBTO infrasound data processing: automated processing at the level of single array stations, automated processing at the level of the overall global network, and interactive review by human analysts. At station level, the cross correlation-based PMCC algorithm is used for initial detection of coherent wavefronts. It produces estimates for trace velocity and azimuth of incoming wavefronts, as well as other descriptive features characterizing a signal. Detected arrivals are then categorized into potentially treaty-relevant versus noise-type signals by a rule-based expert system. This corresponds to a binary classification task at the level of station processing. In addition, incoming signals may be grouped according to their travel path in the atmosphere. The present work investigates automatic classification of infrasound arrivals by kernel-based pattern recognition methods. It aims to explore the potential of state-of-the-art machine learning methods vis-a-vis the current rule-based and task-tailored expert system. To this purpose, we first address the compilation of a representative, labeled reference benchmark dataset as a prerequisite for both classifier training and evaluation. Data representation is based on features extracted by the CTBTO's PMCC algorithm. As classifiers, we employ support vector machines (SVMs) in a supervised learning setting. Different SVM kernel functions are used and adapted through different hyperparameter optimization routines. The resulting performance is compared to several baseline classifiers. 

  13. An RSA-Based Leakage-Resilient Authenticated Key Exchange Protocol Secure against Replacement Attacks, and Its Extensions

    Science.gov (United States)

    Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki

    Secure channels can be realized by an authenticated key exchange (AKE) protocol that generates authenticated session keys between the parties involved. In [32], Shin et al. proposed a new kind of AKE (RSA-AKE) protocol whose goal is to provide high efficiency and security against leakage of stored secrets as much as possible. Let us consider more powerful attacks where an adversary completely controls the communications and the stored secrets (the latter is denoted by “replacement” attacks). In this paper, we first show that the RSA-AKE protocol [32] is no longer secure against such an adversary. The main contributions of this paper are as follows: (1) we propose an RSA-based leakage-resilient AKE (RSA-AKE2) protocol that is secure against active attacks as well as replacement attacks; (2) we prove that the RSA-AKE2 protocol is secure against replacement attacks based on number-theoretic results; (3) we show that it is provably secure in the random oracle model, by showing the reduction to the RSA one-wayness, under an extended model that covers active attacks and replacement attacks; (4) in terms of efficiency, the RSA-AKE2 protocol is comparable to [32] in the sense that the client needs to compute only one modular multiplication with pre-computation; and (5) we also discuss extensions of the RSA-AKE2 protocol for several security properties (i.e., synchronization of stored secrets, client privacy, and a solution to server compromise-impersonation attacks).

  14. Nonlinear ultrasonic measurements based on cross-correlation filtering techniques

    Science.gov (United States)

    Yee, Andrew; Stewart, Dylan; Bunget, Gheorghe; Kramer, Patrick; Farinholt, Kevin; Friedersdorf, Fritz; Pepi, Marc; Ghoshal, Anindya

    2017-02-01

    Cyclic loading of mechanical components promotes the formation of dislocation dipoles in metals, which can serve as precursors to crack nucleation and ultimately lead to failure. In the laboratory setting, an acoustic nonlinearity parameter has been assessed as an effective indicator for characterizing the progression of fatigue damage precursors. However, the need to use monochromatic waves of medium-to-high acoustic energy has presented a constraint, making it problematic for use in field applications. This paper presents a potential approach for field measurement of acoustic nonlinearity by using general purpose ultrasonic pulser-receivers. Nonlinear ultrasonic measurements during fatigue testing were analyzed using the contact and immersion pulse-through methods. A novel cross-correlation filtering technique was developed to extract the fundamental and higher harmonic waves from the signals. As in the case of classic harmonic generation, the nonlinearity parameters of the second and third harmonics indicate a strong correlation with fatigue cycles. Consideration was given to potential nonlinearities in the measurement system, and tests have confirmed that measured second harmonic signals exhibit a linear dependence on the input signal strength, further affirming the conclusion that this parameter relates to damage precursor formation from cyclic loading.
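The harmonic-extraction step can be sketched in its simplest form: correlate the received waveform against reference tones at the fundamental and second-harmonic frequencies. The sample rate, drive frequency, amplitudes, and noise level below are all assumed for illustration; the paper's actual filtering operates on pulsed broadband signals:

```python
import numpy as np

fs, f0, n = 1.0e7, 5.0e5, 4000        # sample rate and drive frequency (assumed)
t = np.arange(n) / fs

# Hypothetical received waveform: fundamental plus a weak second harmonic,
# as produced by material nonlinearity, buried in measurement noise.
a1, a2 = 1.0, 0.02
sig = a1 * np.sin(2 * np.pi * f0 * t) + a2 * np.sin(2 * np.pi * 2 * f0 * t)
sig = sig + 0.01 * np.random.default_rng(0).standard_normal(n)

def harmonic_amp(x, f, fs):
    """Amplitude at frequency f by correlating against a reference tone -
    the cross-correlation filtering idea reduced to a single frequency."""
    ref = np.exp(-2j * np.pi * f * np.arange(x.size) / fs)
    return 2.0 * np.abs(np.mean(x * ref))

A1 = harmonic_amp(sig, f0, fs)
A2 = harmonic_amp(sig, 2 * f0, fs)
beta_rel = A2 / A1**2                 # relative acoustic nonlinearity parameter
print(A1, A2, beta_rel)
```

The window length is chosen to hold an integer number of fundamental cycles so that the two reference tones are orthogonal and spectral leakage does not bias A2; tracking beta_rel over fatigue cycles is what correlates with damage precursor formation.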

  15. Method of pectus excavatum measurement based on structured light technique

    Science.gov (United States)

    Glinkowski, Wojciech; Sitnik, Robert; Witkowski, Marcin; Kocoń, Hanna; Bolewicki, Pawel; Górecki, Andrzej

    2009-07-01

    We present an automatic method for assessment of pectus excavatum severity based on an optical 3-D markerless shape measurement. A four-directional measurement system based on a structured light projection method is built to capture the shape of the body surface of the patients. The system setup is described and typical measurement parameters are given. The automated data analysis path is explained. Its main steps are: normalization of trunk model orientation, cutting the model into slices, analysis of each slice shape, selecting the proper slice for the assessment of the patient's pectus excavatum, and calculating its shape parameter. We develop a new shape parameter (I3ds) that shows high correlation with the computed tomography (CT) Haller index widely used for assessment of pectus excavatum. Clinical results and the evaluation of the developed indexes are presented.
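
    The per-slice analysis can be sketched in a few lines. The paper's I3ds parameter is not defined in this record, so the illustration below (an assumption) computes a simple width-to-depth ratio of one cross-section, the same quantity the CT Haller index captures.

```python
import numpy as np

def haller_like_index(slice_points):
    """Width-to-depth ratio of one trunk cross-section, analogous to the
    CT Haller index (transverse width / antero-posterior depth).
    `slice_points` is an (N, 2) array of (x, y) surface points."""
    pts = np.asarray(slice_points, dtype=float)
    width = pts[:, 0].max() - pts[:, 0].min()
    depth = pts[:, 1].max() - pts[:, 1].min()
    return width / depth

# Elliptical chest cross-section: 28 cm wide, 20 cm deep -> index 1.4.
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
ellipse = np.column_stack([14.0 * np.cos(theta), 10.0 * np.sin(theta)])
idx = haller_like_index(ellipse)
```

    In the actual pipeline this would run on each slice of the measured trunk model, and the slice giving the extreme value would be selected for assessment.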

  17. Compressive spectrum sensing of radar pulses based on photonic techniques.

    Science.gov (United States)

    Guo, Qiang; Liang, Yunhua; Chen, Minghua; Chen, Hongwei; Xie, Shizhong

    2015-02-23

    We present a photonic-assisted compressive sampling (CS) system which can acquire about 10^6 radar pulses per second spanning from 500 MHz to 5 GHz with a 520-MHz analog-to-digital converter (ADC). A rectangular pulse, a linear frequency modulated (LFM) pulse, and a pulse stream are each reconstructed faithfully through this system with a sliding window-based recovery algorithm, demonstrating the feasibility of the proposed photonic-assisted CS system in spectral estimation for radar pulses.
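
    The recovery side of such a CS system can be sketched compactly. The paper's sliding window-based algorithm is not reproduced here; the sketch below substitutes plain Orthogonal Matching Pursuit on a random-projection model (all sizes and names are illustrative assumptions) to show how a sparse pulse spectrum is recovered from far fewer samples than Nyquist requires.

```python
import numpy as np

rng = np.random.default_rng(0)

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = A @ x."""
    residual, support = y.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # Least-squares fit on the selected support, then update residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Sparse "spectrum": 3 active bins out of 256, observed through 64 random
# projections (a stand-in for the photonic mixing front end).
n, m, k = 256, 64, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[10, 97, 201]] = [1.0, -0.7, 0.5]
y = A @ x_true
x_hat = omp(A, y, k)
```

    With 64 measurements of a 3-sparse length-256 vector, OMP recovers the support and amplitudes essentially exactly, which is the property the hardware front end exploits.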

  18. A Scenario-Based Technique for Developing SOA Technical Governance

    Science.gov (United States)

    2009-06-01

    Figure 5: CBDi-SAE SOA Governance Framework; Figure 6: IBM SOA Governance and Management Method; Figure 7: ITIL Core Framework. Governance frameworks can come from other organizations, even if they are not SOA specific, such as the Information Technology Infrastructure Library (ITIL) [10]. These frameworks can be based on a standard or a widely recommended approach such as ITIL, custom-built for the organization, or a hybrid of all of the preceding.

  19. Grid Based Techniques for Visualization in the Geosciences

    Science.gov (United States)

    Bollig, E. F.; Sowell, B.; Lu, Z.; Erlebacher, G.; Yuen, D. A.

    2005-12-01

    As experiments and simulations in the geosciences grow larger and more complex, it has become increasingly important to develop methods of processing and sharing data in a distributed computing environment. In recent years, the scientific community has shown growing interest in exploiting the powerful assets of Grid computing to this end, but the complexity of the Grid has prevented many scientists from converting their applications and embracing this possibility. We are investigating methods for development and deployment of data extraction and visualization services across the NaradaBrokering [1] Grid infrastructure. With the help of gSOAP [2], we have developed a series of C/C++ services for wavelet transforms, earthquake clustering, and basic 3D visualization. We will demonstrate the deployment and collaboration of these services across a network of NaradaBrokering nodes, concentrating on the challenges faced in inter-service communication, service/client division, and particularly web service visualization. Renderings in a distributed environment can be handled in three ways: 1) the data extraction service computes and renders everything locally and sends results to the client as a bitmap image, 2) the data extraction service sends results to a separate visualization service for rendering, which in turn sends results to a client as a bitmap image, and 3) the client itself renders images locally. The first two options allow for large visualizations in a distributed and collaborative environment, but limit interactivity of the client. To address this problem we are investigating the advantages of the JOGL OpenGL library [3] to perform renderings on the client side using the client's hardware for increased performance. We will present benchmarking results to ascertain the relative advantage of the three aforementioned techniques as a function of data size and visualization task.
    [1] The NaradaBrokering Project, http://www.naradabrokering.org
    [2] gSOAP: C/C++ Web

  20. Review of Physical Based Monitoring Techniques for Condition Assessment of Corrosion in Reinforced Concrete

    Directory of Open Access Journals (Sweden)

    Ying Lei

    2013-01-01

    Monitoring the condition of steel corrosion in reinforced concrete (RC) is imperative for structural durability. In the past decades, many electrochemistry-based techniques have been developed for monitoring steel corrosion. However, these electrochemical techniques can only assess steel corrosion through monitoring the surrounding concrete medium. As alternative tools, some physical techniques have been proposed for accurate condition assessment of steel corrosion through direct measurements on embedded steel. In this paper, some physical monitoring techniques developed in the last decade for condition assessment of steel corrosion in RC are reviewed. In particular, techniques based on ultrasonic guided waves (UGW) and fiber Bragg gratings (FBG) are emphasized. The UGW-based technique is reviewed first, including important characteristics of UGW, the corrosion monitoring mechanism and feature extraction, monitoring of corrosion-induced debonding, pitting, and interface roughness, and influencing factors. Subsequently, FBG for monitoring corrosion in RC is reviewed, and the studies and application of the FBG-based corrosion sensor developed by the authors are presented. Other physical techniques for monitoring corrosion in RC are also introduced. Finally, the challenges and future trends in the development of physical monitoring techniques for condition assessment of steel corrosion in RC are put forward.

  1. An immunity-based technique to detect network intrusions

    Institute of Scientific and Technical Information of China (English)

    PAN Feng; DING Yun-fei; WANG Wei-nong

    2005-01-01

    This paper briefly reviews previous work on the negative selection algorithm and its shortcomings. In view of the real problem to be solved, the authors put forward two assumptions, based on which a new immune algorithm, the multi-level negative selection algorithm, is developed. In essence, compared with Forrest's negative selection algorithm, it improves detector generation efficiency. This algorithm integrates the clonal selection process into the negative selection process for the first time. After careful analysis, the algorithm was applied to network intrusion detection and achieved good results.
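
    For readers unfamiliar with the baseline, Forrest-style negative selection can be sketched as follows (a minimal illustration of the single-level algorithm, not the authors' multi-level variant): candidate detectors are generated at random and kept only if they fail to match every "self" string under an r-contiguous-bits rule.

```python
import random

random.seed(1)

def r_contiguous_match(a, b, r):
    """A detector matches a string if they agree on r contiguous positions."""
    run = 0
    for x, y in zip(a, b):
        run = run + 1 if x == y else 0
        if run >= r:
            return True
    return False

def generate_detectors(self_set, n_detectors, length, r):
    """Negative selection: keep random candidates that match no self string."""
    detectors = []
    while len(detectors) < n_detectors:
        cand = ''.join(random.choice('01') for _ in range(length))
        if not any(r_contiguous_match(cand, s, r) for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(sample, detectors, r):
    """A sample is flagged if any detector matches it."""
    return any(r_contiguous_match(sample, d, r) for d in detectors)

self_set = {'0000000000', '0000011111'}
detectors = generate_detectors(self_set, 20, length=10, r=6)
```

    By construction, no detector can ever fire on a self string, so false positives on the self set are impossible; coverage of the non-self space grows with the number of detectors.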

  2. A survey of GPU-based medical image computing techniques.

    Science.gov (United States)

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical practice, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and existing traditional applications in three areas of medical image processing, namely segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine.

  3. Image restoration techniques based on fuzzy neural networks

    Institute of Scientific and Technical Information of China (English)

    刘普寅; 李洪兴

    2002-01-01

    By establishing suitable partitions of the input and output spaces, a novel fuzzy neural network (FNN), called a selection-type FNN, is developed. Such a system is a multilayer feedforward neural network, which can be a universal approximator under the maximum norm. Based on a family of fuzzy inference rules with real-world meaning, a simple and useful inference-type FNN is constructed. As a result, the fusion of the selection-type FNN and the inference-type FNN yields a novel filter, the FNN filter. It is simple in structure, and it is convenient to design the learning algorithm for its structural parameters. Further, the FNN filter can efficiently suppress impulse noise superimposed on an image while simultaneously preserving fine image structure. Simulated examples confirm the advantages of the FNN filter over other filters, such as the median filter and the adaptive weighted fuzzy mean (AWFM) filter, in suppressing noise and preserving image structure.
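
    As a point of reference, the median filter that the FNN filter is compared against can be sketched directly (this is the classical impulse-noise baseline, not the FNN filter itself):

```python
import numpy as np

def median_filter(img, k=3):
    """k x k median filter, the classical baseline for impulse
    (salt-and-pepper) noise; borders are handled by reflection."""
    pad = k // 2
    padded = np.pad(img, pad, mode='reflect')
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

# Salt-and-pepper corruption of a smooth ramp image.
rng = np.random.default_rng(0)
img = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
noisy = img.copy()
flips = rng.random(img.shape) < 0.05
noisy[flips] = rng.integers(0, 2, img.shape)[flips]  # 0 or 1 impulses
den = median_filter(noisy)
```

    Isolated impulses are replaced by their neighborhood median; the FNN filter's claim is that it does this while blurring fine structure less than the median does.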

  4. Choice probability for apple juice based on novel processing techniques

    DEFF Research Database (Denmark)

    Olsen, Nina Veflen; Menichelli, E.; Grunert, Klaus G.

    2011-01-01

    Within the core of academic consumer research, means-end chain (MEC) analysis has been almost ignored. One plausible explanation for this lack of interest may be that studies linking MEC data to choice have been few. In this study, we investigate how values and consequences generated from a previous MEC study can be linked to likelihood of choice. Hypotheses about European consumers' likelihood of choice for novel processed juice are stated and tested in a rating-based conjoint study in Norway, Denmark, Hungary and Slovakia. In the study, consumers' probability of choice for high pressure processed (HPP) juice and pulsed electric field (PEF) juice is compared with their probability of choice for pasteurized juice and freshly produced apple juice, and consumer choices are explained by values and consequences generated from the MEC study. The study supports, at least partly, that means-end chain structures...

  5. SPEECH/MUSIC CLASSIFICATION USING WAVELET BASED FEATURE EXTRACTION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Thiruvengatanadhan Ramalingam

    2014-01-01

    Audio classification is the fundamental step in coping with the rapid growth in audio data volume. Due to the increasing size of multimedia sources, speech and music classification is one of the most important issues in multimedia information retrieval. In this work a speech/music discrimination system is developed which utilizes the Discrete Wavelet Transform (DWT) as the acoustic feature. Multi-resolution analysis is a significant statistical way to extract features from the input signal, and in this study a method is deployed to model the extracted wavelet features. Support Vector Machines (SVM), which are based on the principle of structural risk minimization, are applied to classify audio into its classes, namely speech and music, by learning from training data. The proposed method then extends the application of Gaussian Mixture Models (GMM) to estimate the probability density function using maximum likelihood decision methods. The system shows significant results, with an accuracy of 94.5%.
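
    The wavelet front end can be sketched without any signal-processing library. The sketch below (an assumption, simplified from the paper's DWT/SVM/GMM pipeline) computes Haar subband log-energies as features and substitutes a nearest-centroid rule for the SVM, discriminating a tone-like signal from a noise-like one:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

def subband_energies(x, levels=3):
    """Feature vector: log energy of each detail band plus the final
    approximation, a common wavelet front end for speech/music cues."""
    feats, approx = [], x
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(np.log(np.sum(detail ** 2) + 1e-12))
    feats.append(np.log(np.sum(approx ** 2) + 1e-12))
    return np.array(feats)

# Toy discrimination: a pure tone ("music-like") vs white noise
# ("speech-like" stand-in); a nearest-centroid rule replaces the SVM.
rng = np.random.default_rng(0)
fs, n = 8000, 1024
t = np.arange(n) / fs
tones = [subband_energies(np.sin(2 * np.pi * 440 * t + p)) for p in (0.0, 1.0)]
noises = [subband_energies(rng.standard_normal(n)) for _ in range(2)]
c_tone, c_noise = np.mean(tones, axis=0), np.mean(noises, axis=0)

def classify(x):
    f = subband_energies(x)
    return 'tone' if np.linalg.norm(f - c_tone) < np.linalg.norm(f - c_noise) else 'noise'
```

    A tone concentrates energy in the approximation band while noise spreads it evenly over the detail bands, which is exactly the multi-resolution cue the paper's classifier learns.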

  6. Techniques of Image Processing Based on Artificial Neural Networks

    Institute of Scientific and Technical Information of China (English)

    LI Wei-qing; WANG Qun; WANG Cheng-biao

    2006-01-01

    This paper presents an online quality inspection system based on artificial neural networks. Chromatism classification and edge detection are two difficult problems in glass steel surface quality inspection. Two artificial neural networks were built to solve these two problems. The first solved chromatism classification: the hue, saturation, and appearance probability of the three colors whose probabilities were highest in the color histogram were selected as input parameters, and the number of output nodes can be adjusted as requirements change. The second solved edge detection: in this neural network, edge detection of a gray-scale image can be performed with networks trained on a binary image. This avoids the difficulty that the number of training samples needed would be too large if gray-scale images were used directly as training samples. The system can be applied not only to glass steel fault inspection but also to online quality inspection and classification of other products.

  7. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems … in some detail. Finally we address the problem of where to put the dot and the lines: when all information is 'on the table', how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose … was to develop a software tool for maintenance supervision of components in a nuclear power plant.

  8. Constellation choosing based on multi-dimensional sphere packing technique

    Science.gov (United States)

    Jinghe, Li; Guijun, Hu; Kashero, Enock; Zhaoxi, Li

    2016-09-01

    In this paper we address the problem of selecting sphere-packing lattice points for use as constellation points in high-dimensional modulation. We propose a new type of point selection method based on threshold theory. Theoretically, this method improves the transmission performance of high-dimensional signal modulation systems. We find that the BER of a 4D modulation signal using the threshold-based point selection method is reduced; compared with random and distant point selection methods at a BER of 10^-3, it obtains an SNR reduction of about 2 dB. At a BER of 10^-3, an 8D modulation signal with points selected using the threshold selection method obtains an SNR reduction of about 3 dB, and a 16D modulation signal obtains an SNR reduction of about 3.5 dB.
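
    The threshold idea can be illustrated with a greedy sketch (an assumption; the paper's exact selection rule is not reproduced): visit candidate lattice points in order of increasing energy and keep a point only if it clears a minimum-distance threshold to every point already kept.

```python
import numpy as np
from itertools import product

def select_constellation(dim, radius, d_min, size):
    """Greedy threshold selection: walk integer-lattice points inside a
    sphere, keeping a point only if it is at least d_min from all points
    already kept, until `size` constellation points are chosen."""
    coords = range(-radius, radius + 1)
    candidates = [np.array(p, dtype=float) for p in product(coords, repeat=dim)
                  if np.dot(p, p) <= radius ** 2]
    # Visit low-energy points first so the constellation stays compact.
    candidates.sort(key=lambda p: np.dot(p, p))
    chosen = []
    for c in candidates:
        if all(np.linalg.norm(c - q) >= d_min for q in chosen):
            chosen.append(c)
        if len(chosen) == size:
            break
    return np.array(chosen)

# 16 points of a 4-D constellation with guaranteed minimum distance sqrt(2).
points = select_constellation(dim=4, radius=2, d_min=np.sqrt(2), size=16)
```

    The distance threshold directly lower-bounds the constellation's minimum Euclidean distance, which is what drives the BER gains reported above.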

  9. Quadrant Based WSN Routing Technique By Shifting Of Origin

    Directory of Open Access Journals (Sweden)

    Nandan Banerji

    2013-04-01

    A sensor is a miniaturized, low-powered (basically battery-powered), limited-storage device which can sense natural phenomena and convert them into electrical energy, or vice versa, using a transduction process. A Wireless Sensor Network (WSN) is a wireless network built from such sensors, which communicate with each other over a wireless medium. They can be deployed in environments inaccessible or difficult for humans to reach. There are many applications in the automated world, such as robotics, avionics, oceanographic study, space, and satellites. Routing a packet from a source node to a destination should be efficient in terms of energy, communication overhead, and the number of intermediate hops. The proposed scheme helps route packets through fewer intermediate nodes, as neighbors are selected based on their quadrant position.
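
    The quadrant-based next-hop rule can be sketched as follows (an illustrative assumption of how the origin shift and quadrant test fit together, not the authors' full protocol): shift the origin to the current node, find the destination's quadrant, and forward to the neighbor in that quadrant closest to the destination.

```python
import math

def quadrant(point, origin):
    """Quadrant (1-4) of `point` after shifting the origin to `origin`."""
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    if dx >= 0 and dy >= 0:
        return 1
    if dx < 0 and dy >= 0:
        return 2
    if dx < 0:
        return 3
    return 4

def next_hop(current, destination, neighbors):
    """Greedy quadrant routing: among neighbors lying in the same quadrant
    as the destination (origin shifted to the current node), pick the one
    closest to the destination."""
    q = quadrant(destination, current)
    cands = [n for n in neighbors if quadrant(n, current) == q]
    if not cands:
        return None  # no neighbor toward the destination: routing hole
    return min(cands, key=lambda n: math.dist(n, destination))

cur, dst = (0.0, 0.0), (10.0, 7.0)
nbrs = [(1.0, 1.0), (-2.0, 3.0), (2.5, 0.5), (0.5, -1.5)]
hop = next_hop(cur, dst, nbrs)
```

    Restricting candidates to the destination's quadrant prunes roughly three quarters of the neighbors before the distance comparison, which is the source of the claimed reduction in intermediate hops.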

  10. Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.

    Science.gov (United States)

    Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José

    2016-04-01

    The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators (e.g. health appliances, routers) in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators (e.g. smartphones, tablets), the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), IHE profiles related with these applications and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value.

  11. A Simulated System for Traffic Signal Management Based on Integrating GIS & WSN Techniques

    Directory of Open Access Journals (Sweden)

    Ahmed S. Elmotelb

    2016-01-01

    Traffic signal management systems (TSMS) are traffic systems based on cameras, infrared sensors and satellite systems. Such systems have lacked the ability to collect and support data in real time. This paper proposes a solution to the traffic signal management problem using a combined technique that merges GIS information with WSN-based techniques. This combination provides appropriate techniques and tools that enhance the capabilities of traffic jam prevention, early detection, efficient surveillance, efficient spread control, and fast termination of possible hazards. Consequently, this work proposes a new methodology, through merging WSN and GIS techniques, to produce valuable information for traffic signal management systems.

  12. Extension and Utilization of Seedling Breeding Technique for Banana

    Institute of Scientific and Technical Information of China (English)

    王永壮; 符运柳; 刘以道; 覃和业

    2011-01-01

    Based on many years of practical experience in banana seedling production, this paper introduces cultivation techniques for healthy, high-quality first- and second-grade banana tissue-culture seedlings, in order to promote the sound development of the banana industry.

  13. PDE-based Non-Linear Diffusion Techniques for Denoising Scientific and Industrial Images: An Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Weeratunga, S K; Kamath, C

    2001-12-20

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, they focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. They complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. They explore the effects of various parameters, such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. They also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. The empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
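
    The Perona-Malik scheme the study builds on can be sketched with an explicit finite-difference update (a minimal isotropic-diffusivity sketch; the parameter values are illustrative):

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    """Explicit-scheme Perona-Malik diffusion with the exponential
    diffusivity g(s) = exp(-(s/kappa)^2): edges (large gradients)
    diffuse slowly, while flat noisy regions are smoothed."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Differences to the four neighbours (periodic border via np.roll).
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Noisy step edge: diffusion should cut the noise but keep the step.
rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[:, 16:] = 1.0
noisy = img + 0.05 * rng.standard_normal(img.shape)
den = perona_malik(noisy)
```

    Because the step of height 1 gives g(1) close to 0 while the small noise differences give g close to 1, the edge survives essentially untouched while the flat regions are smoothed, which is exactly the property that distinguishes this scheme from a linear filter.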

  14. Damage identification in beams by a response surface based technique

    Directory of Open Access Journals (Sweden)

    Teidj S.

    2014-01-01

    In this work, identification of damage in uniform homogeneous metallic beams was considered through the propagation of non-dispersive elastic torsional waves. The proposed damage detection procedure consisted of the following sequence. Given a localized torque excitation having the form of a short half-sine pulse, the first step was to calculate the transient solution of the resulting torsional wave. This torque could be generated in practice by means of asymmetric laser irradiation of the beam surface. Then, a localized defect, assumed to be characterized by an abrupt reduction of beam section area with a given height and extent, was placed at a known location of the beam. Next, the response in terms of transverse section rotation rate was obtained at a point situated after the defect, where the sensor was positioned. The sensor could in practice use the concept of laser vibrometry. A parametric study was then conducted by using a full factorial design-of-experiments table and numerical simulations based on a finite-difference characteristic scheme. This enabled the derivation of a response surface model that was shown to represent adequately the response of the system in terms of the following factors: defect extent and severity. The final step was solving the inverse problem in order to identify the defect characteristics from the measurements.
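
    The response-surface step can be sketched with ordinary least squares over the two factors (the response function below is made up for illustration; in the paper the responses come from the finite-difference wave simulation):

```python
import numpy as np

def fit_quadratic_rs(X, y):
    """Least-squares quadratic response surface in two factors:
    y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, x1, x2):
    return coef @ np.array([1.0, x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

# Full-factorial design over defect extent (x1) and severity (x2), with a
# made-up "response" standing in for the simulated rotation rate.
levels = np.linspace(0.0, 1.0, 5)
X = np.array([(a, b) for a in levels for b in levels])
y_true = lambda a, b: 0.3 + 1.2 * a + 0.8 * b - 0.5 * a * b + 0.2 * a ** 2
y = np.array([y_true(a, b) for a, b in X])
coef = fit_quadratic_rs(X, y)
```

    Once fitted, the surrogate replaces the expensive simulation inside the inverse-problem search: one minimizes the mismatch between `predict` and the measured response over (extent, severity).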

  15. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    Science.gov (United States)

    2013-03-01

    3.2.6 Ten-synapse Circuit; 3.3 Case Study: Synapse-based ALU Design; 3.3.1 1-bit Adder; 3.3.4 4-bit ALU as Adder, Subtractor, and Decade Counter; 3.3.5 8-bit ALU as Adder, Subtractor, and Decade Counter. Figures include the ten-synapse circuit after training, the schematic of the 1-bit adder-subtractor block, and the schematic of the 4-bit binary counter.

  16. Measurement of particle size based on digital imaging technique

    Institute of Scientific and Technical Information of China (English)

    CHEN Hong; TANG Hong-wu; LIU Yun; WANG Hao; LIU Gui-ping

    2013-01-01

    To improve the analysis methods for measuring sediment particles with a wide size distribution and irregular shapes, a sediment particle image measurement and analysis system is proposed, together with an algorithm for extracting the optimal threshold based on the peak values of the gray histogram. By labeling and recording the pixels of the sediment particles, the algorithm can effectively separate the sediment particle images from the background, using equivalent pixel circles with the same diameters to represent the sediment particles. Compared with a laser analyzer for the case of blue plastic sands, the measurement results of the system are reasonably similar. The errors are mainly due to the small size of the particles and the limitations of the apparatus. The measurement accuracy can be improved by increasing the charge-coupled device (CCD) camera resolution. This analysis method of sediment particle images can provide technical support for rapid measurement of sediment particle size and its distribution.
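
    The histogram-peak thresholding idea can be sketched as follows (an assumption, not the paper's exact algorithm): locate the two dominant, well-separated peaks of the gray histogram and threshold at the valley between them.

```python
import numpy as np

def valley_threshold(gray, bins=64, min_sep=8):
    """Threshold at the valley between the two dominant, well-separated
    peaks of the gray-level histogram (a simple bimodal rule)."""
    hist, edges = np.histogram(gray, bins=bins, range=(0.0, 1.0))
    p1 = int(np.argmax(hist))                      # dominant peak
    far = [i for i in range(bins) if abs(i - p1) >= min_sep]
    p2 = max(far, key=lambda i: hist[i])           # second, separated peak
    lo, hi = min(p1, p2), max(p1, p2)
    valley = lo + int(np.argmin(hist[lo:hi + 1]))  # deepest gap between them
    return edges[valley]

# Synthetic image: dark background (~0.2) with a brighter particle (~0.8).
rng = np.random.default_rng(0)
img = 0.2 + 0.02 * rng.standard_normal((64, 64))
img[20:30, 20:30] = 0.8 + 0.02 * rng.standard_normal((10, 10))
img = np.clip(img, 0.0, 1.0)
t = valley_threshold(img)
mask = img > t        # binary particle mask
```

    The resulting mask is what the labeling step would then group into connected particles and convert to equivalent-diameter circles.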

  17. Systematic infrared image quality improvement using deep learning based techniques

    Science.gov (United States)

    Zhang, Huaizhong; Casaseca-de-la-Higuera, Pablo; Luo, Chunbo; Wang, Qi; Kitchin, Matthew; Parmley, Andrew; Monge-Alvarez, Jesus

    2016-10-01

    Infrared thermography (IRT, or thermal video) uses thermographic cameras to detect and record radiation in the long-wavelength infrared range of the electromagnetic spectrum. It allows sensing environments beyond the limits of visual perception, and thus has been widely used in many civilian and military applications. Even though current thermal cameras are able to provide high-resolution, high-bit-depth images, there are significant challenges to be addressed in specific applications, such as poor contrast and low target signature resolution. This paper addresses quality improvement in IRT images for object recognition. A systematic approach based on image bias correction and deep learning is proposed to increase target signature resolution and optimise the baseline quality of inputs for object recognition. Our main objective is to maximise the useful information on the object to be detected even when the number of pixels on target is adversely small. The experimental results show that our approach can significantly improve target resolution and thus helps make object recognition more efficient in automatic target detection/recognition systems (ATD/R).

  18. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics, etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  19. Whole genome sequencing-based characterization of extensively drug resistant (XDR) strains of Mycobacterium tuberculosis from Pakistan

    KAUST Repository

    Hasan, Zahra

    2015-03-01

    Objectives: The global increase in drug resistance in Mycobacterium tuberculosis (MTB) strains increases the focus on improved molecular diagnostics for MTB. Extensively drug-resistant (XDR) TB is caused by MTB strains resistant to rifampicin, isoniazid, fluoroquinolone and aminoglycoside antibiotics. Resistance to anti-tuberculous drugs has been associated with single nucleotide polymorphisms (SNPs) in particular MTB genes. However, there is regional variation between MTB lineages and the SNPs associated with resistance. Therefore, there is a need to identify common resistance-conferring SNPs so that effective molecular-based diagnostic tests for MTB can be developed. This study used whole genome sequencing (WGS) to characterize 37 XDR MTB isolates from Pakistan and investigated SNPs related to drug resistance. Methods: XDR-TB strains were selected. DNA was extracted from MTB strains, and samples underwent WGS with 76-base paired-end reads using Illumina HiSeq2000 technology. Raw sequence data were mapped uniquely to the H37Rv reference genome. The mappings allowed SNPs and small indels to be called using SAMtools/BCFtools. Results: This study found that in all XDR strains, rifampicin resistance was attributable to SNPs in the rpoB RDR region. Isoniazid resistance-associated mutations were primarily related to katG codon 315, followed by inhA S94A. Fluoroquinolone resistance was attributable to gyrA codons 91-94 in most strains, while one did not have SNPs in either gyrA or gyrB. Aminoglycoside resistance was mostly associated with SNPs in rrs, except in 6 strains. Ethambutol-resistant strains had embB codon 306 mutations, but many strains did not. The SNPs were compared with those present in commercial assays such as LiPA Hain MDRTBsl, and the sensitivity of the assays for these strains was evaluated. Conclusions: If common drug resistance associated with SNPs evaluated the concordance between phenotypic and

  20. Influence of an extensive inquiry-based field experience on pre-service elementary student teachers' science teaching beliefs

    Science.gov (United States)

    Bhattacharyya, Sumita

    This study examined the effects of an extensive inquiry-based field experience on pre-service elementary teachers' personal agency beliefs (PAB) about teaching science and their ability to effectively implement science instruction. The research combined quantitative and qualitative approaches within an ethnographic research tradition. A comparison was made between the pre- and post-test scores of two groups. The experimental group utilized the inquiry method; the control group did not. The experimental group had the stronger PAB pattern. The field experience caused no significant differences in the context beliefs of either group, but did in the capability beliefs. The number of college science courses taken by pre-service elementary teachers was positively related to their post capability beliefs (p = .0209). Qualitative information was collected through case studies which included observation of classrooms, assessment of lesson plans, and open-ended, extended interviews of the participants about their beliefs in their teaching abilities (efficacy beliefs) and in teaching environments (context beliefs). The interview data were analyzed by the analytic induction method to look for themes. The emerging themes were then grouped under several attributes. Following a review of the attributes a number of hypotheses were formulated. Each hypothesis was then tested across all the cases by the constant comparative method. The pattern of relationship that emerged from the hypothesis testing clearly suggests a new hypothesis that there is a spiral relationship among the ability to establish communicative relationships with students, the desire for personal growth and improvement, and greater content knowledge. The study concluded that inquiry-based student teaching should be encouraged to train school science teachers. But the meaning and the practice of the inquiry method should be clearly delineated to ensure its correct implementation in the classroom. A survey should be

  1. ECG processing techniques based on neural networks and bidirectional associative memories.

    Science.gov (United States)

    Maglaveras, N; Stamkopoulos, T; Pappas, C; Strintzis, M

    1998-01-01

    Two ECG processing techniques are described for the classification of QRS complexes, PVCs, and normal and ischaemic beats. The techniques use neural network (NN) technology in two ways. The first technique uses nonlinear ECG mapping for preprocessing and, subsequently, a shrinking algorithm based on NNs for classification. This technique is applied to the QRS/PVC problem with good results. The second technique is based on the Bidirectional Associative Memory (BAM) NN and is used to distinguish normal from ischaemic beats. In this technique the ECG beat is treated as a digitized image, which is then transformed into a bipolar vector suitable for input to the BAM. The results show that this method, if properly calibrated, can yield a fast and reliable ischaemic beat detection algorithm.

  2. Intrusion Detection Systems Based on Artificial Intelligence Techniques in Wireless Sensor Networks

    OpenAIRE

    2013-01-01

    Intrusion detection system (IDS) is regarded as the second line of defense against network anomalies and threats. IDS plays an important role in network security. There are many techniques used to design IDSs for specific scenarios and applications. Artificial intelligence techniques are widely used for threat detection. This paper presents a critical study of genetic algorithm, artificial immune, and artificial neural network (ANN) based IDS techniques used in wireless sensor netw...

  3. Wood lens design philosophy based on a binary additive manufacturing technique

    Science.gov (United States)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  4. The extension of school-based inter- and intraracial children's friendships: influences on psychosocial well-being.

    Science.gov (United States)

    Fletcher, Anne C; Rollins, Alethea; Nickerson, Pamela

    2004-07-01

    Children's (N=142) school friendships with same versus different race peers were coded for prevalence and the extent to which parents maintained social relationships with these friends (a proxy for extension of friendships beyond the school context). Membership in integrated versus nonintegrated social networks at school was unassociated with psychosocial well-being. Out-of-school extension of interracial friendships was linked with greater social competence among Black children. Black children whose friendships with both same and different race peers were extended beyond the school context reported higher levels of self-esteem.

  5. Determination of cis/trans phase of variations in the MC1R gene with allele-specific PCR and single base extension

    DEFF Research Database (Denmark)

    Mengel-From, Jonas; Børsting, Claus; Sanchez, Juan J;

    2008-01-01

    The MC1R gene encodes a protein with key regulatory functions in the melanin synthesis. A multiplex PCR and a multiplex single base extension protocol were established for genotyping six exonic MC1R variations highly penetrant for red hair (R), four exonic MC1R variations weakly penetrant for red...

  6. Typing of multiple single-nucleotide polymorphisms using ribonuclease cleavage of DNA/RNA chimeric single-base extension primers and detection by MALDI-TOF mass spectrometry

    DEFF Research Database (Denmark)

    Mengel-From, Jonas; Sanchez Sanchez, Juan Jose; Børsting, Claus;

    2005-01-01

    A novel single-base extension (SBE) assay using cleavable and noncleavable SBE primers in the same reaction mix is described. The cleavable SBE primers consisted of deoxyribonucleotides and one ribonucleotide (hereafter denoted chimeric primers), whereas the noncleavable SBE primers consisted of ...

  7. An extensively hydrolysed casein-based formula for infants with cows' milk protein allergy : tolerance/hypo-allergenicity and growth catch-up

    NARCIS (Netherlands)

    Dupont, Christophe; Hol, Jeroen; Nieuwenhuis, Edward E. S.

    2015-01-01

    Children with cows' milk protein allergy (CMPA) are at risk of insufficient length and weight gain, and the nutritional efficacy of hypo-allergenic formulas should be carefully assessed. In 2008, a trial assessed the impact of probiotic supplementation of an extensively hydrolysed casein-based formula

  8. INTELLIGENT CAR STYLING TECHNIQUE AND SYSTEM BASED ON A NEW AERODYNAMIC-THEORETICAL MODEL

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Car styling technique based on a new theoretical model of automotive aerodynamics is introduced, which is proved to be feasible and effective by wind tunnel tests. The development of a multi-module software system from this technique, including modules for knowledge processing, referential styling, ANN aesthetic evaluation, etc., capable of assisting car styling work in an intelligent way, is also presented and discussed.

  9. A new approach of binary addition and subtraction by non-linear material based switching technique

    Indian Academy of Sciences (India)

    Archan Kumar Das; Partha Partima Das; Sourangshu Mukhopadhyay

    2005-02-01

    Here we present a new proposal for binary addition as well as subtraction in the all-optical domain by exploiting a suitable non-linear material-based switching technique. In this communication, the authors extend this technique to both an adder and a subtractor accommodating the spatial input encoding system.

  10. Maternal mortality in rural south Ethiopia: outcomes of community-based birth registration by health extension workers.

    Directory of Open Access Journals (Sweden)

    Yaliso Yaya

    Full Text Available Rural communities in low-income countries lack vital registrations to track birth outcomes. We aimed to examine the feasibility of community-based birth registration and to measure the maternal mortality ratio (MMR) in rural south Ethiopia. In 2010, health extension workers (HEWs) registered births and maternal deaths among 421,639 people in three districts (Derashe, Bonke, and Arba Minch Zuria). One nurse-supervisor per district provided administrative and technical support to HEWs. The primary outcomes were the feasibility of registering a high proportion of births and of measuring the MMR. The secondary outcome was the proportion of skilled birth attendance. We validated the completeness of the registry and the MMR by conducting a house-to-house survey in 15 randomly selected villages in Bonke. We registered 10,987 births (81.4% of the expected 13,492 births), with an annual crude birth rate of 32 per 1,000 population. The validation study showed that, of 2,401 births that occurred in the surveyed households within eight months of the initiation of the registry, 71.6% (1,718) were registered, with similar MMRs (474 vs. 439) between registered and unregistered births. Overall, we recorded 53 maternal deaths; the MMR was 489 per 100,000 live births, and 83% (44 of 53) of maternal deaths occurred at home. Ninety percent (9,863) of births were at home, 4% (430) at health posts, 2.5% (282) at health centres, and 3.5% (412) in hospitals. The MMR increased if the male partners were illiterate (609 vs. 346; p = 0.051) and if the villages had no road access (946 vs. 410; p = 0.039). The validation helped to increase the registration coverage by 10% through feedback discussions. It is possible to obtain high-coverage birth registration and to measure MMR in rural communities where a functional system of community health workers exists. The MMR was high in rural south Ethiopia, and most births and maternal deaths occurred at home.
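    The MMR figures quoted above follow the standard definition of maternal deaths per 100,000 live births. As a minimal sketch of the arithmetic (the numbers below are illustrative, not the study's counts):

    ```python
    def maternal_mortality_ratio(maternal_deaths: int, live_births: int) -> float:
        """Maternal mortality ratio, conventionally expressed per 100,000 live births."""
        return maternal_deaths / live_births * 100_000

    # illustrative numbers: 5 maternal deaths among 1,000 live births
    print(maternal_mortality_ratio(5, 1000))  # → 500.0
    ```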

  11. AQA-PM: Extension of the Air-Quality Model For Austria with Satellite based Particulate Matter Estimates

    Science.gov (United States)

    Hirtl, Marcus; Mantovani, Simone; Krüger, Bernd C.; Triebnig, Gerhard; Flandorfer, Claudia

    2013-04-01

    Air quality is a key element for the well-being and quality of life of European citizens. Air pollution measurements and modeling tools are essential for assessment of air quality according to EU legislation. The responsibilities of ZAMG as the national weather service of Austria include the support of the federal states and the public in questions connected to the protection of the environment in the frame of advisory and counseling services as well as expert opinions. The Air Quality model for Austria (AQA) is operated at ZAMG in cooperation with the University of Natural Resources and Life Sciences in Vienna (BOKU) by order of the regional governments since 2005. AQA conducts daily forecasts of gaseous and particulate (PM10) air pollutants over Austria. In the frame of the project AQA-PM (funded by FFG), satellite measurements of the Aerosol Optical Thickness (AOT) and ground-based PM10-measurements are combined to highly-resolved initial fields using regression- and assimilation techniques. For the model simulations WRF/Chem is used with a resolution of 3 km over the alpine region. Interfaces have been developed to account for the different measurements as input data. The available local emission inventories provided by the different Austrian regional governments were harmonized and used for the model simulations. An episode in February 2010 is chosen for the model evaluation. During that month exceedances of PM10-thresholds occurred at many measurement stations of the Austrian network. Different model runs (only model/only ground stations assimilated/satellite and ground stations assimilated) are compared to the respective measurements. The goal of this project is to improve the PM10-forecasts for Austria with the integration of satellite based measurements and to provide a comprehensive product-platform.

  12. Photoacoustic Techniques for Trace Gas Sensing Based on Semiconductor Laser Sources

    Directory of Open Access Journals (Sweden)

    Vincenzo Spagnolo

    2009-12-01

    Full Text Available The paper provides an overview on the use of photoacoustic sensors based on semiconductor laser sources for the detection of trace gases. We review the results obtained using standard, differential and quartz enhanced photoacoustic techniques.

  13. Resizing Technique-Based Hybrid Genetic Algorithm for Optimal Drift Design of Multistory Steel Frame Buildings

    Directory of Open Access Journals (Sweden)

    Hyo Seon Park

    2014-01-01

    Full Text Available Since genetic algorithm-based optimization methods are computationally expensive for practical use in the field of structural optimization, a resizing technique-based hybrid genetic algorithm for the drift design of multistory steel frame buildings is proposed to increase the convergence speed of genetic algorithms. To reduce the number of structural analyses required for the convergence, a genetic algorithm is combined with a resizing technique that is an efficient optimal technique to control the drift of buildings without the repetitive structural analysis. The resizing technique-based hybrid genetic algorithm proposed in this paper is applied to the minimum weight design of three steel frame buildings. To evaluate the performance of the algorithm, optimum weights, computational times, and generation numbers from the proposed algorithm are compared with those from a genetic algorithm. Based on the comparisons, it is concluded that the hybrid genetic algorithm shows clear improvements in convergence properties.
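    The coupling described above can be sketched in toy form: a genetic algorithm whose offspring pass through a deterministic resizing/repair step before evaluation. The "drift constraint" here is a stand-in (total member size must reach a threshold S_MIN, and weight is simply total size), not the paper's structural analysis; all names and numbers are illustrative.

    ```python
    import random

    def hybrid_ga(fitness, resize, pop_size=20, genes=5, gens=40, seed=1):
        """Genetic algorithm whose candidates pass through a deterministic
        'resizing' repair step, so every design already satisfies the
        drift-type constraint and fewer analyses are wasted."""
        rng = random.Random(seed)
        pop = [resize([rng.uniform(0.5, 2.0) for _ in range(genes)])
               for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness)                      # lighter designs first
            parents = pop[:pop_size // 2]              # elitist selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, genes)          # one-point crossover
                child = a[:cut] + b[cut:]
                child[rng.randrange(genes)] *= rng.uniform(0.9, 1.1)  # mutation
                children.append(resize(child))         # resizing repair
            pop = parents + children
        return min(pop, key=fitness)

    # Stand-in drift constraint: total member size must reach S_MIN.
    S_MIN = 5.0

    def resize(design):
        total = sum(design)
        return [g * S_MIN / total for g in design] if total < S_MIN else design

    best = hybrid_ga(fitness=sum, resize=resize)
    print(round(sum(best), 3))   # total size at or just above S_MIN
    ```

    Because the repair step guarantees feasibility, the GA only has to search among feasible designs, which is the convergence-speed argument made in the abstract.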

  14. DEVELOPMENT OF OBSTACLE AVOIDANCE TECHNIQUE IN WEB-BASED GEOGRAPHIC INFORMATION SYSTEM FOR TRAFFIC MANAGEMENT USING OPEN SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    Nik Mohd Ramli Nik Yusoff

    2014-01-01

    Full Text Available The shortest path routing is one of the well-known network analysis techniques implemented in road management systems. pgRouting, an extension of the PostgreSQL/PostGIS database, is an open source library that implements the Dijkstra shortest path algorithm. However, its functionality for avoiding obstacles during the analysis is still limited. Therefore, this study was conducted to enable an obstacle avoidance function in the existing pgRouting algorithm using the OpenStreetMap road network. Implementing this function enhances the Dijkstra algorithm's ability in network analysis. In this study a dynamic restriction feature is added at the program level to represent obstacles on the road. With this modification the algorithm is now able to generate an alternative route by avoiding obstacles on the roads. Using OpenLayers and PHP, a web-based GIS platform was developed to ease the system's usability.
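    pgRouting exposes this in SQL, but the dynamic-restriction idea itself reduces to running Dijkstra while skipping edges whose ids appear in an obstacle set. A language-neutral sketch (graph shape and edge ids invented for illustration):

    ```python
    import heapq

    def dijkstra_avoiding(graph, start, goal, blocked=frozenset()):
        """Dijkstra shortest path that skips edges flagged as obstacles,
        mimicking a dynamic restriction layer on a road network.
        graph: {node: [(neighbor, cost, edge_id), ...]}"""
        dist = {start: 0.0}
        prev = {}
        heap = [(0.0, start)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == goal:
                break
            if d > dist.get(u, float("inf")):
                continue                           # stale heap entry
            for v, cost, edge_id in graph[u]:
                if edge_id in blocked:             # obstacle: edge restricted
                    continue
                nd = d + cost
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        if goal not in dist:
            return None, float("inf")
        path = [goal]
        while path[-1] != start:
            path.append(prev[path[-1]])
        return path[::-1], dist[goal]

    graph = {
        "A": [("B", 1.0, "e1"), ("C", 5.0, "e2")],
        "B": [("C", 1.0, "e3")],
        "C": [],
    }
    route, cost = dijkstra_avoiding(graph, "A", "C")                     # → ["A","B","C"], 2.0
    detour, cost2 = dijkstra_avoiding(graph, "A", "C", blocked={"e3"})   # → ["A","C"], 5.0
    ```

    Blocking edge `e3` forces the algorithm onto the more expensive alternative route, which is exactly the behaviour the study adds on top of pgRouting.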

  15. Comparative study of retinal vessel segmentation based on global thresholding techniques.

    Science.gov (United States)

    Mapayi, Temitope; Viriri, Serestina; Tapamo, Jules-Raymond

    2015-01-01

    Due to noise from uneven contrast and illumination during acquisition process of retinal fundus images, the use of efficient preprocessing techniques is highly desirable to produce good retinal vessel segmentation results. This paper develops and compares the performance of different vessel segmentation techniques based on global thresholding using phase congruency and contrast limited adaptive histogram equalization (CLAHE) for the preprocessing of the retinal images. The results obtained show that the combination of preprocessing technique, global thresholding, and postprocessing techniques must be carefully chosen to achieve a good segmentation performance.
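    The pipeline described combines CLAHE or phase-congruency preprocessing with a global threshold. A self-contained sketch of one standard global thresholding step (Otsu's between-class-variance criterion) is given below; the CLAHE preprocessing itself is left to an image library, and the synthetic data are illustrative:

    ```python
    import numpy as np

    def otsu_threshold(pixels, bins=256):
        """Global threshold maximizing between-class variance (Otsu's criterion)."""
        hist, edges = np.histogram(pixels, bins=bins)
        p = hist / hist.sum()
        omega = np.cumsum(p)                     # probability of class 0 up to bin k
        mu = np.cumsum(p * np.arange(bins))      # cumulative mean, in bin units
        mu_t = mu[-1]
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
        k = int(np.nanargmax(sigma_b2))          # bin with maximal class separation
        return edges[k + 1]

    # synthetic bimodal "image": dark background plus bright vessels
    vals = np.concatenate([np.full(300, 10.0), np.full(100, 200.0)])
    t = otsu_threshold(vals)                     # lands between the two modes
    ```

    Applying `vals >= t` then yields the binary vessel map that the postprocessing step cleans up.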

  16. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea.

    Science.gov (United States)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-08-18

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.
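    The record does not name the exact preprocessing variants compared, so the following is an assumption: column-wise autoscaling is one common multivariate preprocessing choice, and PCA is the separability check named in the abstract. A minimal sketch:

    ```python
    import numpy as np

    def autoscale(X):
        """Column-wise mean-centering and unit-variance scaling
        (one common preprocessing option; assumed, not from the record)."""
        mu = X.mean(axis=0)
        sigma = X.std(axis=0)
        sigma[sigma == 0] = 1.0          # guard: leave constant sensors unscaled
        return (X - mu) / sigma

    def pca_scores(X, n_components=2):
        """Project preprocessed responses onto the leading principal
        components, as used to inspect class separability."""
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])   # toy sensor matrix
    Z = autoscale(X)
    scores = pca_scores(Z, n_components=1)
    ```

    Separability of the tea grades can then be judged from how well the score clusters separate before any neural classifier is trained.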

  17. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    Energy Technology Data Exchange (ETDEWEB)

    Palit, Mousumi [Department of Electronics and Telecommunication Engineering, Central Calcutta Polytechnic, Kolkata 700014 (India); Tudu, Bipan, E-mail: bt@iee.jusl.ac.in [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Bhattacharyya, Nabarun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Dutta, Ankur; Dutta, Pallab Kumar [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Jana, Arun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Bandyopadhyay, Rajib [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Chatterjee, Anutosh [Department of Electronics and Communication Engineering, Heritage Institute of Technology, Kolkata 700107 (India)

    2010-08-18

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  18. Comparative Study of Retinal Vessel Segmentation Based on Global Thresholding Techniques

    Directory of Open Access Journals (Sweden)

    Temitope Mapayi

    2015-01-01

    Full Text Available Due to noise from uneven contrast and illumination during the acquisition process of retinal fundus images, the use of efficient preprocessing techniques is highly desirable to produce good retinal vessel segmentation results. This paper develops and compares the performance of different vessel segmentation techniques based on global thresholding using phase congruency and contrast limited adaptive histogram equalization (CLAHE) for the preprocessing of the retinal images. The results obtained show that the combination of preprocessing technique, global thresholding, and postprocessing techniques must be carefully chosen to achieve a good segmentation performance.

  19. A quality control technique based on UV-VIS absorption spectroscopy for tequila distillery factories

    Science.gov (United States)

    Barbosa Garcia, O.; Ramos Ortiz, G.; Maldonado, J. L.; Pichardo Molina, J.; Meneses Nava, M. A.; Landgrave, Enrique; Cervantes, M. J.

    2006-02-01

    A low cost technique based on UV-VIS absorption spectroscopy is presented for the quality control of the spirit drink known as tequila. It is shown that such spectra offer enough information to discriminate a given spirit drink from a group of bottled commercial tequilas. The technique was applied to white tequilas. Unlike reference analytical methods such as chromatography, this technique requires neither special personnel training nor sophisticated instrumentation. By using hand-held instrumentation, the technique can be applied in situ during the production process.

  20. Android Access Control Extension

    Directory of Open Access Journals (Sweden)

    Anton Baláž

    2015-12-01

    Full Text Available The main objective of this work is to analyze and extend the security model of mobile devices running Android OS. The provided security extension is a Linux kernel security module that allows the system administrator to restrict a program's capabilities with per-program profiles. Profiles can allow capabilities like network access, raw socket access, and the permission to read, write, or execute files on matching paths. The module supplements the traditional Android capability access control model by providing mandatory access control (MAC) based on path. This extension increases the security of access to system objects in a device and allows creating security sandboxes per application.
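    The per-program profiles described are close in spirit to AppArmor-style path-based MAC. A hypothetical profile in that style is shown below; the application name, paths, and syntax are illustrative assumptions, not the module's actual format:

    ```text
    # hypothetical path-based MAC profile for a single Android application
    profile com.example.app {
        network inet stream,                 # allow TCP sockets
        deny network raw,                    # forbid raw socket access

        /data/data/com.example.app/** rw,   # private app data: read/write
        /system/lib/*.so r,                 # system libraries: read-only
        deny /sdcard/** w,                  # no writes to shared storage
    }
    ```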

  1. MATILDA: A Military Laser Range Safety Tool Based on Probabilistic Risk Assessment (PRA) Techniques

    Science.gov (United States)

    2014-08-01

    AFRL-RH-FS-TR-2014-0035. MATILDA: A Military Laser Range Safety Tool Based on Probabilistic Risk Assessment (PRA) Techniques. Paul... The use of Probabilistic Risk Assessment (PRA) techniques to perform laser safety and hazard analysis for high output lasers in outdoor environments has become

  2. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm

    OpenAIRE

    Mi Jeong Kim; Sung Joon Maeng; Yong Soo Cho

    2015-01-01

    In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity...

  3. Effectiveness of Agricultural Extension Activities

    Directory of Open Access Journals (Sweden)

    Ali AL-Sharafat

    2012-01-01

    Full Text Available Problem statement: Jordan's agricultural extension service is seriously under-staffed and its effectiveness is consequently compromised. Reservations are being expressed about the performance and capability of the agricultural extension system in Jordan. The performance of this sector has been disappointing and has failed to transfer agricultural technology to the farmers. The main objective of this study is to assess the effectiveness of Jordan's agricultural extension services. Approach: The effect of extension services on olive productivity in the study area was investigated. A total of 60 olive producers were selected to be interviewed for this study. This number was sufficient to achieve the study objectives. The interviewed producers were distributed almost equally among olive production locations in the study area. The sample was obtained through the simple random sampling technique. Two groups were chosen and constituted randomly: an experimental group (30 farmers; 10 for each source of extension service) and a control group (30 farmers). The experimental group received extension services and the control group received no extension services. Two interview-cum-structured questionnaires were designed and used to collect information and data for this study. The first instrument was designed for farmers who received extension services and the second for farmers who received no extension services. Another questionnaire was designed for administrators of extension organizations concerned with providing extension services to farmers. To find the differences that may exist between the two studied groups, one-way analysis of variance (ANOVA), t-test and LSD test via the Statistical Package for the Social Sciences (SPSS) were used. The average net profit obtained from an area of one dunum of olive farm was the main item considered in determining the effectiveness of agricultural extension activities.
Results and Conclusion: The results of

  4. Cesarean delivery technique: evidence or tradition? A review of the evidence-based cesarean delivery.

    Science.gov (United States)

    Encarnacion, Betsy; Zlatnik, Marya G

    2012-08-01

    Cesarean delivery is the most common surgical procedure performed in the United States, yet the techniques used during this procedure often vary significantly among providers. The purpose of this review was to evaluate and outline current evidence behind the cesarean delivery technique. A search of the PubMed database was conducted using the terms cesarean section and cesarean delivery and the technique of interest, for example, cesarean section prophylactic antibiotics. Few aspects of the cesarean delivery were found to have high-quality consistent evidence to support use of a particular technique. Because many aspects of the procedure are based on limited or no data, more studies on specific cesarean delivery techniques are clearly needed. Providers should be aware of which components of the cesarean delivery are evidence-based versus not when performing this procedure.

  5. Automatic authentication using rough set-based technique and fuzzy decision

    Institute of Scientific and Technical Information of China (English)

    CHEN Ning; FENG Bo-qin; WANG Hai-xiao; ZHANG Hao

    2009-01-01

    This study introduced an automatic authentication technique for checking the genuineness of a vehicle. The rough set-based technique was used to handle the uncertainty arising from artifacts in the acquired images imprinted on a vehicle. However, it has been proved to be NP-hard to find all reductions and the minimal reduction, so generally different heuristic algorithms were used to find a set of reductions, and the Gaussian distribution was used to describe the uncertainty in order to achieve the minimal reduction. On the basis of inductive logic programming, the technique can distinguish between two similar images, which makes it superior to conventional pattern-recognition techniques that are merely capable of classification. Furthermore, it can avoid some failures of the correlation-coefficient-based technique for authenticating binary images. The experiments show an accuracy rate close to 93.2%.

  6. Antenna pointing system for satellite tracking based on Kalman filtering and model predictive control techniques

    Science.gov (United States)

    Souza, André L. G.; Ishihara, João Y.; Ferreira, Henrique C.; Borges, Renato A.; Borges, Geovany A.

    2016-12-01

    The present work proposes a new approach for an antenna pointing system for satellite tracking. Such a system uses the received signal to estimate the beam pointing deviation and then adjusts the antenna pointing. The present work has two contributions. First, the estimation is performed by a Kalman filter based conical scan technique. This technique uses the Kalman filter avoiding the batch estimator and applies a mathematical manipulation avoiding the linearization approximations. Secondly, a control technique based on the model predictive control together with an explicit state feedback solution are obtained in order to reduce the computational burden. Numerical examples illustrate the results.
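    The estimation half of the scheme reduces to a Kalman filter tracking the beam pointing deviation from noisy scan measurements. A scalar sketch under a random-walk deviation model is shown below; the noise covariances and measurement values are illustrative assumptions, not the paper's:

    ```python
    def kalman_step(x, P, z, Q=1e-4, R=1e-2):
        """One predict/update cycle of a scalar Kalman filter tracking a
        slowly varying pointing deviation (random-walk model)."""
        P = P + Q                    # predict: deviation drifts, uncertainty grows
        K = P / (P + R)              # Kalman gain
        x = x + K * (z - x)          # update with measurement z
        P = (1.0 - K) * P            # posterior uncertainty
        return x, P

    # feed noisy conical-scan estimates of a 0.3 deg deviation (illustrative)
    x, P = 0.0, 1.0
    for z in [0.4, 0.2, 0.35, 0.25, 0.3] * 4:
        x, P = kalman_step(x, P, z)
    print(round(x, 3))   # settles near 0.3
    ```

    The filtered deviation would then feed the model predictive controller that adjusts the antenna pointing.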

  7. Simulation-driven design by knowledge-based response correction techniques

    CERN Document Server

    Koziel, Slawomir

    2016-01-01

    Focused on efficient simulation-driven multi-fidelity optimization techniques, this monograph on simulation-driven optimization covers simulations utilizing physics-based low-fidelity models, often based on coarse-discretization simulations or other types of simplified physics representations, such as analytical models. The methods presented in the book exploit as much as possible any knowledge about the system or device of interest embedded in the low-fidelity model with the purpose of reducing the computational overhead of the design process. Most of the techniques described in the book are of response correction type and can be split into parametric (usually based on analytical formulas) and non-parametric, i.e., not based on analytical formulas. The latter, while more complex in implementation, tend to be more efficient. The book presents a general formulation of response correction techniques as well as a number of specific methods, including those based on correcting the low-fidelity model response (out...

  8. Nasal base narrowing of the caucasian nose through the cerclage technique

    Directory of Open Access Journals (Sweden)

    Mocellin, Marcos

    2010-06-01

    Full Text Available Introduction: Several techniques can be performed to reduce (narrow) the nasal base, such as vestibular columellar skin resection, elliptical skin resection at the narial lip, skin undermining and advancement (V-Y technique of Bernstein), and the use of cerclage sutures in the nasal base. Objective: To evaluate the cerclage technique performed on the nasal base, through endonasal rhinoplasty without delivery of the basic technique, in the Caucasian nose, reducing the inter-alar distance and correcting the alar flare, with consequent improvement in nasal harmony with the whole face. Methods: A retrospective analysis, through clinical documents and photos, of 43 patients in whom cerclage of the nasal base was performed with resection of a skin ellipse in the region of the vestibule and the nasal base (modified Weir technique), using colorless 4-0 mononylon® with a straight cutting needle. The study was conducted in 2008 and 2009 at the hospital of the Paraná Institute of Otolaryngology (IPO) in Curitiba, Paraná, Brazil. Patients had a follow-up ranging from 7 to 12 months. Results: In 100% of cases an improvement in nasal harmony was achieved by decreasing the inter-alar distance. Conclusion: Cerclage with minimal resection of vestibular skin and of the nasal base is an effective method for narrowing the nasal base in the Caucasian nose, with predictable results, and is easy to perform.

  9. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  10. Applications of synchrotron-based X-ray techniques in environmental science

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Synchrotron-based X-ray techniques have been widely applied in the fields of environmental science due to their element-specific and nondestructive properties and their unique spectral and spatial resolution advantages. The techniques are capable of in situ investigation of chemical speciation, microstructure and elemental mapping at the molecular or nanometer scale, and thus provide direct evidence of reaction mechanisms for various environmental processes. In this contribution, the applications of three types of techniques commonly used in environmental research are reviewed, namely X-ray absorption spectroscopy (XAS), X-ray fluorescence (XRF) spectroscopy and scanning transmission X-ray microscopy (STXM). In particular, recent advances of the techniques in China are elaborated, and a selection of applied examples in the field of environmental science is provided. Finally, the perspectives of synchrotron-based X-ray techniques are discussed. With their great progress and wide application, the techniques have revolutionized our understanding of significant geo- and bio-chemical processes. It is anticipated that synchrotron-based X-ray techniques will continue to play a significant role in these fields and that significant advances will be made in the decades ahead.

  11. SVD-Based Optimal Filtering Technique for Noise Reduction in Hearing Aids Using Two Microphones

    Directory of Open Access Journals (Sweden)

    Moonen Marc

    2002-01-01

    Full Text Available We introduce a new SVD-based (singular value decomposition) strategy for noise reduction in hearing aids. This technique is evaluated for noise reduction in a behind-the-ear (BTE) hearing aid where two omnidirectional microphones are mounted in an endfire configuration. The behaviour of the SVD-based technique is compared to a two-stage adaptive beamformer for hearing aids developed by Vanden Berghe and Wouters (1998). The evaluation and comparison is done with a performance metric based on the speech intelligibility index (SII). The speech and noise signals are recorded in reverberant conditions with a signal-to-noise ratio of and the spectrum of the noise signals is similar to the spectrum of the speech signal. The SVD-based technique works without initialization or assumptions about a look direction, unlike the two-stage adaptive beamformer. Still, for different noise scenarios, the SVD-based technique performs as well as the two-stage adaptive beamformer, for a similar filter length and adaptation time for the filter coefficients. In a diffuse noise scenario, the SVD-based technique performs better than the two-stage adaptive beamformer and hence provides a more flexible and robust solution under speaker position variations and reverberant conditions.
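    The core idea of SVD-based filtering is to keep only the dominant singular directions of the multichannel data and discard the weaker, typically noise-dominated, ones. A generic low-rank sketch (not the paper's exact optimal filter; signal, noise level, and rank are illustrative assumptions):

    ```python
    import numpy as np

    def svd_denoise(frames, rank):
        """Keep only the 'rank' dominant singular directions of a matrix of
        multichannel frames; the remaining directions are zeroed out."""
        U, s, Vt = np.linalg.svd(frames, full_matrices=False)
        s[rank:] = 0.0                 # discard noise-dominated components
        return (U * s) @ Vt

    rng = np.random.default_rng(0)
    clean = np.outer(np.sin(np.linspace(0.0, 6.0, 200)), [1.0, 0.8])  # one source, two mics
    noisy = clean + 0.05 * rng.standard_normal(clean.shape)
    denoised = svd_denoise(noisy, rank=1)
    ```

    By the Eckart-Young theorem the rank-1 truncation is the best rank-1 approximation of the noisy data, so its error with respect to the clean signal is bounded by twice the noise norm, and in practice is typically far smaller.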

  12. SVD-Based Optimal Filtering Technique for Noise Reduction in Hearing Aids Using Two Microphones

    Science.gov (United States)

    Maj, Jean-Baptiste; Moonen, Marc; Wouters, Jan

    2002-12-01

    We introduce a new SVD-based (singular value decomposition) strategy for noise reduction in hearing aids. This technique is evaluated for noise reduction in a behind-the-ear (BTE) hearing aid where two omnidirectional microphones are mounted in an endfire configuration. The behaviour of the SVD-based technique is compared to a two-stage adaptive beamformer for hearing aids developed by Vanden Berghe and Wouters (1998). The evaluation and comparison is done with a performance metric based on the speech intelligibility index (SII). The speech and noise signals are recorded in reverberant conditions with a signal-to-noise ratio of [InlineEquation not available: see fulltext.] and the spectrum of the noise signals is similar to the spectrum of the speech signal. The SVD-based technique works without initialization or assumptions about a look direction, unlike the two-stage adaptive beamformer. Still, for different noise scenarios, the SVD-based technique performs as well as the two-stage adaptive beamformer, for a similar filter length and adaptation time for the filter coefficients. In a diffuse noise scenario, the SVD-based technique performs better than the two-stage adaptive beamformer and hence provides a more flexible and robust solution under speaker position variations and reverberant conditions.
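    The abstract describes the strategy only at a high level. As a generic illustration of SVD-based noise reduction (not the authors' hearing-aid filter; the Hankel embedding, model order and rank below are assumptions for the sketch), a low-rank truncation of a trajectory matrix can be used to suppress broadband noise:

```python
import numpy as np

def svd_denoise(signal, order=20, rank=2):
    """Reduce noise by truncating the SVD of a Hankel matrix built from the signal."""
    n = len(signal)
    rows = n - order + 1
    # Hankel (trajectory) matrix: each row is a shifted window of the signal.
    H = np.array([signal[i:i + order] for i in range(rows)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # Keep only the dominant singular components (assumed to carry the signal).
    H_r = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Average along anti-diagonals to map the low-rank matrix back to a signal.
    out = np.zeros(n)
    counts = np.zeros(n)
    for i in range(rows):
        out[i:i + order] += H_r[i]
        counts[i:i + order] += 1
    return out / counts

# Example: a sinusoid buried in white noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 400)
clean = np.sin(2 * np.pi * 7 * t)
noisy = clean + 0.5 * rng.standard_normal(400)
denoised = svd_denoise(noisy, order=40, rank=2)
```

    Keeping only the dominant singular components removes most of the noise energy while retaining the low-rank structured part of the signal.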

  13. Experimental evaluation of optimal Vehicle Dynamic Control based on the State Dependent Riccati Equation technique

    NARCIS (Netherlands)

    Alirezaei, M.; Kanarachos, S.A.; Scheepers, B.T.M.; Maurice, J.P.

    2013-01-01

    The development and experimental evaluation of an optimal Vehicle Dynamic Control (VDC) strategy based on the State Dependent Riccati Equation (SDRE) control technique is presented. The proposed nonlinear controller is based on a nonlinear vehicle model with nonlinear tire characteristics. A novel ext...

  14. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling based architecture is proposed which has the ability of reducing the problem of image distortion and improving the signal to noise ratio significantly. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible...

  15. Constructing a Soil Class Map of Denmark based on the FAO Legend Using Digital Techniques

    DEFF Research Database (Denmark)

    Adhikari, Kabindra; Minasny, Budiman; Greve, Mette Balslev

    2014-01-01

    Soil mapping in Denmark has a long history and a series of soil maps based on conventional mapping approaches have been produced. In this study, a national soil map of Denmark was constructed based on the FAO–Unesco Revised Legend 1990 using digital soil mapping techniques, existing soil profile...

  16. A New Design Technique of Reversible BCD Adder Based on NMOS With Pass Transistor Gates

    CERN Document Server

    Hossain, Md Sazzad; Rahman, Md Motiur; Hossain, A S M Delowar; Hasan, Md Minul

    2012-01-01

    In this paper, we have proposed a new design technique for a BCD adder using newly constructed reversible gates based on NMOS with pass transistor gates, whereas the conventional reversible gates are based on CMOS with transmission gates. We also compare the proposed reversible gates with the conventional CMOS reversible gates, which shows that the required number of transistors is significantly reduced.

  17. A comparison of base running and sliding techniques in collegiate baseball with implications for sliding into first base

    Institute of Scientific and Technical Information of China (English)

    Travis Ficklin; Jesus Dapena; Alexander Brunfeldt

    2016-01-01

    Purpose: The purpose of this study was to compare 4 techniques for arrival at a base after sprinting maximally to reach it: sliding head-first, sliding feet-first, running through the base without slowing, and stopping on the base. A secondary purpose of the study was to determine any advantage there may be to diving into first base to arrive sooner than running through the base. Methods: Two high-definition video cameras were used to capture 3-dimensional kinematics of sliding techniques of 9 intercollegiate baseball players. Another video camera was used to time runs from first base to second in 4 counterbalanced conditions: running through the base, sliding head-first, sliding feet-first, and running to a stop. Mathematical modeling was used to simulate diving to first base such that the slide would begin when the hand touches the base. Results: Based upon overall results, the quickest way to the base is by running through it, followed by head-first, feet-first, and running to a stop. Conclusion: There was a non-significant trend toward an advantage for diving into first base over running through it, but more research is needed, and even if the advantage is real, the risks of executing this technique probably outweigh the minuscule gain.

  18. ON THE PAPR REDUCTION IN OFDM SYSTEMS: A NOVEL ZCT PRECODING BASED SLM TECHNIQUE

    Directory of Open Access Journals (Sweden)

    VARUN JEOTI

    2011-06-01

    Full Text Available High Peak to Average Power Ratio (PAPR) reduction is still an important challenge in Orthogonal Frequency Division Multiplexing (OFDM) systems. In this paper, we propose a novel Zadoff-Chu matrix Transform (ZCT) precoding based Selected Mapping (SLM) technique for PAPR reduction in OFDM systems. This technique is based on precoding the constellation symbols with the ZCT precoder after the multiplication of the phase rotation factor and before the Inverse Fast Fourier Transform (IFFT) in SLM based OFDM (SLM-OFDM) systems. Computer simulation results show that the proposed technique can reduce PAPR by up to 5.2 dB for N=64 (system subcarriers) and V=16 (dissimilar phase sequences), at a clip rate of 10^-3. Additionally, ZCT based SLM-OFDM (ZCT-SLM-OFDM) systems also take advantage of frequency variations of the communication channel and can offer substantial performance gain in fading multipath channels.
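    The selection step of an SLM transmitter can be sketched as follows. This is a plain SLM sketch without the ZCT precoder the paper adds; the QPSK mapping, candidate count and phase alphabet are illustrative assumptions:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def slm_ofdm(symbols, n_candidates=16, seed=1):
    """Selected Mapping: generate phase-rotated candidates and transmit the one
    whose time-domain IFFT output has the lowest PAPR."""
    rng = np.random.default_rng(seed)
    n = len(symbols)
    best, best_papr = None, np.inf
    for k in range(n_candidates):
        # Candidate 0 is the unrotated signal; the rest use random phase factors.
        phases = np.ones(n) if k == 0 else rng.choice([1, -1, 1j, -1j], size=n)
        x = np.fft.ifft(symbols * phases)
        p = papr_db(x)
        if p < best_papr:
            best, best_papr = x, p
    return best, best_papr

# QPSK symbols on N = 64 subcarriers.
rng = np.random.default_rng(7)
syms = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=64) / np.sqrt(2)
plain_papr = papr_db(np.fft.ifft(syms))
_, slm_papr = slm_ofdm(syms)
```

    A real system must also signal the index of the chosen phase sequence to the receiver so the rotation can be undone.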

  19. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    Science.gov (United States)

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiation and its dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). Based on neutron spectrometry, the solar modulation potential can be deduced using neutron spectrometer measurements and ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential for using simulations of extensive air showers and neutron spectroscopy to monitor solar activity.

  20. Review of Fluorescence-Based Velocimetry Techniques to Study High-Speed Compressible Flows

    Science.gov (United States)

    Bathel, Brett F.; Johansen, Craig; Inman, Jennifer A.; Jones, Stephen B.; Danehy, Paul M.

    2013-01-01

    This paper reviews five laser-induced fluorescence-based velocimetry techniques that have been used to study high-speed compressible flows at NASA Langley Research Center. The techniques discussed in this paper include nitric oxide (NO) molecular tagging velocimetry (MTV), nitrogen dioxide photodissociation (NO2-to-NO) MTV, and NO and atomic oxygen (O-atom) Doppler-shift-based velocimetry. Measurements of both single-component and two-component velocity have been performed using these techniques. This paper details the specific application and experiment for which each technique has been used, the facility in which the experiment was performed, the experimental setup, sample results, and a discussion of the lessons learned from each experiment.

  1. Computationally Efficient Implementation of Convolution-based Locally Adaptive Binarization Techniques

    OpenAIRE

    Mollah, Ayatullah Faruk; Basu, Subhadip; Nasipuri, Mita

    2012-01-01

    One of the most important steps of document image processing is binarization. The computational requirements of locally adaptive binarization techniques make them unsuitable for devices with limited computing facilities. In this paper, we have presented a computationally efficient implementation of convolution based locally adaptive binarization techniques keeping the performance comparable to the original implementation. The computational complexity has been reduced from O(W^2 N^2) to O(W N^2) wh...
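    The complexity reduction rests on removing the per-pixel window convolution. A common device in the same spirit is an integral image, which makes each local mean O(1) regardless of window size; the simple mean-times-bias threshold below is an illustrative stand-in for the Niblack/Sauvola-style rules such techniques target, and the window and bias values are assumptions:

```python
import numpy as np

def adaptive_binarize(img, window=15, bias=0.9):
    """Local-mean thresholding using an integral image, so each window sum
    costs O(1) instead of O(W^2)."""
    img = img.astype(float)
    h, w = img.shape
    # Integral image with a zero border so window sums need no special cases.
    ii = np.zeros((h + 1, w + 1))
    ii[1:, 1:] = img.cumsum(0).cumsum(1)
    r = window // 2
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        for x in range(w):
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            total = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
            mean = total / ((y1 - y0) * (x1 - x0))
            out[y, x] = 255 if img[y, x] > bias * mean else 0
    return out

# Toy example: a dark pixel on a bright page should binarize to "ink".
page = np.full((20, 20), 200.0)
page[10, 10] = 50.0
binary = adaptive_binarize(page, window=7)
```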

  2. A Low Cost Vision Based Hybrid Fiducial Mark Tracking Technique for Mobile Industrial Robots

    OpenAIRE

    Mohammed Y Aalsalem; Wazir Zada Khan; Quratul Ain Arshad

    2012-01-01

    The field of robotic vision is developing rapidly. Robots can react intelligently and provide assistance to user activities through sentient computing. Since industrial applications pose complex requirements that cannot be handled by humans, an efficient low cost and robust technique is required for the tracking of mobile industrial robots. The existing sensor based techniques for mobile robot tracking are expensive and complex to deploy, configure and maintain. Also some of them demand dedic...

  3. A Low Cost Vision Based Hybrid Fiducial Mark Tracking Technique for Mobile Industrial Robots

    Directory of Open Access Journals (Sweden)

    Mohammed Y Aalsalem

    2012-07-01

    Full Text Available The field of robotic vision is developing rapidly. Robots can react intelligently and provide assistance to user activities through sentient computing. Since industrial applications pose complex requirements that cannot be handled by humans, an efficient low cost and robust technique is required for the tracking of mobile industrial robots. The existing sensor based techniques for mobile robot tracking are expensive and complex to deploy, configure and maintain. Also some of them demand dedicated and often expensive hardware. This paper presents a low cost vision based technique called “Hybrid Fiducial Mark Tracking” (HFMT) for tracking a mobile industrial robot. The HFMT technique requires off-the-shelf hardware (CCD cameras) and printable 2-D circular marks used as fiducials for tracking a mobile industrial robot on a pre-defined path. This proposed technique allows the robot to track a predefined path by using fiducials for the detection of right and left turns on the path and a white strip for tracking the path. The HFMT technique is implemented and tested on an indoor mobile robot at our laboratory. Experimental results from the robot navigating in real environments have confirmed that our approach is simple and robust and can be adopted in any hostile industrial environment where humans are unable to work.

  4. Scaling up the DBSCAN Algorithm for Clustering Large Spatial Databases Based on Sampling Technique

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Clustering, in data mining, is a useful technique for discovering interesting data distributions and patterns in the underlying data, and has many application fields, such as statistical data analysis, pattern recognition, image processing, etc. We combine a sampling technique with the DBSCAN algorithm to cluster large spatial databases, and two sampling-based DBSCAN (SDBSCAN) algorithms are developed. One algorithm introduces the sampling technique inside DBSCAN, and the other uses a sampling procedure outside DBSCAN. Experimental results demonstrate that our algorithms are effective and efficient in clustering large-scale spatial databases.
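    A sketch of the "sampling outside DBSCAN" idea (not the paper's exact algorithms): cluster a random sample, then attach each remaining point to the cluster of its nearest sampled point when it lies within eps. The toy data and parameters are assumptions:

```python
import math, random

def dbscan(points, eps, min_pts):
    """Plain DBSCAN; returns one label per point (-1 means noise)."""
    labels = [None] * len(points)
    def neighbors(i):
        return [j for j in range(len(points)) if math.dist(points[i], points[j]) <= eps]
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                      # tentatively noise
            continue
        labels[i] = cid
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid                 # border point reached from a core point
            if labels[j] is not None:
                continue
            labels[j] = cid
            nb = neighbors(j)
            if len(nb) >= min_pts:              # j is itself core: keep expanding
                queue.extend(nb)
        cid += 1
    return labels

def sampled_dbscan(points, eps, min_pts, sample_frac=0.3, seed=0):
    """Cluster a random sample, then assign each remaining point to the cluster
    of its nearest sampled point if that point is within eps, else noise."""
    rng = random.Random(seed)
    idx = list(range(len(points)))
    sample_idx = rng.sample(idx, max(min_pts, int(sample_frac * len(points))))
    sample = [points[i] for i in sample_idx]
    sample_labels = dbscan(sample, eps, min_pts)
    labels = [-1] * len(points)
    for si, li in zip(sample_idx, sample_labels):
        labels[si] = li
    for i in idx:
        if i in sample_idx:
            continue
        d, j = min((math.dist(points[i], s), k) for k, s in enumerate(sample))
        labels[i] = sample_labels[j] if d <= eps else -1
    return labels

# Two well-separated blobs as a toy example.
gen = random.Random(1)
pts = [(gen.uniform(-0.5, 0.5), gen.uniform(-0.5, 0.5)) for _ in range(30)]
pts += [(10 + gen.uniform(-0.5, 0.5), 10 + gen.uniform(-0.5, 0.5)) for _ in range(30)]
labels = sampled_dbscan(pts, eps=2.0, min_pts=3, sample_frac=0.5)
```

    Clustering only the sample cuts the dominant neighborhood-query cost; the final nearest-sample assignment is a single linear pass.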

  5. Study on measurement technique for sodium aerosols based on laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Ohtaka, Masahiko; Hayashida, Hitoshi [Japan Nuclear Cycle Development Inst., Oarai, Ibaraki (Japan). Oarai Engineering Center

    2003-03-01

    Detection of a small-scale sodium leak in its early stage is effective for enhancing fast reactor safety. The feasibility of a measurement technique for sodium aerosols based on LIBS was assessed. The technique is expected to offer high discrimination performance for sodium aerosols with a detection limit equivalent to that of conventional small sodium leak detectors. Experiments using sodium aerosols were performed in order to design a measuring system and to study basic measurement performance. The results show the LIBS technique is feasible as a small sodium leak detector. (author)

  6. An adaptive laser beam shaping technique based on a genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    Ping Yang; Yuan Liu; Wei Yang; Minwu Ao; Shijie Hu; Bing Xu; Wenhan Jiang

    2007-01-01

    A new adaptive beam intensity shaping technique based on the combination of a 19-element piezoelectric deformable mirror (DM) and a global genetic algorithm is presented. This technique can adaptively adjust the voltages of the 19 actuators on the DM to reduce the difference between the target beam shape and the actual beam shape. Numerical simulations and experimental results show that within the stroke range of the DM, this technique can be well used to create the given beam intensity profiles on the focal plane.
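    The adjust-voltages-to-match-a-target loop can be sketched with a toy 1-D model. The triangular influence functions standing in for the mirror response, and all GA parameters, are illustrative assumptions, not the authors' setup:

```python
import random

# Toy forward model standing in for the mirror + optics: each "actuator"
# contributes a triangular influence function to a 1-D intensity profile.
N_ACT, N_PIX = 19, 64

def influence(a, x):
    center = a * (N_PIX - 1) / (N_ACT - 1)
    return max(0.0, 1.0 - abs(x - center) / 8.0)

def profile(voltages):
    return [sum(v * influence(a, x) for a, v in enumerate(voltages))
            for x in range(N_PIX)]

def cost(voltages, target):
    return sum((p - t) ** 2 for p, t in zip(profile(voltages), target))

def ga_shape(target, pop_size=40, gens=60, seed=0):
    """Genetic algorithm: evolve actuator voltages toward the target profile."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 1) for _ in range(N_ACT)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda v: cost(v, target))
        elite = pop[: pop_size // 4]                  # keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, N_ACT)             # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:                    # mutate one actuator
                child[rng.randrange(N_ACT)] += rng.gauss(0, 0.1)
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda v: cost(v, target))

# Target: a flat-top profile of height 2 in the middle of the aperture.
target = [2.0 if 16 <= x < 48 else 0.0 for x in range(N_PIX)]
best = ga_shape(target)
```

    Elitism guarantees the best candidate is never lost, so the cost is non-increasing over generations.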

  7. Salient Feature Identification and Analysis using Kernel-Based Classification Techniques for Synthetic Aperture Radar Automatic Target Recognition

    Science.gov (United States)

    2014-03-27

    Salient Feature Identification and Analysis using Kernel-Based Classification Techniques for Synthetic Aperture Radar Automatic Target Recognition. Thesis.

  8. Web Tag Recommendation System Based on Semantic Extension

    Institute of Scientific and Technical Information of China (English)

    钱程; 阳小兰

    2012-01-01

    The mismatch between web advertisements and the content of the current page reduces the effectiveness of the advertising. In this paper, we use two methods, site-based Bayesian model extension and Wikipedia-based semantic extension, to accurately extract web tag information. The more accurate tags are then used to match network advertisements, enhancing the advertising effect. This paper designs a web tag recommendation system based on semantic extension, and experiments confirm that it performs well.

  9. Modal extension rule

    Institute of Scientific and Technical Information of China (English)

    WU Xia; SUN Jigui; LIN Hai; FENG Shasha

    2005-01-01

    Modal logics are good candidates for a formal theory of agents. The efficiency of reasoning methods in modal logics is very important, because it determines whether or not a reasoning method can be widely used in agent-based systems. In this paper, we modify the extension rule theorem proving method we presented before, and then apply it to P-logic, which is translated from modal logic by functional transformation. Finally, we give the proof of its soundness and completeness.

  10. A comparative study on change vector analysis based change detection techniques

    Indian Academy of Sciences (India)

    Sartajvir Singh; Rajneesh Talwar

    2014-12-01

    Detection of Earth surface changes is essential for monitoring regional climate, snow avalanche hazards and energy balance changes that occur due to air temperature irregularities. Geographic Information System (GIS) enables such research activities to be carried out through change detection analysis. From this viewpoint, different change detection algorithms have been developed for the land-use land-cover (LULC) domain. Among them, change vector analysis (CVA) has the capability of extracting maximum information, in terms of the overall magnitude of change and the direction of change, between multispectral bands from multi-temporal satellite data sets. Over the past two to three decades, many effective CVA based change detection techniques, e.g., improved change vector analysis (ICVA), modified change vector analysis (MCVA) and change vector analysis in posterior-probability space (CVAPS), have been developed to overcome the difficulties that exist in traditional change vector analysis (CVA). Moreover, many integrated techniques, such as cross correlogram spectral matching (CCSM) based CVA, CVA using enhanced principal component analysis (PCA) and an inverse triangular (IT) function, hyper-spherical direction cosine (HSDC), and median CVA (m-CVA), have been developed as effective LULC change detection tools. This paper comprises a comparative analysis of CVA based change detection techniques, namely CVA, MCVA, ICVA and CVAPS. This paper also summarizes the relevant integrated CVA techniques along with their characteristics, features and shortcomings. Based on the experiment outcomes, it has been evaluated that the CVAPS technique has greater potential than other CVA techniques to evaluate the overall transformed information over three different MODerate resolution Imaging Spectroradiometer (MODIS) satellite data sets of different regions. Results of this study are expected to be potentially useful for more accurate analysis of LULC changes which will, in turn...
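    The core CVA quantities named above, the overall magnitude and the direction of change, reduce to simple vector arithmetic per pixel. A minimal two-band sketch (the band values and threshold are made-up numbers):

```python
import math

def change_vectors(bands_t1, bands_t2):
    """Per-pixel change magnitude and direction between two dates.
    bands_t*: per-band values for one pixel, e.g. [red, NIR]."""
    diff = [b2 - b1 for b1, b2 in zip(bands_t1, bands_t2)]
    magnitude = math.sqrt(sum(d * d for d in diff))
    # Direction in the band-difference space (2-band case: an angle in degrees).
    direction = math.degrees(math.atan2(diff[1], diff[0])) % 360
    return magnitude, direction

def classify(magnitude, threshold):
    """Threshold the magnitude to separate change from no-change pixels."""
    return "change" if magnitude > threshold else "no change"

# Toy pixel in (red, NIR) space: vegetation loss raises red and lowers NIR.
mag, ang = change_vectors([0.10, 0.60], [0.30, 0.20])
label = classify(mag, threshold=0.2)
```

    Variants such as MCVA and CVAPS mainly change how the threshold is chosen and how the direction information is partitioned into change categories.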

  11. Performance index based learning controls for the partial non-regular systems using lifting technique

    Institute of Scientific and Technical Information of China (English)

    Shengyue YANG; Xiaoping FAN; Zhihua QU

    2009-01-01

    Deficiencies of the performance-based iterative learning control (ILC) for non-regular systems are investigated in detail; then a faster control input updating and lifting technique is introduced in the design of performance index based ILCs for partial non-regular systems. Two kinds of optimal ILCs based on different performance indices are considered. Finally, simulation examples are given to illustrate the feasibility of the proposed learning controls.

  12. Efficient techniques for wave-based sound propagation in interactive applications

    Science.gov (United States)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. This spherical harmonic-based representation of source directivity can support analytical, data

  13. Comparison of acrylamide intake from Western and guideline based diets using probabilistic techniques and linear programming.

    Science.gov (United States)

    Katz, Josh M; Winter, Carl K; Buttrey, Samuel E; Fadel, James G

    2012-03-01

    Western and guideline based diets were compared to determine if dietary improvements resulting from following dietary guidelines reduce acrylamide intake. Acrylamide forms in heat treated foods and is a human neurotoxin and animal carcinogen. Acrylamide intake from the Western diet was estimated with probabilistic techniques using teenage (13-19 years) National Health and Nutrition Examination Survey (NHANES) food consumption estimates combined with FDA data on the levels of acrylamide in a large number of foods. Guideline based diets were derived from NHANES data using linear programming techniques to comport to recommendations from the Dietary Guidelines for Americans, 2005. Whereas the guideline based diets were more properly balanced and rich in consumption of fruits, vegetables, and other dietary components than the Western diets, acrylamide intake (mean±SE) was significantly greater (P < 0.05) from the guideline based diets. Results demonstrate that linear programming techniques can be used to model specific diets for the assessment of toxicological and nutritional dietary components.
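    The probabilistic intake estimate can be sketched with a small Monte Carlo model. All food items, consumption figures and acrylamide concentrations below are illustrative placeholders, not the NHANES/FDA data used in the study:

```python
import random, statistics

# Hypothetical per-food data: (mean daily consumption in g, acrylamide ug per kg food).
FOODS = {
    "french fries": (40, 350),
    "potato chips": (15, 550),
    "breakfast cereal": (30, 150),
    "bread": (60, 30),
}

def simulate_intake(foods, body_kg=60.0, n=20000, seed=0):
    """Monte Carlo intake estimate: vary daily consumption around its mean and
    accumulate acrylamide, returning a distribution in ug/kg body weight/day."""
    rng = random.Random(seed)
    intakes = []
    for _ in range(n):
        total_ug = 0.0
        for grams, conc in foods.values():
            eaten = max(0.0, rng.gauss(grams, 0.4 * grams))  # g of this food today
            total_ug += eaten / 1000.0 * conc                # ug acrylamide ingested
        intakes.append(total_ug / body_kg)
    return intakes

intakes = simulate_intake(FOODS)
mean_intake = statistics.mean(intakes)
p95 = sorted(intakes)[int(0.95 * len(intakes))]
```

    A full assessment would replace the Gaussian consumption model with survey-derived distributions and report upper percentiles alongside the mean, as exposure studies typically do.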

  14. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm.

    Science.gov (United States)

    Kim, Mi Jeong; Maeng, Sung Joon; Cho, Yong Soo

    2015-07-28

    In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation.
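    A distributed synchronization step of this flavor can be sketched as an averaging consensus on clock offsets, where each node uses only what it hears from its neighbors. This toy model ignores the propagation-delay compensation that is central to the paper and is not its OFDMA signal design; the topology and coupling gain are assumptions:

```python
# Distributed clock-offset consensus: each node repeatedly nudges its own
# offset toward the mean of the offsets it hears from its neighbors.
def sync_offsets(offsets, neighbors, coupling=0.5, rounds=50):
    offsets = list(offsets)
    for _ in range(rounds):
        new = []
        for i, off in enumerate(offsets):
            avg = sum(offsets[j] for j in neighbors[i]) / len(neighbors[i])
            new.append(off + coupling * (avg - off))
        offsets = new
    return offsets

# 4-node ring topology with initially scattered clock offsets (in microseconds).
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
start = [0.0, 8.0, 3.0, -5.0]
final = sync_offsets(start, neighbors)
spread = max(final) - min(final)
```

    Because each update is symmetric, the network average is preserved and the nodes converge to a common offset without any feedback channel or coordinator.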

  15. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm

    Directory of Open Access Journals (Sweden)

    Mi Jeong Kim

    2015-07-01

    Full Text Available In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation.

  16. The extension of a DNA double helix by an additional Watson-Crick base pair on the same backbone

    DEFF Research Database (Denmark)

    Kumar, P.; Sharma, P. K.; Madsen, Charlotte S.

    2013-01-01

    Additional base pair: The DNA duplex can be extended with an additional Watson-Crick base pair on the same backbone by the use of double-headed nucleotides. These also work as compressed dinucleotides and form two base pairs with cognate nucleobases on the opposite strand.

  17. Extensive soft-sediment deformation and peperite formation at the base of a rhyolite lava: Owyhee Mountains, SW Idaho, USA

    Science.gov (United States)

    McLean, Charlotte E.; Brown, David J.; Rawcliffe, Heather J.

    2016-06-01

    In the Northern Owyhee Mountains (SW Idaho), a >200-m-thick flow of the Miocene Jump Creek Rhyolite was erupted on to a sequence of tuffs, lapilli tuffs, breccias and lacustrine siltstones of the Sucker Creek Formation. The rhyolite lava flowed over steep palaeotopography, resulting in the forceful emplacement of lava into poorly consolidated sediments. The lava invaded this sequence, liquefying and mobilising the sediment, propagating sediment subvertically in large metre-scale fluidal diapirs and sediment injectites. The heat and the overlying pressure of the thick Jump Creek Rhyolite extensively liquefied and mobilised the sediment resulting in the homogenization of the Sucker Creek Formation units, and the formation of metre-scale loading structures (simple and pendulous load casts, detached pseudonodules). Density contrasts between the semi-molten rhyolite and liquefied sediment produced highly fluidal Rayleigh-Taylor structures. Local fluidisation formed peperite at the margins of the lava and elutriation structures in the disrupted sediment. The result is a 30-40-m zone beneath the rhyolite lava of extremely deformed stratigraphy. Brittle failure and folding is recorded in more consolidated sediments, indicating a differential response to loading due to the consolidation state of the sediments. The lava-sediment interaction is interpreted as being a function of (1) the poorly consolidated nature of the sediments, (2) the thickness and heat retention of the rhyolite lava, (3) the density contrast between the lava and the sediment and (4) the forceful emplacement of the lava. This study demonstrates how large lava bodies have the potential to extensively disrupt sediments and form significant lateral and vertical discontinuities that complicate volcanic facies architecture.

  18. A novel and alternative approach to controlled release drug delivery system based on solid dispersion technique

    Directory of Open Access Journals (Sweden)

    Tapan Kumar Giri

    2012-12-01

    Full Text Available The solid dispersion method was originally used to improve the dissolution properties and the bioavailability of poorly water soluble drugs by dispersing them into water soluble carriers. In addition, dissolution retardation through the solid dispersion technique, using water insoluble and water swellable polymers for the development of controlled release dosage forms, has become a field of interest in recent years. Development of controlled release solid dispersions has the great advantage of bypassing the risk of a burst release of drug, since the structure of the solid dispersion is monolithic and the drug molecules are homogeneously dispersed. Despite the remarkable potential and extensive research being conducted on controlled release solid dispersion systems, commercialization and large scale production are limited. The author expects that recent technological advances may overcome the existing limitations and facilitate the commercial utilization of the techniques for the manufacture of controlled release solid dispersions. This article begins with an overview of the different carriers being used for the preparation of controlled release solid dispersions, and also the different techniques being used for the purpose. The kinetics of drug release from these controlled release solid dispersions and the relevant mathematical modeling have also been reviewed in this manuscript.

  19. Nonlinear Second-Order Partial Differential Equation-Based Image Smoothing Technique

    Directory of Open Access Journals (Sweden)

    Tudor Barbu

    2016-09-01

    Full Text Available A second-order nonlinear parabolic PDE-based restoration model is provided in this article. The proposed anisotropic diffusion-based denoising approach is based on robust versions of the edge-stopping function and of the conductance parameter. Two stable and consistent approximation schemes are then developed for this differential model. Our PDE-based filtering technique achieves efficient noise removal while preserving the edges and other image features. It outperforms both conventional filters and many other PDE-based denoising approaches, as demonstrated by the successful experiments and the method comparisons performed.
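    The classic second-order nonlinear diffusion scheme on which such models build is the Perona-Malik iteration. A sketch with the standard exponential edge-stopping function follows (the article's robust variants of the edge-stopping function and conductance parameter differ; the parameters below are assumptions):

```python
import numpy as np

def perona_malik(img, iterations=30, kappa=15.0, dt=0.2):
    """Anisotropic diffusion: smooth flat regions while preserving strong edges."""
    u = img.astype(float).copy()
    for _ in range(iterations):
        # Finite differences toward the four neighbors (zero-flux borders).
        dn = np.roll(u, 1, 0) - u;  dn[0] = 0
        ds = np.roll(u, -1, 0) - u; ds[-1] = 0
        de = np.roll(u, -1, 1) - u; de[:, -1] = 0
        dw = np.roll(u, 1, 1) - u;  dw[:, 0] = 0
        # Edge-stopping function g(|grad|) = exp(-(|grad|/kappa)^2): near 1 in
        # flat regions (diffuse), near 0 across strong edges (preserve).
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Noisy step edge: diffusion should flatten the noise but keep the step.
rng = np.random.default_rng(0)
img = np.zeros((32, 32)); img[:, 16:] = 100.0
noisy = img + 5.0 * rng.standard_normal(img.shape)
smoothed = perona_malik(noisy)
```

    With dt * 4 < 1 the explicit scheme stays stable; kappa sets the gradient scale separating "noise" from "edge".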

  20. Extensive Reading Coursebooks in China

    Science.gov (United States)

    Renandya, Willy A.; Hu, Guangwei; Xiang, Yu

    2015-01-01

    This article reports on a principle-based evaluation of eight dedicated extensive reading coursebooks published in mainland China and used in many universities across the country. The aim is to determine the extent to which these coursebooks reflect a core set of nine second language acquisition and extensive reading principles. Our analysis shows…

  1. Microcapsule-based techniques for improving the safety of lithium-ion batteries

    Science.gov (United States)

    Baginska, Marta

    Lithium-ion batteries are vital energy storage devices due to their high specific energy density, lack of memory effect, and long cycle life. While they are predominantly used in small consumer electronics, new strategies for improving battery safety and lifetime are critical to the successful implementation of high-capacity, fast-charging materials required for advanced Li-ion battery applications. Currently, the presence of a volatile, combustible electrolyte and an oxidizing agent (lithium oxide cathodes) makes the Li-ion cell susceptible to fire and explosions. Thermal overheating, electrical overcharging, or mechanical damage can trigger thermal runaway, and if left unchecked, combustion of battery materials. To improve battery safety, autonomic, thermally-induced shutdown of Li-ion batteries is demonstrated by depositing thermoresponsive polymer microspheres onto battery anodes. When the internal temperature of the cell reaches a critical value, the microspheres melt and conformally coat the anode and/or separator with an ion-insulating barrier, halting Li-ion transport and shutting down the cell permanently. Charge and discharge capacity is measured for Li-ion coin cells containing microsphere-coated anodes or separators as a function of capsule coverage. Scanning electron microscopy images of electrode surfaces from cells that have undergone autonomic shutdown provide evidence of melting, wetting, and re-solidification of polyethylene (PE) into the anode and polymer film formation at the anode/separator interface. As an extension of this autonomic shutdown approach, a particle-based separator capable of performing autonomic shutdown, but which reduces the shorting hazard posed by current bi- and tri-polymer commercial separators, is presented. This dual-particle separator is composed of hollow glass microspheres acting as a physical spacer between electrodes, and PE microspheres to impart autonomic shutdown functionality. An oil-immersion technique is...

  2. Key Techniques for the Development of Web-Based PDM System

    Institute of Scientific and Technical Information of China (English)

    WANG Li-juan; ZHANG Xu; NING Ru-xin

    2006-01-01

    Some key techniques for the development of a web-based product data management (PDM) system are introduced. The four-tiered B/S architecture of the PDM system BITPDM is presented first, followed by its design and implementation, including the virtual data vault, a flexible coding system, document management, product structure and configuration management, workflow/process management, and product maturity management. BITPDM can facilitate activities from the new-product-introduction phase through manufacturing, and manages product data and their dynamic change history. Based on Microsoft .NET, XML, web service, and SOAP techniques, BITPDM realizes the integration and efficient management of product information.

  3. HACIA LA EXTENSION DEL MÉTODO GRAY WATCH BASADO EN EL ESTÁNDAR DE CALIDAD ISO/IEC 25010 // TOWARDS THE EXTENSION OF THE GRAY WATCH METHOD BASED ON THE QUALITY STANDARD ISO/IEC 25010

    Directory of Open Access Journals (Sweden)

    Jorge Luis Pérez-Medina

    2012-06-01

    Talking about software quality implies the need for parameters that establish the minimum levels a product of this type must reach in order to be considered of quality. This paper proposes an extension of the GRAY WATCH method, specifically in the technical processes of analysis and design, coupling the products obtained to the implementation process. Our proposal uses the product quality standard ISO/IEC 25010, which establishes criteria for the specification of quality requirements of software products, their metrics, and their evaluation, and includes a quality model composed of characteristics and subcharacteristics. The result of this proposal adds significant value to the extended method, allowing system analysts and computing professionals to specify the precise activities to be performed to obtain quality requirements. This work builds on the software-quality-based domain engineering process named InDoCaS as the methodology for defining activities and products in the analysis, design, and implementation processes of the application.

  4. An improved visualization-based force-measurement technique for short-duration hypersonic facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurence, Stuart J.; Karl, Sebastian [Institute of Aerodynamics and Flow Technology, Spacecraft Section, German Aerospace Center (DLR), Goettingen (Germany)

    2010-06-15

    This article is concerned with describing and exploring the limitations of an improved version of a recently proposed visualization-based technique for the measurement of forces and moments in short-duration hypersonic wind tunnels. The technique is based on tracking the motion of a free-flying body over a sequence of high-speed visualizations; while this idea is not new in itself, the use of high-speed digital cinematography combined with a highly accurate least-squares tracking algorithm allows improved results over what has previously been possible with such techniques. The precision of the technique is estimated through the analysis of artificially constructed and experimental test images, and the resulting error in acceleration measurements is characterized. For wind-tunnel scale models, position measurements to within a few microns are shown to be readily attainable. Image data from two previous experimental studies in the T5 hypervelocity shock tunnel are then reanalyzed with the improved technique: the uncertainty in the mean drag acceleration is shown to be reduced to the order of the flow unsteadiness, 2-3%, and time-resolved acceleration measurements are also shown to be possible. The response time of the technique for the configurations studied is estimated to be ~0.5 ms. Comparisons with computations using the DLR TAU code also yield agreement to within the overall experimental uncertainty. Measurement of the pitching moment for blunt geometries still appears challenging, however. (orig.)
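
    The acceleration-from-position idea above can be sketched in a few lines. This is a hedged illustration, not the authors' tracker: the function name `drag_acceleration`, the frame rate, and the synthetic data are assumptions, and a constant acceleration (pure drag) is recovered with an ordinary quadratic least-squares fit.

```python
import numpy as np

def drag_acceleration(t, x):
    """Recover a constant acceleration from tracked positions x(t)
    via the least-squares quadratic fit x = x0 + v0*t + 0.5*a*t**2."""
    coeffs = np.polyfit(t, x, deg=2)  # coefficients [a/2, v0, x0]
    return 2.0 * coeffs[0]

# Synthetic run: 1 kHz framing over 20 ms, true a = -50 m/s^2, and
# micron-level position noise (the precision quoted in the abstract).
rng = np.random.default_rng(0)
t = np.arange(0, 0.02, 1e-3)
x = 0.5 * (-50.0) * t**2 + 1e-6 * rng.standard_normal(t.size)
a_est = drag_acceleration(t, x)
```

    With micron-level position noise the fitted acceleration should land within a small fraction of the true value, consistent with the few-percent uncertainties quoted above.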

  5. An improved visualization-based force-measurement technique for short-duration hypersonic facilities

    Science.gov (United States)

    Laurence, Stuart J.; Karl, Sebastian

    2010-06-01

    This article is concerned with describing and exploring the limitations of an improved version of a recently proposed visualization-based technique for the measurement of forces and moments in short-duration hypersonic wind tunnels. The technique is based on tracking the motion of a free-flying body over a sequence of high-speed visualizations; while this idea is not new in itself, the use of high-speed digital cinematography combined with a highly accurate least-squares tracking algorithm allows improved results over what has previously been possible with such techniques. The precision of the technique is estimated through the analysis of artificially constructed and experimental test images, and the resulting error in acceleration measurements is characterized. For wind-tunnel scale models, position measurements to within a few microns are shown to be readily attainable. Image data from two previous experimental studies in the T5 hypervelocity shock tunnel are then reanalyzed with the improved technique: the uncertainty in the mean drag acceleration is shown to be reduced to the order of the flow unsteadiness, 2-3%, and time-resolved acceleration measurements are also shown to be possible. The response time of the technique for the configurations studied is estimated to be ~0.5 ms. Comparisons with computations using the DLR TAU code also yield agreement to within the overall experimental uncertainty. Measurement of the pitching moment for blunt geometries still appears challenging, however.

  6. On some logical and algebraic properties of axiomatic extensions of the monoidal t-norm based logic MTL related with single chain completeness

    CERN Document Server

    Bianchi, Matteo

    2012-01-01

    In [Mon11], the properties of single-chain completeness are studied for the axiomatic extensions of the monoidal t-norm based logic ([EG01]). On the other hand, [GJKO07, Chapter 5] studies many logical and algebraic properties (such as Halldén completeness, variable separation properties, the amalgamation property, etc.) in the context of substructural logics. The aim of this paper is twofold: first, we specialize the properties studied in [GJKO07, Chapter 5] from the case of substructural logics to that of extensions of MTL, obtaining some general characterizations. Moreover, we show that some of these properties are in fact strictly connected to the topics developed in [Mon11]. This helps to build a better intuition concerning some open problems of [Mon11].

  7. Evaluation of paint coating thickness variations based on pulsed Infrared thermography laser technique

    Science.gov (United States)

    Mezghani, S.; Perrin, E.; Vrabie, V.; Bodnar, J. L.; Marthe, J.; Cauwe, B.

    2016-05-01

    In this paper, a pulsed infrared thermography technique using homogeneous heating provided by a laser source is used for the non-destructive evaluation of paint coating thickness variations. Firstly, numerical simulations of the thermal response of a paint-coated sample are performed. By analyzing the thermal responses as a function of the thermal properties and thickness of both the coating and substrate layers, optimal excitation parameters of the heating source are determined. Two characteristic parameters were studied with respect to variations in the paint coating layer thickness. Results obtained using an experimental test bench based on the pulsed laser infrared thermography technique are compared with those given by a classical eddy current technique for paint coating thicknesses from 5 to 130 μm. These results demonstrate the efficiency of this approach and suggest that the pulsed infrared thermography technique offers good prospects for characterizing the heterogeneity of paint coatings on large-scale samples with other heating sources.

  8. Cross-correlation function based multipath mitigation technique for cosine-BOC signals

    Institute of Scientific and Technical Information of China (English)

    Huihua Chen; Weimin Jia; Minli Yao

    2013-01-01

    We propose a new multipath mitigation technique based on a cross-correlation function for the new cosine-phased binary offset carrier (cosine-BOC) modulated signals, which will most likely be employed in both the European Galileo system and the Chinese Compass system. The technique creates an optimum cross-correlation function by designing the modulated symbols of the local signal, and the structure of the resulting code tracking loop for cosine-BOC signals is quite simple, including only two real correlators. Results demonstrate that the technique efficiently eliminates the ranging errors in the medium and long multipath regions with respect to conventional receiver correlation techniques.
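
    The correlator structure can be sketched as follows. This is a hedged illustration under simplifying assumptions: a cosine-phased BOC(n,n) waveform is built as a ±1 spreading code times the segment pattern [+1, -1, -1, +1], and the local signal is simply a replica — the paper's actual contribution, designing the local modulated symbols to shape the cross-correlation, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
prn = rng.choice([-1.0, 1.0], size=256)          # spreading code chips

# Cosine-phased BOC(n,n): each chip is multiplied by the sign of a
# cosine subcarrier, i.e. the segment pattern [+1, -1, -1, +1]
# (four samples per chip).
subcarrier = np.array([1.0, -1.0, -1.0, 1.0])
boc = (prn[:, None] * subcarrier).ravel()

# Local reference: here a plain replica; a receiver following the paper
# would instead design these modulated symbols.
local = boc.copy()

ccf = np.correlate(boc, local, mode="full")
peak_lag = ccf.argmax() - (boc.size - 1)         # lag of the correlation peak
```

    The code tracking loop then steers the local signal so that the correlation peak stays at zero lag.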

  9. Interference Mitigation Technique for Coexistence of Pulse-Based UWB and OFDM

    Directory of Open Access Journals (Sweden)

    Tetsushi Ikegami

    2008-04-01

    Ultra-wideband (UWB) is a useful radio technique for sharing frequency bands between radio systems. It uses very short pulses to spread its spectrum. However, there is a potential for interference between systems using the same frequency bands at close range. In some regulatory regimes, interference detection and avoidance (DAA) techniques are required to prevent interference with existing radio systems. In this paper, the effect of interference from pulse-based UWB on orthogonal frequency division multiplexing (OFDM) signals is discussed, and an interference mitigation technique is proposed. This technique focuses on the pulse repetition cycle of UWB: the pulse repetition interval is set to the period of the OFDM symbol excluding the guard interval, or to half that period, to mitigate interference. The same proposals are also made for direct-sequence (DS) UWB. Bit error rate (BER) performance is illustrated through both simulation and theoretical approximations.
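
    One way to see the benefit of tying the pulse repetition interval to the OFDM symbol period is a small FFT experiment. In this hedged sketch (the FFT size, pulse position, and noiseless setup are arbitrary choices, not taken from the paper), two identical UWB impulses half a symbol period apart fall inside one FFT window; their contributions cancel on the odd-numbered subcarriers, so those subcarriers see no interference at all.

```python
import numpy as np

N = 64                                  # OFDM FFT size (guard interval ignored)
sym = np.zeros(N, dtype=complex)        # empty OFDM symbol for clarity

# UWB pulse train whose repetition interval is half the useful OFDM
# symbol period: two identical impulses, N/2 samples apart, per window.
interference = np.zeros(N)
interference[[5, 5 + N // 2]] = 1.0

rx_bins = np.fft.fft(sym + interference)
odd_power = np.abs(rx_bins[1::2]) ** 2   # interference on odd subcarriers
even_power = np.abs(rx_bins[0::2]) ** 2  # interference on even subcarriers
```

    The two impulses add coherently on even subcarriers and cancel on odd ones, which is the kind of structured, avoidable interference the proposal exploits.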

  10. Interference Mitigation Technique for Coexistence of Pulse-Based UWB and OFDM

    Directory of Open Access Journals (Sweden)

    Ohno Kohei

    2008-01-01

    Ultra-wideband (UWB) is a useful radio technique for sharing frequency bands between radio systems. It uses very short pulses to spread its spectrum. However, there is a potential for interference between systems using the same frequency bands at close range. In some regulatory regimes, interference detection and avoidance (DAA) techniques are required to prevent interference with existing radio systems. In this paper, the effect of interference from pulse-based UWB on orthogonal frequency division multiplexing (OFDM) signals is discussed, and an interference mitigation technique is proposed. This technique focuses on the pulse repetition cycle of UWB: the pulse repetition interval is set to the period of the OFDM symbol excluding the guard interval, or to half that period, to mitigate interference. The same proposals are also made for direct-sequence (DS) UWB. Bit error rate (BER) performance is illustrated through both simulation and theoretical approximations.

  11. DCT-Yager FNN: a novel Yager-based fuzzy neural network with the discrete clustering technique.

    Science.gov (United States)

    Singh, A; Quek, C; Cho, S Y

    2008-04-01

    superior performance. Extensive experiments have been conducted to test the effectiveness of these two networks, using various clustering algorithms. It follows that the SDCT and UDCT clustering algorithms are particularly suited to networks based on the Yager inference rule.

  12. Mobility based key management technique for multicast security in mobile ad hoc networks.

    Science.gov (United States)

    Madhusudhanan, B; Chitra, S; Rajan, C

    2015-01-01

    In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques cope poorly with frequent mobility and disconnections, so there is a need for a multicast key management technique that overcomes these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated from link availability and mobility. A multicast tree is constructed such that every weak node has a strong parent node. A session-key-based encryption technique is utilized to transmit multicast data. The rekeying process is performed periodically by the initiator node, and the rekeying interval is fixed depending on the node category, so the technique greatly minimizes rekeying overhead. Simulation results show that the proposed approach reduces the packet drop rate and improves data confidentiality.
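
    The stability-index and category-dependent rekeying ideas can be sketched in a few lines. The weighting, threshold, base interval, and key-derivation choices below are illustrative assumptions, not the paper's exact formulas.

```python
import hashlib

def stability_index(link_availability, mobility):
    # Both inputs normalised to [0, 1]; higher link availability and
    # lower mobility make a node more stable (assumed equal weighting).
    return 0.5 * link_availability + 0.5 * (1.0 - mobility)

def rekey_interval(index, base_interval=30.0):
    # Strong (stable) nodes tolerate longer intervals between rekeys,
    # reducing rekeying overhead; weak nodes keep the base interval.
    return base_interval * (2.0 if index >= 0.5 else 1.0)

def session_key(group_secret, epoch):
    # Session key derived per rekeying epoch by the initiator node.
    return hashlib.sha256(group_secret + epoch.to_bytes(4, "big")).digest()

idx = stability_index(0.9, 0.2)      # a stable node
interval = rekey_interval(idx)
key = session_key(b"group-secret", epoch=1)
```

    In the multicast tree, each weak node would be attached under a parent whose index clears the threshold, so rekeying traffic concentrates on stable links.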

  13. Vision-based system identification technique for building structures using a motion capture system

    Science.gov (United States)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures using a motion capture system (MCS). The MCS, with outstanding capabilities for dynamic response measurement, can provide gauge-free measurements of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequencies, mode shapes, and damping ratios) of building structures are extracted from the dynamic displacement responses measured by the MCS, after converting the displacements to accelerations and conducting SI by frequency domain decomposition (FDD). A free-vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI directly applying the MCS-measured displacements to FDD was performed and yielded results identical to those of the conventional SI method.
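
    The displacement-to-acceleration conversion and the FDD step can be sketched on synthetic data. This is a hedged, single-mode illustration: the sampling rate, mode shape, and the plain periodogram-style cross-spectral estimate are assumptions (a practical FDD would use an averaged spectral estimator), but the pipeline — differentiate twice, form the cross-spectral matrix per frequency, take its SVD, and read modes off the first singular value — follows the description above.

```python
import numpy as np

fs, f0 = 200.0, 5.0
t = np.arange(0, 20, 1 / fs)
# Synthetic two-marker displacement of a single 5 Hz mode, shape [1, 0.6].
disp = np.outer(np.array([1.0, 0.6]), np.sin(2 * np.pi * f0 * t))

# Step 1: displacement -> acceleration (second finite difference).
acc = np.gradient(np.gradient(disp, 1 / fs, axis=1), 1 / fs, axis=1)

# Step 2: frequency domain decomposition - SVD of the cross-spectral
# matrix at each frequency; peaks of the first singular value mark modes.
spec = np.fft.rfft(acc, axis=1)
freqs = np.fft.rfftfreq(acc.shape[1], 1 / fs)
s1 = np.empty(freqs.size)
shapes = np.empty((freqs.size, 2))
for i in range(freqs.size):
    G = np.outer(spec[:, i], spec[:, i].conj())   # cross-spectral matrix
    U, S, _ = np.linalg.svd(G)
    s1[i] = S[0].real
    shapes[i] = np.abs(U[:, 0])

f_hat = freqs[s1.argmax()]            # identified natural frequency
mode = shapes[s1.argmax()]            # identified mode shape (unsigned)
```

    On this noiseless example the peak falls on the 5 Hz bin and the singular vector reproduces the 1 : 0.6 mode shape.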

  14. High-resolution remote sensing image-based extensive deformation-induced landslide displacement field monitoring method

    Institute of Scientific and Technical Information of China (English)

    Shanjun Liu; Han Wang; Jianwei Huang; Lixin Wu

    2015-01-01

    Landslides are among the many serious geological hazards; the key to their control and mitigation lies in dynamic monitoring and early warning. This article points out the insufficiency of traditional measuring means for large-scale landslide monitoring and proposes a method for extensive landslide displacement field monitoring using high-resolution remote sensing images. Matching of cognominal (corresponding) points is achieved by exploiting the invariance of the SIFT algorithm to image translation, rotation, scaling, and affine transformation, through recognition and comparison of features in high-resolution images from different landsliding periods. The landslide displacement vector field is then obtained by measuring the distances and directions between cognominal points. As evidenced by field application to landslide monitoring at the West Open-Pit Mine in Fushun city, China, the method can make areal measurements through satellite observation and obtain, at the same time, the information of a large-area intensive displacement field, facilitating automatic delimitation of the extent of the landslide displacement vector field and the sliding mass. This can serve as a basis for analyzing the laws governing the occurrence of landslides and for adopting countermeasures.
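
    The matching step can be sketched generically. This is a hedged illustration, not the authors' pipeline: SIFT keypoint extraction is assumed to have already produced positions and descriptors for the two acquisition epochs (here replaced by toy data), and `displacement_field` is a hypothetical helper that pairs each descriptor with its nearest neighbour and returns the per-point displacement vectors.

```python
import numpy as np

def displacement_field(pts_a, desc_a, pts_b, desc_b):
    """Match descriptors between epochs by nearest neighbour and return
    the displacement vector of each epoch-A point."""
    # Euclidean distance between every descriptor pair (A x B matrix).
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    return pts_b[nearest] - pts_a

# Toy data: three cognominal points shifted by (2, -1) pixels between
# epochs, with identical descriptors so the matching is unambiguous.
rng = np.random.default_rng(0)
desc = rng.standard_normal((3, 8))
pts_a = np.array([[10.0, 10.0], [50.0, 40.0], [80.0, 20.0]])
vectors = displacement_field(pts_a, desc, pts_a + [2.0, -1.0], desc)
```

    The magnitudes and directions of these vectors give the displacement field; in practice a ratio test and outlier rejection would precede this step.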

  15. Chaotic Extension Neural Network Theory-Based XXY Stage Collision Fault Detection Using a Single Accelerometer Sensor

    Directory of Open Access Journals (Sweden)

    Chin-Tsung Hsieh

    2014-11-01

    Collision fault detection for an XXY stage is proposed for the first time in this paper. The characteristic signals of the stage are extracted from its vibratory magnitude by signal filtering and imported into the master and slave chaos error systems. A trajectory diagram is made from the chaos synchronization dynamic error signals E1 and E2. The distance between the characteristic positive and negative centers of gravity, as well as the maximum and minimum distances of the trajectory diagram, are captured as features for fault recognition by observing the variation in the trajectory diagrams of the various signals. The matter-element models of normal status and collision status are built by an extension neural network, and the correlation grades of the various fault statuses of the XXY stage are calculated for diagnosis. dSPACE is used for real-time analysis of the stage fault status with an accelerometer sensor. Three stage statuses are detected in this study: normal status, Y collision fault, and X collision fault. It is shown that the scheme achieves at least a 75% diagnosis rate for collision faults of the XXY stage. As a result, the fault diagnosis system can be implemented using just one sensor, and consequently the hardware cost is significantly reduced.
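
    The trajectory-diagram features can be sketched as follows. This is a hedged guess at the feature definitions (the split into "positive" and "negative" points by the sign of E1, and the use of distances from the origin for the trajectory extremes, are assumptions not spelled out in the abstract).

```python
import numpy as np

def trajectory_features(e1, e2):
    """Features from the (E1, E2) trajectory diagram: distance between
    the positive and negative centers of gravity, plus the maximum and
    minimum radial extent of the trajectory."""
    pts = np.column_stack([e1, e2])
    pos = pts[e1 >= 0].mean(axis=0)        # positive center of gravity
    neg = pts[e1 < 0].mean(axis=0)         # negative center of gravity
    r = np.linalg.norm(pts, axis=1)
    return np.linalg.norm(pos - neg), r.max(), r.min()

# Toy synchronization-error samples standing in for filtered stage data.
e1 = np.array([0.5, 0.7, -0.4, -0.6])
e2 = np.array([0.1, -0.1, 0.2, -0.2])
gap, rmax, rmin = trajectory_features(e1, e2)
```

    These scalar features would then populate the matter-element model whose correlation grade the extension neural network evaluates per fault status.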

  16. Segmentation techniques evaluation based on a single compact breast mass classification scheme

    Science.gov (United States)

    Matheus, Bruno R. N.; Marcomini, Karem D.; Schiabel, Homero

    2016-03-01

    In this work, several segmentation techniques are evaluated using a simple centroid-based classification system for breast mass delineation in digital mammography images. The aim is to determine the best one for future CADx developments. Six techniques were tested: Otsu, SOM, EICAMM, Fuzzy C-Means, K-Means, and Level-Set. All of them were applied to segment 317 mammography images from the DDSM database. A single compact set of attributes was extracted, and two centroids were defined, one for malignant and another for benign cases. The final classification was based on proximity to a given centroid, and the best results were obtained by the Level-Set technique with an accuracy of 68.1%, which indicates this method as the most promising for breast mass segmentation aimed at more precise interpretation in CADx schemes.
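
    The centroid-based scheme lends itself to a compact sketch. The feature values, class labels, and helper names below are illustrative only: one centroid is fitted per class in feature space, and a segmented mass is labelled by whichever centroid is closer.

```python
import numpy as np

def fit_centroids(features, labels):
    # One mean feature vector (centroid) per class.
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(x, centroids):
    # Label of the nearest centroid in Euclidean distance.
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy feature vectors standing in for the compact attribute set
# extracted from each segmented mass.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array(["benign", "benign", "malignant", "malignant"])
centroids = fit_centroids(X, y)
label = classify(np.array([0.85, 0.85]), centroids)
```

    Because the classifier is this simple, differences in accuracy across the six segmentation techniques can be attributed to the quality of the delineation rather than the classifier.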

  17. Novel stability criteria for fuzzy Hopfield neural networks based on an improved homogeneous matrix polynomials technique

    Institute of Scientific and Technical Information of China (English)

    Feng Yi-Fu; Zhang Qing-Ling; Feng De-Zhi

    2012-01-01

    The global stability problem of Takagi-Sugeno (T-S) fuzzy Hopfield neural networks (FHNNs) with time delays is investigated. Novel LMI-based stability criteria are obtained by using Lyapunov functional theory to guarantee the asymptotic stability of the FHNNs with less conservatism. Firstly, using both Finsler's lemma and an improved homogeneous matrix polynomial technique, and applying an affine parameter-dependent Lyapunov-Krasovskii functional, we obtain the convergent LMI-based stability criteria. Algebraic properties of the fuzzy membership functions in the unit simplex are considered in the process of stability analysis via the homogeneous matrix polynomial technique. Secondly, to further reduce the conservatism, a new right-hand-side slack-variable-introducing technique is also proposed in terms of LMIs, which is suitable for the homogeneous matrix polynomial setting. Finally, two illustrative examples are given to show the efficiency of the proposed approaches.

  18. Novel technique for distributed fibre sensing based on coherent Rayleigh scattering measurements of birefringence

    Science.gov (United States)

    Lu, Xin; Soto, Marcelo A.; Thévenaz, Luc

    2016-05-01

    A novel distributed fibre sensing technique is described and experimentally validated, based on birefringence measurements using coherent Rayleigh scattering. It natively provides distributed measurements of temperature and strain with more than an order of magnitude higher sensitivity than Brillouin sensing, while requiring access to only a single fibre end. Unlike traditional Rayleigh-based coherent optical time-domain reflectometry, this new method provides absolute measurements of the measurand and may lead to a robust discrimination between temperature and strain in combination with another technique. Since birefringence is purposely induced in the fibre by design, large degrees of freedom are offered to optimize and scale the sensitivity to a given quantity. The technique has been validated in two radically different types of birefringent fibres - elliptical-core and Panda polarization-maintaining fibres - with good repeatability.

  19. Randomization techniques for the intensity modulation-based quantum stream cipher and progress of experiment

    Science.gov (United States)

    Kato, Kentaro; Hirota, Osamu

    2011-08-01

    The quantum-noise-based direct encryption protocol Y-00 is expected to provide physical-complexity-based security, which is thought to be comparable to information-theoretic security in mathematical cryptography, for the physical layer of fiber-optic communication systems. So far, several randomization techniques for the quantum stream cipher by the Y-00 protocol have been proposed, but most of them were developed under the assumption that phase shift keying is used as the modulation format. On the other hand, recent progress in the experimental study of the intensity-modulation-based quantum stream cipher by the Y-00 protocol raises expectations for its realization. The purpose of this paper is to present design and implementation methods for a composite model of the intensity-modulation-based quantum stream cipher with some randomization techniques. As a result, this paper gives a viewpoint on how the Y-00 cryptosystem can be miniaturized.
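
    As a rough illustration of intensity-modulation-based encryption (not a validated Y-00 implementation; the level mapping and all parameters are assumptions): a shared running key selects one of M intensity bases per symbol, the data bit picks one of that basis's two levels with a basis-dependent polarity, and the legitimate receiver, knowing the key, inverts the mapping. An eavesdropper without the key faces 2M closely spaced intensity levels masked by shot noise.

```python
import numpy as np

M = 64                                           # number of intensity bases
rng_key = np.random.default_rng(seed=12345)      # stands in for the shared key
data = np.array([0, 1, 1, 0, 1])

# Running key: one basis index per transmitted symbol.
basis = rng_key.integers(0, M, size=data.size)

# Basis-dependent polarity, then one of 2*M intensity levels per symbol.
bprime = data ^ (basis & 1)
level_idx = basis + bprime * M                   # level index in [0, 2*M)
levels = level_idx / (2 * M)                     # normalised intensity

# Legitimate receiver: knowing the basis, invert the mapping exactly.
recovered = (level_idx >= M).astype(int) ^ (basis & 1)
```

    Randomization techniques in the paper's sense would further scramble this mapping; the sketch only shows the basic keyed intensity-level selection.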

  20. THE USE OF SCAFFOLDING TECHNIQUE TO IMPROVE THE STUDENTS’ COMPETENCE IN WRITING GENRE-BASED TEXTS

    Directory of Open Access Journals (Sweden)

    Sri Mulatsih

    2011-04-01

    Full Text Available Dalam menulis teks yang bersifat genre-based, para mahasiswa masih mengalami kesulitan, khususnya tentang bagaimana mengembangkan ide, membangun sturktur tematik yang benar, dan menggunakan cirri lexico-grammar pada teks. Untuk mengatasi masalah itu the scaffolding technique dibutuhkan dalam kelas Writing. Teknik itu diberikan untuk membantu para mahasiswa menulis sebuah teks, termasuk, persiapan, presentasi dan refleksi. 25 mahasiswa Fakultas Bahasa & Sastra, Dian Nuswantoro, dipilih untuk diberi pengarahan dan diminta untuk menulis teks genre-based dalam bahasa Inggris. Siklusnya diulang tiga kali. Hasilnya menunjukkan peningkatan yang signifikan pada kompetensi menulis teks yang bersifat genre-based itu.   Keywords    :    Genre-Based Writing, Scaffolding Technique, Students’ competence, Texts, Teaching.