WorldWideScience

Sample records for base extension technique

  1. High-extensible scene graph framework based on component techniques

    Institute of Scientific and Technical Information of China (English)

    LI Qi-cheng; WANG Guo-ping; ZHOU Feng

    2006-01-01

    In this paper, a novel component-based scene graph is proposed, in which all objects in the scene are classified into different entities, and a scene can be represented as a hierarchical graph composed of instances of entities. Each entity contains basic data and operations, which are encapsulated into the entity component. An entity possesses certain behaviours, which are responses to rules and interactions defined by the high-level application; such behaviours can be described by scripts or behaviour models. The component-based scene graph presented here is more abstract and higher-level than traditional scene graphs. The contents of a scene can be extended flexibly by adding new entities and new entity components, and behaviour can be modified by changing the model components or behaviour scripts. Its robustness and efficiency are verified by many examples implemented in the Virtual Scenario developed by Peking University.

  2. An enhanced single base extension technique for the analysis of complex viral populations.

    Directory of Open Access Journals (Sweden)

    Dale R Webster

    Full Text Available Many techniques for the study of complex populations provide either specific information on a small number of variants or general information on the entire population. Here we describe a powerful new technique for elucidating mutation frequencies at each genomic position in a complex population. This single base extension (SBE based microarray platform was designed and optimized using poliovirus as the target genotype, but can be easily adapted to assay populations derived from any organism. The sensitivity of the method was demonstrated by accurate and consistent readouts from a controlled population of mutant genotypes. We subsequently deployed the technique to investigate the effects of the nucleotide analog ribavirin on a typical poliovirus population through two rounds of passage. Our results show that this economical platform can be used to investigate dynamic changes occurring at frequencies below 1% within a complex nucleic acid population. Given that many key aspects of the study and treatment of disease are intimately linked to population-level genomic diversity, our SBE-based technique provides a scalable and cost-effective complement to both traditional and next generation sequencing methodologies.
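
The per-position readout described above reduces to estimating, at each genomic position, what fraction of the signal comes from non-reference bases. A minimal sketch of that arithmetic (the channel naming and total-signal normalization are illustrative assumptions, not the authors' actual pipeline):

```python
# Hypothetical sketch: estimating per-position mutation frequencies from
# four-channel single base extension (SBE) signal intensities.
# Normalizing by total signal is an assumed, illustrative scheme.

def mutation_frequencies(intensities, reference_base):
    """intensities: dict base -> signal intensity at one genomic position."""
    total = sum(intensities.values())
    if total == 0:
        return {}
    return {base: signal / total
            for base, signal in intensities.items()
            if base != reference_base}

# Example: a position carrying a low-frequency A->G variant (2% of signal)
freqs = mutation_frequencies({"A": 970.0, "C": 5.0, "G": 20.0, "T": 5.0}, "A")
```

With readouts of this form, sub-1% variants correspond to small but reproducible non-reference signal fractions, which is why controlled mutant populations are needed to calibrate the detection floor.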

  3. Method Chunks Selection by Multicriteria Techniques: an Extension of the Assembly-based Approach

    CERN Document Server

    Kornyshova, Elena; Salinesi, Camille

    2009-01-01

    The work presented in this paper belongs to the area of situational method engineering (SME). In this domain, approaches are developed according to specific project specifications. We propose to adapt an existing method construction process, namely the assembly-based one. One of the particular features of the assembly-based SME approach is the selection of method chunks. Our proposal is to offer better guidance in the retrieval of chunks through the introduction of multicriteria techniques. To use them efficiently, we defined a typology of project characteristics in order to identify all their critical aspects, which offers a prioritization to help the method engineer choose between similar chunks.
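
The simplest multicriteria technique that could drive such a prioritization is a weighted sum over project characteristics. The sketch below is illustrative only; the criteria names, weights, and scoring are hypothetical, not the authors' method:

```python
# Illustrative weighted-sum multicriteria ranking of candidate method chunks.
# Criteria, weights, and scores are hypothetical assumptions.

def rank_chunks(chunks, weights):
    """chunks: {name: {criterion: score in [0, 1]}}; weights: {criterion: w}."""
    def score(criteria):
        return sum(weights[k] * v for k, v in criteria.items())
    return sorted(chunks, key=lambda name: score(chunks[name]), reverse=True)

candidates = {
    "chunk_A": {"cost": 0.9, "fit": 0.6, "risk": 0.8},
    "chunk_B": {"cost": 0.5, "fit": 0.9, "risk": 0.7},
}
weights = {"cost": 0.2, "fit": 0.5, "risk": 0.3}
ranking = rank_chunks(candidates, weights)  # best-scoring chunk first
```

Real multicriteria methods (outranking, AHP-style pairwise comparison) refine this, but the weighted sum shows where the typology of project characteristics enters: it supplies the criteria and their weights.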

  4. A single base extension technique for the analysis of known mutations utilizing capillary gel electrophoresis with electrochemical detection.

    Science.gov (United States)

    Brazill, Sara A; Kuhr, Werner G

    2002-07-15

    A novel single nucleotide polymorphism (SNP) detection system is described in which the accuracy of DNA polymerase and advantages of electrochemical detection are demonstrated. A model SNP system is presented to illustrate the potential advantages in coupling the single base extension (SBE) technique to capillary gel electrophoresis (CGE) with electrochemical detection. An electrochemically labeled primer, with a ferrocene acetate covalently attached to its 5' end, is used in the extension reaction. When the Watson-Crick complementary ddNTP is added to the SBE reaction, the primer is extended by a single nucleotide. The reaction mixture is subsequently separated by CGE, and the ferrocene-tagged fragments are detected at the separation anode with sinusoidal voltammetry. This work demonstrates the first single base resolution separation of DNA coupled with electrochemical detection. The unextended primer (20-mer) and the 21-mer extension product are separated with a resolution of 0.8. PMID:12139049
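
The resolution figure of 0.8 quoted for the 20-mer/21-mer pair is conventionally computed from migration times and baseline peak widths. The numbers below are illustrative, chosen only to reproduce the same formula, and are not taken from the paper:

```python
# Electrophoretic resolution between two adjacent peaks:
# R = 2 * (t2 - t1) / (w1 + w2), with migration times t and baseline widths w.

def resolution(t1, t2, w1, w2):
    """t1 < t2 in seconds; w1, w2 are baseline peak widths in seconds."""
    return 2.0 * (t2 - t1) / (w1 + w2)

# Illustrative values giving R = 0.8 for a one-nucleotide separation
r = resolution(t1=300.0, t2=308.0, w1=10.0, w2=10.0)
```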

  5. DNA Microarray Based on Arrayed-Primer Extension Technique for Identification of Pathogenic Fungi Responsible for Invasive and Superficial Mycoses

    OpenAIRE

    Campa, Daniele; Tavanti, Arianna; Gemignani, Federica; Mogavero, Crocifissa S.; Bellini, Ilaria; Bottari, Fabio; Barale, Roberto; Landi, Stefano; Senesi, Sonia

    2007-01-01

    An oligonucleotide microarray based on the arrayed-primer extension (APEX) technique has been developed to simultaneously identify pathogenic fungi frequently isolated from invasive and superficial infections. Species-specific oligonucleotide probes complementary to the internal transcribed spacer 1 and 2 (ITS1 and ITS2) region were designed for 24 species belonging to 10 genera, including Candida species (Candida albicans, Candida dubliniensis, Candida famata, Candida glabrata, Candida tropi...

  6. Network Lifetime Extension Based On Network Coding Technique In Underwater Acoustic Sensor Networks

    Directory of Open Access Journals (Sweden)

    Padmavathy.T.V

    2012-06-01

    Full Text Available Underwater acoustic sensor networks (UWASNs) are attracting considerable interest for ocean applications such as ocean pollution monitoring, ocean animal surveillance, oceanographic data collection, assisted navigation, and offshore exploration. A UWASN is composed of underwater sensors that employ sound to transmit information collected in the ocean. The reason for using sound is that the radio frequency (RF) signals used by terrestrial sensor networks (TWSNs) can travel only a few meters in water. Unfortunately, the efficiency of UWASNs is inferior to that of terrestrial sensor networks. Some of the challenges in underwater communication are propagation delay, high bit error rate, and limited bandwidth. Our aim is to minimize power consumption and to improve the reliability of data transmission by finding the optimum number of clusters based on energy consumption.
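
Finding an optimum cluster count from energy consumption can be sketched with a first-order radio model: members pay a free-space cost to reach their cluster head, heads pay a multipath cost to reach the sink, and the best k minimizes the per-round total. The model and all constants below are illustrative assumptions, not taken from the paper:

```python
import math

# Hedged sketch: choose the cluster count k minimizing a simple per-round
# energy model (free-space intra-cluster links, multipath hop to the sink).
# Radio constants are typical textbook values, used here only for illustration.

def round_energy(k, n_nodes, area, d_sink, bits=4000,
                 e_elec=50e-9, eps_fs=10e-12, eps_mp=0.0013e-12):
    d2 = area / (2 * math.pi * k)              # mean squared member-to-head distance
    member = bits * (e_elec + eps_fs * d2)     # member transmits to its head
    head = bits * (e_elec * (n_nodes / k)      # head receives from its members
                   + e_elec                    # head's own transmit electronics
                   + eps_mp * d_sink ** 4)     # multipath hop to the sink
    return k * head + (n_nodes - k) * member

def optimal_clusters(n_nodes, area, d_sink):
    return min(range(1, n_nodes + 1),
               key=lambda k: round_energy(k, n_nodes, area, d_sink))

k_best = optimal_clusters(n_nodes=100, area=100.0 * 100.0, d_sink=75.0)
```

Too few clusters wastes member transmit energy on long intra-cluster links; too many multiplies the expensive head-to-sink hops, so the total has an interior minimum.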

  7. Investigating Upper Bounds on Network Lifetime Extension for Cell-Based Energy Conservation Techniques in Stationary Ad Hoc Networks

    OpenAIRE

    Santi, Paolo

    2002-01-01

    Cooperative cell-based strategies have been recently proposed as a technique for extending the lifetime of wireless ad hoc networks, while only slightly impacting network performance. The effectiveness of this approach depends heavily on the node density: the higher it is, the more consistent energy savings can potentially be achieved. However, no general analyses of network lifetime have been done either for a base network (one without any energy conservation technique) or for one using cooperat...

  8. Synchrotron radiation techniques. Extension to magnetism research

    International Nuclear Information System (INIS)

    Recently developed techniques using synchrotron radiation for the study of magnetism are reviewed. These techniques are based on X-ray absorption spectroscopy (XAS), and they exhibit significant advantages in element specificity. This is very important since the most attractive magnetic materials contain many magnetic elements, and those with small magnetic moments often play an essential role in the magnetic properties. Circularly polarized X-rays emitted from bending magnets or helical undulators allow us to perform magnetic circular dichroism measurements to reveal microscopic magnetic properties of various kinds of magnetic materials. X-ray absorption magnetic circular dichroism (XMCD) is discussed in detail. This technique provides unique information on orbital magnetic moments as well as spin magnetic moments, which are useful for the study of magnetic anisotropy. X-ray magnetic linear dichroism (XMLD) and X-ray resonant magnetic reflectometry (XRMR) techniques are also described. (author)
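
One reason XMCD yields the orbital-moment information mentioned above is the magneto-optical sum rules. As hedged background (the standard form for 3d transition-metal L2,3 edges, with n_h the number of d holes and μ± the absorption for the two circular polarizations), the orbital sum rule reads:

```latex
% Orbital sum rule (standard background, not quoted from this review):
m_{\mathrm{orb}} \;=\; -\,\frac{4\displaystyle\int_{L_3+L_2}\!\left(\mu^{+}-\mu^{-}\right)\,dE}
                             {3\displaystyle\int_{L_3+L_2}\!\left(\mu^{+}+\mu^{-}\right)\,dE}\; n_h\,\mu_B
```

A companion spin sum rule, combining the dichroism integrals over the two edges separately, gives the effective spin moment; together they underpin the element-specific separation of spin and orbital contributions.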

  9. Veterinary extension on sampling techniques related to heartwater research

    Directory of Open Access Journals (Sweden)

    H.C. Steyn

    2010-05-01

    Full Text Available Heartwater, a tick-borne disease caused by Ehrlichia ruminantium, is considered to be a significant cause of mortality amongst domestic and wild ruminants in South Africa. The main vector is Amblyomma hebraeum and although previous epidemiological studies have outlined endemic areas based on mortalities, these have been limited by diagnostic methods which relied mainly on positive brain smears. The indirect fluorescent antibody test (IFA has a low specificity for heartwater organisms as it cross-reacts with some other species. Since the advent of biotechnology and genomics, molecular epidemiology has evolved using the methodology of traditional epidemiology coupled with the new molecular techniques. A new quantitative real-time polymerase chain reaction (qPCR test has been developed for rapid and accurate diagnosis of heartwater in the live animal. This method can also be used to survey populations of A. hebraeum ticks for heartwater. Sampling whole blood and ticks for this qPCR differs from routine serum sampling, which is used for many serological tests. Veterinary field staff, particularly animal health technicians, are involved in surveillance and monitoring of controlled and other diseases of animals in South Africa. However, it was found that the sampling of whole blood was not done correctly, probably because it is a new sampling technique specific for new technology, where the heartwater organism is much more labile than the serum antibodies required for other tests. This qPCR technique is highly sensitive and can diagnose heartwater in the living animal within 2 hours, in time to treat it. Poor sampling techniques that decrease the sensitivity of the test will, however, result in a false negative diagnosis. 
This paper describes the development of a skills training programme for para-veterinary field staff, to facilitate research into the molecular epidemiology of heartwater in ruminants and eliminate any sampling bias due to collection

  10. Wavelet Based Image Denoising Technique

    Directory of Open Access Journals (Sweden)

    Sachin D Ruikar

    2011-03-01

    Full Text Available This paper proposes different approaches to wavelet-based image denoising. The search for efficient image denoising methods is still a valid challenge at the crossing of functional analysis and statistics. In spite of the sophistication of recently proposed methods, most algorithms have not yet attained a desirable level of applicability. Wavelet algorithms are a useful tool for signal processing tasks such as image compression and denoising. Multiwavelets can be considered as an extension of scalar wavelets. The main aim is to modify the wavelet coefficients in the new basis so that the noise can be removed from the data. In this paper, we extend the existing technique and provide a comprehensive evaluation of the proposed method. Results for different noise types, such as Gaussian, Poisson, salt-and-pepper, and speckle, are presented in this paper. The signal-to-noise ratio was preferred as a measure of denoising quality.
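
The core idea of modifying wavelet coefficients to suppress noise can be shown in its smallest form: a one-level Haar transform with soft thresholding of the detail coefficients. This is a minimal sketch of the principle, not the paper's method; real implementations use deeper 2-D decompositions and data-driven thresholds:

```python
import numpy as np

# Minimal wavelet-denoising sketch: one-level 1-D Haar transform,
# soft thresholding of the detail (high-pass) coefficients, inverse transform.

def haar_denoise(signal, threshold):
    x = np.asarray(signal, dtype=float)        # length must be even
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)  # low-pass coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass coefficients
    # Soft-threshold the detail band, where zero-mean noise concentrates.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)  # inverse Haar transform
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

clean = haar_denoise([1.0, 1.1, 2.0, 1.9, 3.0, 3.05, 4.0, 3.9], threshold=0.2)
```

Small pairwise jitter is absorbed into the thresholded detail band, so each sample pair collapses toward its local mean while larger-scale structure survives in the approximation band.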

  11. Improvements and extensions of the item count technique

    OpenAIRE

    Groenitz, Heiko

    2014-01-01

    The item count technique (ICT) is a helpful tool to conduct surveys on sensitive characteristics such as tax evasion, corruption, insurance fraud, social fraud or drug consumption. The ICT obtains the cooperation of respondents by protecting their privacy. There have been several interesting developments on the ICT in recent years. However, some approaches are incomplete, while some research questions cannot be tackled by the ICT so far. For these reasons, we broaden the existing literature in...

  12. Veterinary extension on sampling techniques related to heartwater research

    OpenAIRE

    H. C. Steyn; C.M.E. McCrindle; Du Toit, D.

    2010-01-01

    Heartwater, a tick-borne disease caused by Ehrlichia ruminantium, is considered to be a significant cause of mortality amongst domestic and wild ruminants in South Africa. The main vector is Amblyomma hebraeum and although previous epidemiological studies have outlined endemic areas based on mortalities, these have been limited by diagnostic methods which relied mainly on positive brain smears. The indirect fluorescent antibody test (IFA) has a low specificity for heartwater organisms as it c...

  13. Should structure-based virtual screening techniques be used more extensively in modern drug discovery?

    Institute of Scientific and Technical Information of China (English)

    V. Leroux; B. Maigret

    2007-01-01

    The drug discovery processes used by academic and industrial scientists are nowadays being questioned. The approaches of the pharmaceutical industry that were successful 20 years ago are simply not suitable anymore for the increasing complexity of available biological targets and the rising standards for medical safety. While the current scientific context resulting from significant developments in genomics, proteomics, organic synthesis and biochemistry seems particularly favorable, the efficiency of drug research does not appear to be following the trend. In particular, the in silico approaches, often considered as potential enhancements for classic drug discovery, are an interesting case. Techniques such as virtual screening have undergone many significant advances in the past 5-10 years and have proven their usefulness in hit discovery for those who want to avoid carrying out too many expensive experimental tests while exploring an important molecular diversity. However, reliability is still disappointing despite constant enhancements, and results are unpredictable. What are the origins of such issues? In this short review, we first summarize the current status of computer-aided drug design, then focus on the structure-based class of virtual screening approaches, of which docking programs constitute the main part. Can such methods give something more than cost savings in the early banks-to-hit phases of the drug discovery process? We try to answer this question by exploring the highlights and pitfalls of the great variety of docking approaches. It appears that while the structure-based drug design field is not yet ready to fulfill all of its early promises, it should still be investigated extensively and used with caution. Most interestingly, structure-based methods are best used when combined with other complementary drug design approaches such as ligand-based ones. 
In this regard, they will have an increasing role to play in modern drug

  14. Advanced condition monitoring techniques and plant life extension studies at EBR-2

    International Nuclear Information System (INIS)

    Numerous advanced techniques have been evaluated and tested at EBR-2 as part of a plant-life extension program for detection of degradation and other abnormalities in plant systems. Two techniques have been determined to be of considerable assistance in planning for the extended-life operation of EBR-2. The first, a computer-based pattern-recognition system (System State Analyzer or SSA), is used for surveillance of the primary system instrumentation, primary sodium pumps and plant heat balances. This surveillance has indicated that the SSA can detect instrumentation degradation and system performance degradation over varying time intervals and can be used to provide derived signal values to replace signals from failed sensors. The second technique, also a computer-based pattern-recognition system (Sequential Probability Ratio Test or SPRT), is used to validate signals and to detect incipient failures in sensors and components or systems. It is being used on the failed fuel detection system and is used experimentally on the primary coolant pumps. Both techniques are described and experience with their operation is presented.
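
The SPRT surveillance idea can be sketched as Wald's sequential test between a "healthy" and a "degraded" Gaussian hypothesis: accumulate a log-likelihood ratio sample by sample and stop when it crosses either threshold. The means, variance, and error rates below are illustrative assumptions, not EBR-2 parameters:

```python
import math

# Hedged sketch of a Sequential Probability Ratio Test (SPRT) for sensor
# validation: H0 "healthy" (mean mu0) vs H1 "degraded" (mean mu1),
# Gaussian noise with known sigma, Wald thresholds from error rates.

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)   # cross -> accept "degraded"
    lower = math.log(beta / (1 - alpha))   # cross -> accept "healthy"
    llr = 0.0
    for x in samples:
        # Log-likelihood ratio increment for one Gaussian observation.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "degraded"
        if llr <= lower:
            return "healthy"
    return "undecided"

verdict = sprt([0.05, -0.1, 0.0, 0.1, -0.05, 0.02, -0.08, 0.04, 0.01, -0.03])
```

Because the test stops as soon as the evidence is decisive, it typically needs far fewer samples than a fixed-length test at the same error rates, which suits continuous on-line surveillance.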

  15. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint.

    Science.gov (United States)

    Du, Hong-Gen; Liao, Sheng-Hui; Jiang, Zhong; Huang, Huan-Ming; Ning, Xi-Tao; Jiang, Neng-Yi; Pei, Jian-Wei; Huang, Qin; Wei, Hui

    2016-05-01

    This study investigates the effect of a new Chinese massage technique named "press-extension" on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of this massage technique. Self-developed biomechanical software was used to establish a normal L1-S1 lumbar 3D finite-element (FE) model, which integrated the anatomical structure from spine CT and MRI data. Graphic techniques were then used to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. Mechanical parameters were collected from actual press-extension experiments to set the boundary conditions for the FE analysis. The results demonstrate that the press-extension technique has a marked effect on the annulus fibrosus, pushing the central nucleus pulposus forward and increasing the pressure in its anterior part. The study concludes that finite-element modelling of the lumbar spine is suitable for analysing the impact of the press-extension technique on lumbar intervertebral disc biomechanics, providing a basis for understanding the mechanism by which the press-extension technique treats intervertebral disc herniation. PMID:27275119

  16. A Novel Active Network Architecture Based on Extensible Services Router

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Active networks are a new kind of packet-switched network in which packets carry code fragments that are executed on the intermediary nodes (routers). The code can extend or modify the foundation architecture of a network. In this paper, the authors present a novel active network architecture, based on an extensible services router, that combines the advantages of the two major active-network technologies. The architecture consists of the extensible services router, an active extensible components server and a key distribution center (KDC). Users can write extensible service components with a programming interface. At present, we have finished the extensible services router prototype system based on the Highly Efficient Router Operating System (HEROS), as well as the active extensible components server and KDC prototype systems based on Linux.

  17. Source extension based on ε-entropy

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian; YU Sheng-sheng; ZHOU Jing-li; ZHENG Xin-wei

    2005-01-01

    It is known from entropy theory that an image is a source with certain probabilistic characteristics. The entropy rate of the source and the ε-entropy (rate-distortion function theory) are the information-content measures that identify the characteristics of video images, and hence are essentially related to video image compression. They are fundamental theories of great significance to image compression, though they cannot be directly turned into a compression method. Based on entropy theory and image compression theory, by applying the rate-distortion mathematical model and Lagrange multipliers to some theoretical problems in the H.264 standard, this paper presents a new rate-distortion coding algorithm model. This model was put through a complete test of its capability against the JM61e test model (JVT reference software). The result shows that the speed of coding increases without significant reduction of the rate-distortion performance of the coder.
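
The Lagrange-multiplier machinery referred to above is conventionally written as minimizing a joint rate-distortion cost J per coding decision. As background (the λ expression is the common choice in the H.264 reference software, quoted here as standard practice rather than from this paper):

```latex
% Lagrangian rate-distortion cost for mode decision; D = distortion,
% R = bits, QP = quantization parameter.
J = D + \lambda \cdot R,
\qquad
\lambda_{\mathrm{mode}} \approx 0.85 \cdot 2^{(QP - 12)/3}
```

Each candidate mode is evaluated with this cost, and the encoder picks the mode with the smallest J, trading distortion against bit rate at the operating point set by λ.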

  18. The extension of the split window technique to passive microwave surface temperature assessment

    International Nuclear Information System (INIS)

    The theoretical study of both land and sea surface temperature remote sensing is treated by investigating the extension to the microwave region (1-100 GHz) of the split window technique, usually used in the thermal infrared region for sea surface temperature measurements. The study of land surface temperature shows that, in both regions (infrared and microwave), the influence of atmospheric water vapor content and surface emissivity is critical. The theory is based on the Radiative Transfer Equation, for which assumed solutions can be given in both spectral regions, with respect to Wien's and Rayleigh-Jeans' laws, respectively. The surface temperature determination is studied in connection with the surface emissivity, determined in both the infrared and microwave regions with an iterative process. Infrared data are provided by the Advanced Very High Resolution Radiometer (AVHRR) sensor and microwave data by the Special Sensor Microwave/Imager (SSM/I), through the WETNET program, directed by NASA/HQS. (author)
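
The split-window technique referred to above has, in its generic form, a linear structure: the surface temperature is estimated from the brightness temperatures of two adjacent channels, with the channel difference correcting for atmospheric absorption. As standard background (coefficients are sensor- and atmosphere-dependent and deliberately left unspecified):

```latex
% Generic split-window estimate; T_{11}, T_{12} are brightness temperatures
% in the ~11 um and ~12 um channels, a and b absorb water-vapor and
% emissivity effects.
T_s \;\approx\; T_{11} + a\,\left(T_{11} - T_{12}\right) + b
```

The microwave extension studied in the paper asks whether the same two-channel differencing logic survives when Rayleigh-Jeans (rather than Wien) behavior governs the radiances.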

  19. Availability analysis and design of storage extension based on CWDM

    Science.gov (United States)

    Qin, Leihua; Yu, Yan

    2007-11-01

    As Fibre Channel becomes the key storage protocol of the SAN (Storage Area Network), enterprises are increasingly deploying FC SANs in their data centers. Meanwhile, organizations increasingly face an enormous influx of data that must be stored, protected, backed up and replicated to mitigate the risk of losing data. One of the best ways to achieve this goal is to deploy SAN extension based on CWDM (Coarse Wavelength Division Multiplexing). Availability is one of the key performance metrics for business continuity and disaster recovery and has to be well understood by IT departments when deploying SAN extension based on CWDM, for it determines accessibility to remotely located data sites. In this paper, several architectures of storage extension over CWDM are analyzed and the availability of these different storage extension architectures is calculated. Furthermore, two kinds of high-availability storage extension architecture with 1:1 or 1:N protection are designed, and the availability of the protection-scheme storage extension based on CWDM is calculated as well.
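
The availability arithmetic behind such calculations is standard reliability algebra: a single element's availability from MTBF/MTTR, series composition for cascaded elements, and parallel composition for 1:1 protection. The MTBF/MTTR figures below are illustrative, not the paper's values:

```python
# Standard availability algebra used when evaluating protected links.

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability of one element."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series(*avails):
    """Cascaded elements: all must be up."""
    p = 1.0
    for a in avails:
        p *= a
    return p

def parallel(a1, a2):
    """1:1 protection: the path fails only if both legs fail."""
    return 1.0 - (1.0 - a1) * (1.0 - a2)

# Illustrative link: a transponder in series with a fiber span,
# then duplicated for 1:1 protection.
link = series(availability(200_000, 4), availability(50_000, 8))
protected = parallel(link, link)
```

The qualitative conclusion is generic: series elements multiply small unavailabilities into larger ones, while 1:1 protection squares the unavailability, which is why protected CWDM spans dominate availability budgets.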

  20. Epidural volume extension: A novel technique and its efficacy in high risk cases.

    Science.gov (United States)

    Tiwari, Akhilesh Kumar; Singh, Rajeev Ratan; Anupam, Rudra Pratap; Ganguly, S; Tomar, Gaurav Singh

    2012-01-01

    We present a unique case series, restricted to high-risk cases from different specialities, of patients who underwent successful surgery in our Institute using the epidural volume extension technique with 1 mL of 0.5% ropivacaine and 25 μg of fentanyl. PMID:25885627

  1. Designing and application of SAN extension interface based on CWDM

    Science.gov (United States)

    Qin, Leihua; Yu, Shengsheng; Zhou, Jingli

    2005-11-01

    As Fibre Channel (FC) becomes the protocol of choice within corporate data centers, enterprises are increasingly deploying SANs in their data centers. In order to mitigate the risk of losing data and improve its availability, more and more enterprises are adopting storage extension technologies to replicate their business-critical data to a secondary site. Transmitting this information over distance requires a carrier-grade environment with zero data loss, scalable throughput, low jitter, high security and the ability to travel long distances. To address these business requirements, there are three basic architectures for storage extension: Storage over Internet Protocol, Storage over Synchronous Optical Network/Synchronous Digital Hierarchy (SONET/SDH) and Storage over Dense Wavelength Division Multiplexing (DWDM). Each approach varies in functionality, complexity, cost, scalability, security, availability, predictable behavior (bandwidth, jitter, latency) and multiple-carrier limitations. Compared with these connectivity technologies, Coarse Wavelength Division Multiplexing (CWDM) is a simplified, low-cost and high-performance connectivity solution for enterprises deploying their storage extension. In this paper, we design a storage extension connectivity over CWDM and test its electrical characteristics and the random read and write performance of a disk array through the CWDM connectivity; the test results show that the performance of the connectivity over CWDM is acceptable. Furthermore, we propose three kinds of network architecture for SAN extension based on the CWDM interface. Finally, the credit-based flow control mechanism of FC, and the relationship between credits and extension distance, is analyzed.
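
The credit/distance relationship mentioned at the end comes down to round-trip time: to keep the link full, enough buffer-to-buffer credits must be outstanding to cover the propagation delay back and forth. A back-of-the-envelope sketch, with typical (illustrative, not the paper's) constants:

```python
import math

# Rough estimate of FC buffer-to-buffer credits needed to keep a long link
# fully utilized. Constants are typical illustrative values: 1.0625 Gb/s
# line rate with 8b/10b coding, 2148-byte maximum frame, ~5 us/km in fiber.

def required_bb_credits(distance_km, line_rate_gbps=1.0625,
                        frame_bytes=2148, fiber_us_per_km=5.0):
    # Time one full-size frame occupies the link (10 line bits per byte).
    frame_time_us = frame_bytes * 10 / (line_rate_gbps * 1000)
    # Credits must cover the round-trip propagation time.
    round_trip_us = 2 * distance_km * fiber_us_per_km
    return max(1, math.ceil(round_trip_us / frame_time_us))

credits = required_bb_credits(100)  # e.g. a 100 km CWDM span
```

This is why extension distance is credit-limited: if the switch grants fewer credits than the round trip demands, the transmitter stalls waiting for R_RDY acknowledgements and throughput drops roughly in proportion.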

  2. Extension of the broadband single-mode integrated optical waveguide technique to the ultraviolet spectral region and its applications

    OpenAIRE

    Wiederkehr, Rodrigo S.; Mendes, Sergio B.

    2014-01-01

    We report here the fabrication, characterization, and application of a single-mode integrated optical waveguide (IOW) spectrometer capable of acquiring optical absorbance spectra of surface-immobilized molecules in the visible and ultraviolet spectral region down to 315 nm. The UV-extension of the single-mode IOW technique to shorter wavelengths was made possible by our development of a low-loss single-mode dielectric waveguide in the UV region based on an alumina film grown...

  3. On the extension of the complex-step derivative technique to pseudospectral algorithms

    International Nuclear Information System (INIS)

    The complex-step derivative (CSD) technique is a convenient and highly accurate strategy to perform a linearized 'perturbation' analysis to determine a 'directional derivative' via a minor modification of an existing nonlinear simulation code. The technique has previously been applied to nonlinear simulation codes (such as finite-element codes) which employ real arithmetic only. The present note examines the suitability of this technique for extension to efficient pseudospectral simulation codes which nominally use the fast Fourier transform (FFT) to convert back and forth between the physical and transformed representations of the system. It is found that, if used carefully, this extension retains the remarkable accuracy of the CSD approach. However, to perform this extension without sacrificing this accuracy, particular care must be exercised; specifically, the state (real) and perturbation (imaginary) components of the complexified system must be transformed separately and arranged in such a manner that they are kept distinct during the process of differentiation in the transformed space in order to avoid the linear combination of the large and small quantities in the analysis. It is shown that this is relatively straightforward to implement even in complicated nonlinear simulation codes, thus positioning the CSD approach as an attractive and relatively simple alternative to hand coding a perturbation (a.k.a. 'tangent linear') code for determining the directional derivative even when pseudospectral algorithms are employed

  4. Support vector machines optimization based theory, algorithms, and extensions

    CERN Document Server

    Deng, Naiyang; Zhang, Chunhua

    2013-01-01

    Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs): classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which SVMs are built. The authors share insight on many of their research achievements. They give a precise interpretation of statistical learning theory for C-support vector classification. They also discuss regularized twi
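
The C-support vector classification problem the abstract refers to is, in its standard primal form (standard notation, quoted as background rather than from the book):

```latex
% C-SVC primal: hinge-loss classification with margin maximization;
% phi is the feature map, C the regularization trade-off, xi_i the slacks.
\min_{w,\,b,\,\xi}\quad \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{\ell}\xi_i
\qquad \text{s.t.}\quad y_i\left(w^{\top}\phi(x_i) + b\right) \ge 1 - \xi_i,
\quad \xi_i \ge 0,\ \ i = 1,\dots,\ell.
```

The optimization-theoretic viewpoint the book stresses enters through the dual of this problem, where the kernel trick and the support-vector structure of the solution become visible.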

  5. Promoting Community-based Extension Agents as an Alternative Approach to Formal Agricultural Extension Service Delivery in Northern Ghana

    Directory of Open Access Journals (Sweden)

    Samuel Z. Bonye

    2012-03-01

    Full Text Available The CBEA concept is an alternative community-based extension intervention aimed at addressing the inadequacy of formal extension service provision to poor rural farmers of the Northern Regions of Ghana. The study sought to find out the extent to which Community-Based Extension Agents have improved rural farmers' access to extension services. The study used qualitative and quantitative methods such as focus group discussions, key informants, in-depth interviews, and household and institutional questionnaires to collect and analyse data. The findings are that: there are vibrant Community-Based Extension Agents providing extension services on crop, livestock and environmental issues in the study district; farmers' groups are linked to external agents and other stakeholders for access to credit facilities; the CBEAs were found to be the main link between the community and external agents; the most dominant extension services delivered by the CBEAs in the entire study district were in crop production, livestock production and bushfire management; there are well-established criteria for selecting Community-Based Extension Agents; and Community-Based Extension Agents were the least motivated. The study recommends, among others, that motivation packages such as bicycles would facilitate the movement of CBEAs to reach the majority of farmers. There is also a need to link CBEAs to relevant institutions/organizations for support, and to establish mechanisms to generate funds to support their activities. Finally, stakeholders and organizations need to intensify community sensitization and awareness creation about the activities of CBEAs.

  7. Extensive techniques of TIPS in Budd-Chiari syndrome with occlusive hepatic veins

    International Nuclear Information System (INIS)

    Objective: To describe the improved technical steps of transjugular intrahepatic portosystemic shunt (TIPS) and to evaluate its therapeutic effect in Budd-Chiari syndrome with occlusive hepatic veins. Methods: Eleven patients, diagnosed with Budd-Chiari syndrome with widespread stenosis or occlusive lesions of the hepatic veins and verified by imaging, were treated with the improved TIPS technique. The key point of the extended TIPS technique was to design and build an access route via an artificial hepatic vein. Changes in portal vein pressure, shunt blood flow, and stent patency after the procedure were assessed over a 24-month follow-up. Results: The intrahepatic shunt between the portal vein and inferior vena cava was successfully built and good clinical responses were obtained in all patients. The main portal pressure decreased from (4.62 ± 0.52) kPa (mean ± s) to (2.16 ± 0.21) kPa after shunting. The maximum velocity of shunt blood flow was (56.2 ± 3.50) cm/s, and stent patency was 7/11 at 24 months after the procedure. Conclusion: The extended TIPS technique has a high technical success rate and is a worthwhile new therapeutic option for Budd-Chiari syndrome with occlusive hepatic veins

  8. The Search for Extension: 7 Steps to Help People Find Research-Based Information on the Internet

    Science.gov (United States)

    Hill, Paul; Rader, Heidi B.; Hino, Jeff

    2012-01-01

    For Extension's unbiased, research-based content to be found by people searching the Internet, it needs to be organized in a way conducive to the ranking criteria of a search engine. With proper web design and search engine optimization techniques, Extension's content can be found, recognized, and properly indexed by search engines and…

  9. A useful technique for adjusting nasal tip projection in Asian rhinoplasty: Trapezoidal caudal extension cartilage grafting.

    Science.gov (United States)

    Liu, Shao-Cheng; Lin, Deng-Shan; Wang, Hsing-Won; Kao, Chuan-Hsiang

    2016-01-01

    The purpose of this article is to present our experience with Asian patients in (1) using a trapezoidal caudal extension cartilage graft to adjust the tip projection in tip refinement for augmentation rhinoplasty, especially for the correction of short nose, and (2) avoiding complications of augmentation rhinoplasty with alloplastic implants. We conducted a retrospective chart review of 358 rhinoplasties that were performed by the corresponding author from January 2004 through July 2009. Patients were included in this study if they had undergone open rhinoplasty with a trapezoidal caudal extension cartilage graft as the only tip-modifying procedure. Patients in whom any additional grafting was performed that might have altered the nasal tip position were excluded. The surgical results were analyzed in terms of the degree of satisfaction judged separately by investigators and by patients. A total of 84 patients (46 males and 38 females, all Asian, aged 13 to 61 years; mean: 29.3) met our eligibility criteria. Postoperative follow-up for 24 months was achieved in 62 patients. At the 24-month follow-up, the surgeons judged the results to be good or very good in 57 of the 62 patients (91.9%); at the same time, 56 patients (90.3%) said they were satisfied or very satisfied with their aesthetic outcome. Good nasal tip projection, a natural columellar appearance, and improvement in the nasolabial angle were achieved for most patients. Two patients required revision rhinoplasty to correct an insufficient augmentation and migration of the onlay graft. No severe complications were observed during the 2-year follow-up. We have found that trapezoidal caudal extension cartilage grafting in nasal tip refinement is an easy technique to learn and execute, its results are predictable, and it has been associated with no major complications. We recommend trapezoidal caudal extension cartilage grafting for Asian patients as a good and reliable alternative for managing tip projection.

  10. CUSTOMER BASED BRAND EQUITY DETERMINANTS ON BRAND EXTENSION IN TELEVISION INDUSTRY

    OpenAIRE

    Vetrivel, V.; A.N.Solayappan; Jothi.Jayakrishnan

    2015-01-01

    The aim of this paper is to find out the relationship between customer based brand equity and brand extension, and also to compare the selected customer based brand equity determinants of brand awareness, brand association, brand trust, customer satisfaction and CBBE. The structured questionnaire was distributed among the respondents following the simple random sampling technique. Questionnaires were distributed among 550 respondents, out of which only 517 were properly filled. Data was analyzed thr...

  11. A graph-based approach for designing extensible pipelines

    Directory of Open Access Journals (Sweden)

    Rodrigues Maíra R

    2012-07-01

    Full Text Available Abstract Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development, such as workflow management systems, focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism in pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where multiple software tools are needed to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http
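The graph formulation in the abstract can be sketched loosely as follows; the format and tool names below are invented for illustration and are not from the paper. Data formats are nodes, tools are directed edges, and composing a pipeline on demand amounts to finding a path:

```python
from collections import deque

# Hypothetical tool graph: an edge (src_format, dst_format) -> tool name.
EDGES = {
    ("vcf", "ped"): "vcf2ped",
    ("ped", "bed"): "ped2bed",
    ("vcf", "csv"): "vcf2csv",
}

def compose_pipeline(src, dst):
    """Breadth-first search for the shortest tool chain converting src to dst."""
    queue = deque([(src, [])])
    seen = {src}
    while queue:
        fmt, tools = queue.popleft()
        if fmt == dst:
            return tools
        for (a, b), tool in EDGES.items():
            if a == fmt and b not in seen:
                seen.add(b)
                queue.append((b, tools + [tool]))
    return None  # no pipeline exists for this conversion

print(compose_pipeline("vcf", "bed"))  # -> ['vcf2ped', 'ped2bed']
```

Adding a new tool is then just adding an edge; no existing pipeline definition needs to change, which is the low-maintenance property the authors aim for.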

  12. Project Milestone. Analysis of Range Extension Techniques for Battery Electric Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, Jeremy [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Pesaran, Ahmad [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2013-07-01

    This report documents completion of the July 2013 milestone as part of NREL’s Vehicle Technologies Annual Operating Plan with the U.S. Department of Energy. The objective was to perform analysis on range extension techniques for battery electric vehicles (BEVs). This work represents a significant advancement over previous thru-life BEV analyses using NREL’s Battery Ownership Model, FastSim,* and DRIVE.* Herein, the ability of different charging infrastructure to increase achievable travel of BEVs in response to real-world, year-long travel histories is assessed. Effects of battery and cabin thermal response to local climate, battery degradation, and vehicle auxiliary loads are captured. The results reveal the conditions under which different public infrastructure options are most effective, and encourage continued study of fast charging and electric roadway scenarios.

  13. Automatic assessment of mitral regurgitation severity based on extensive textural features on 2D echocardiography videos.

    Science.gov (United States)

    Moghaddasi, Hanie; Nourian, Saeed

    2016-06-01

    Heart disease is the major cause of death as well as a leading cause of disability in developed countries. Mitral Regurgitation (MR) is a common heart disease which does not cause symptoms until its end stage. Therefore, early diagnosis of the disease is of crucial importance in the treatment process. Echocardiography is a common method of diagnosing the severity of MR. Hence, a method based on echocardiography videos, image processing techniques and artificial intelligence could be helpful for clinicians, especially in borderline cases. In this paper, we introduce novel features to detect micro-patterns of echocardiography images in order to determine the severity of MR. Extensive Local Binary Pattern (ELBP) and Extensive Volume Local Binary Pattern (EVLBP) are presented as image descriptors which include details from different viewpoints of the heart in feature vectors. Support Vector Machine (SVM), Linear Discriminant Analysis (LDA) and Template Matching techniques are used as classifiers to determine the severity of MR based on the textural descriptors. The SVM classifier with Extensive Uniform Local Binary Pattern (ELBPU) and Extensive Volume Local Binary Pattern (EVLBP) achieves the best accuracy, with 99.52%, 99.38%, 99.31% and 99.59%, respectively, for the detection of Normal, Mild MR, Moderate MR and Severe MR subjects among echocardiography videos. The proposed method achieves 99.38% sensitivity and 99.63% specificity for the detection of the severity of MR and normal subjects. PMID:27082766
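As background to the extended descriptors, here is a minimal sketch of the plain 3x3 Local Binary Pattern operator that ELBP and EVLBP build on; this is the generic textbook operator, not the authors' extensive variants:

```python
def lbp_code(patch):
    """LBP code of the center pixel of a 3x3 patch: each of the 8 neighbours,
    read clockwise from the top-left, contributes one bit (1 if >= center)."""
    c = patch[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (r, col) in enumerate(order):
        if patch[r][col] >= c:
            code |= 1 << bit
    return code

patch = [[9, 5, 1],
         [4, 5, 8],
         [7, 6, 3]]
print(lbp_code(patch))  # -> 107
```

A texture descriptor is then the histogram of these codes over an image region; the "extensive" variants aggregate such histograms across multiple echocardiographic views (and, for EVLBP, across frames).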

  14. Extension of the broadband single-mode integrated optical waveguide technique to the ultraviolet spectral region and its applications.

    Science.gov (United States)

    Wiederkehr, Rodrigo S; Mendes, Sergio B

    2014-03-21

    We report here the fabrication, characterization, and application of a single-mode integrated optical waveguide (IOW) spectrometer capable of acquiring optical absorbance spectra of surface-immobilized molecules in the visible and ultraviolet spectral region down to 315 nm. The UV-extension of the single-mode IOW technique to shorter wavelengths was made possible by our development of a low-loss single-mode dielectric waveguide in the UV region based on an alumina film grown by atomic layer deposition (ALD) over a high quality fused silica substrate, and by our design/fabrication of a broadband waveguide coupler formed by an integrated diffraction grating combined with a highly anamorphic optical beam of large numerical aperture. As an application of the developed technology, we report here the surface adsorption process of bacteriochlorophyll a on different interfaces using its Soret absorption band centred at 370 nm. The effects of different chemical compositions at the solid-liquid interface on the adsorption and spectral properties of bacteriochlorophyll a were determined from the polarized UV-Vis IOW spectra acquired with the developed instrumentation. The spectral extension of the single-mode IOW technique into the ultraviolet region is an important advance as it enables extremely sensitive studies in key characteristics of surface molecular processes (e.g., protein unfolding and solvation of aromatic amino-acid groups under surface binding) whose spectral features are mainly located at wavelengths below the visible spectrum. PMID:24466569

  15. Multihop Relay Techniques for Communication Range Extension in Near-Field Magnetic Induction Communication Systems

    Directory of Open Access Journals (Sweden)

    Mehrnoush Masihpour

    2013-05-01

    Full Text Available In this paper, multihop relaying in RF-based communications and near field magnetic induction communication (NFMIC) is discussed. Three multihop relay strategies for NFMIC are proposed: Non Line of Sight Magnetic Induction Relay (NLoS-MI Relay), Non Line of Sight Master/Assistant Magnetic Induction Relay 1 (NLoS-MAMI Relay 1) and Non Line of Sight Master/Assistant Magnetic Induction Relay 2 (NLoS-MAMI Relay 2). In the first approach only one node contributes to the communication, while the other two techniques (which are based on a master-assistant strategy) employ two relaying nodes. This paper shows that these three techniques can be used to overcome the problem of dead spots within a body area network and extend the communication range without increasing the transmission power and the antenna size or decreasing receiver sensitivity. The impact of the separation distance between the nodes on the achievable RSS and channel data rate is evaluated for the three techniques. It is demonstrated that the technique which is most effective depends on the specific network topology. Optimum selection of nodes as relay master and assistant based on the location of the nodes is discussed. The paper also studies the impact of the quality factor on achievable data rate. It is shown that to obtain the highest data rate, the optimum quality factor needs to be determined for each proposed cooperative communication method.

  16. Sustainable Agriculture: Towards a Conflict Management Based Agricultural Extension

    OpenAIRE

    Mostafa Ahmadvand; Ezatollah Karami

    2007-01-01

    This study aims to provide an alternative conceptual framework for agricultural extension which can deal with environmental scarcity, conflict and challenges in a sustainable way. For this purpose, a brief history of agricultural extension and conflict is introduced, and then conflict management approaches are reviewed. Finally, an alternative model is proposed to use the conflict management approach as a basis for agricultural extension. The implication of conflict management approach in agr...

  17. Sustainable Agriculture: Towards a Conflict Management Based Agricultural Extension

    Science.gov (United States)

    Ahmadvand, Mostafa; Karami, Ezatollah

    This study aims to provide an alternative conceptual framework for agricultural extension which can deal with environmental scarcity, conflict and challenges in a sustainable way. For this purpose, a brief history of agricultural extension and conflict is introduced, and then conflict management approaches are reviewed. Finally, an alternative model is proposed to use the conflict management approach as a basis for agricultural extension. The implication of the conflict management approach in agricultural extension is far-reaching: it requires new modes of analysis and different roles and tasks.

  18. Service-based extensions to the JDL fusion model

    Science.gov (United States)

    Antony, Richard T.; Karakowski, Joseph A.

    2008-04-01

    Extensions to a previously developed service-based fusion process model are presented. The model accommodates (1) traditional sensor data and human-generated input, (2) streaming and non-streaming data feeds, and (3) the fusion of both physical and non-physical entities. More than a dozen base-level fusion services are identified. These services provide the foundation functional decomposition of levels 0 - 2 in JDL fusion model. Concepts, such as clustering, link analysis and database mining, that have traditionally been only loosely associated with the fusion process, are shown to play key roles within this fusion framework. Additionally, the proposed formulation extends the concepts of tracking and cross-entity association to non-physical entities, as well as supports effective exploitation of a priori and derived context knowledge. Finally, the proposed framework is shown to support set theoretic properties, such as equivalence and transitivity, as well as the development of a pedigree summary metric that characterizes the informational distance between individual fused products and source data.

  19. The MIDAS Experiment: A New Technique for the Detection of Extensive Air Showers

    CERN Document Server

    Williams, C; Berlin, A; Bohacova, M; Facal, P; Genat, J F; Mills, E; Monasor, M; Privitera, P; Reyes, L C; d'Orfeuil, B Rouille; Wayne, S; Alekotte, I; Bertou, X; Bonifazi, C; Neto, J R T de Mello; Santos, E M; Alvarez-Muñiz, J; Carvalho, W; Zas, E

    2010-01-01

    Recent measurements suggest free electrons created in ultra-high energy cosmic ray extensive air showers (EAS) can interact with neutral air molecules, producing Bremsstrahlung radiation in the microwave regime. The microwave radiation produced is expected to scale with the number of free electrons in the shower, which itself is a function of the energy of the primary particle and atmospheric depth. Using these properties, a calorimetric measurement of the EAS is possible. This technique is analogous to fluorescence detection, with the added benefit of a nearly 100% duty cycle and practically no atmospheric attenuation. The Microwave Detection of Air Showers (MIDAS) prototype is currently being developed at the University of Chicago. MIDAS consists of a 53-feed receiver operating in the 3.4 to 4.2 GHz band. The camera is deployed on a 4.5-meter parabolic reflector and is instrumented with high speed power detectors and autonomous FPGA trigger electronics. We present the current status of the MIDAS instrument and...

  20. An extension to the dynamic plane source technique for measuring thermal conductivity, thermal diffusivity, and specific heat of dielectric solids

    Science.gov (United States)

    Karawacki, Ernest; Suleiman, Bashir M.; ul-Haq, Izhar; Nhi, Bui-Thi

    1992-10-01

    The recently developed dynamic plane source (DPS) technique for simultaneous determination of the thermal properties of fast thermally conducting materials, with thermal conductivities between 200 and 2 W/mK, has now been extended to the study of relatively slow conducting materials with thermal conductivities equal to or below 2 W/mK. The method is self-checking, since the thermal conductivity, thermal diffusivity, specific heat, and effusivity of the material are obtained independently from each other. The theory of the technique and the experimental arrangement are given in detail. The data evaluation procedure is simple and makes it possible to reveal distortions due to nonideal experimental conditions. The extension to the DPS technique has been implemented at room temperature to study samples of the cordierite-based ceramic Cecorite 130P (thermal conductivity equal to 1.48 W/mK), rubber (0.403 W/mK), and polycarbonate (0.245 W/mK). The accuracy of the method is within ±5%.

  1. The Liquisolid Technique Based Drug Delivery System

    OpenAIRE

    Izhar Ahmed Syed; E. Pavani

    2012-01-01

    The “Liquisolid” technique is a novel and promising approach to enhancing solubility and improving dissolution, thereby increasing bioavailability. It contains liquid medications in powdered form. The technique is an efficient method for formulating water-insoluble and water-soluble drugs. It is based upon the admixture of drug-loaded solutions with appropriate carrier and coating materials. The use of a non-volatile solvent causes improved wettability...

  2. A Conformal Extension Theorem based on Null Conformal Geodesics

    CERN Document Server

    Lübbe, Christian

    2008-01-01

    In this article we describe the formulation of null geodesics as null conformal geodesics and their description in the tractor formalism. A conformal extension theorem through an isotropic singularity is proven by requiring the boundedness of the tractor curvature and its derivatives to sufficient order along a congruence of null conformal geodesics. This article extends earlier work by Tod and Luebbe.

  3. Image Based Authentication Using Steganography Technique

    OpenAIRE

    Satish Kumar Sonker; Sanjeev Kumar; Amit Kumar; Dr. Pragya Singh

    2013-01-01

    In the world of information security, we generally use traditional (text-based) or multi-factor authentication approaches, with which we face many problems and which are also less secure. With these conventional methods, attacks like brute-force and dictionary attacks are possible. This paper proposes image-based authentication using a steganography technique, taking advantage of steganography along with the image. Including steganography in image...

  4. A graph-based approach for designing extensible pipelines

    OpenAIRE

    Rodrigues Maíra R; Magalhães Wagner CS; Machado Moara; Tarazona-Santos Eduardo

    2012-01-01

    Abstract Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline d...

  5. Bandwidth Extension Method Based on Spectral Envelope Estimation

    Directory of Open Access Journals (Sweden)

    Bo Hang

    2011-06-01

    Full Text Available In current communication systems, high quality audio is expected to be provided at low bit rate and low computational complexity. This paper proposes a novel audio coding bandwidth extension method which can improve decoded audio quality while adding only a few coding bits per frame and a little computational complexity. The method calculates the high-frequency synthesis filter using a codebook mapping method, and transmits only quantified gain corrections for the high-frequency part in the multiplexed coding bit stream. Preliminary tests show that this method can provide comparable audio quality with lower bit consumption and computational complexity compared to the high-frequency regeneration of AMR-WB+.

  6. Flow Modeling Based Wall Element Technique

    Directory of Open Access Journals (Sweden)

    Sabah Tamimi

    2012-08-01

    Full Text Available Two types of flow were examined: pressure flow, and a combination of pressure and Couette flow, for confined turbulent flow, with a one-equation model used to depict the turbulent viscosity of confined flow in a smooth straight channel. A finite element technique based on a zone close to a solid wall was adopted for predicting the distribution of the pertinent variables in this zone, and was examined even in the case when the near-wall zone was extended away from the wall. The imposed technique has been validated and compares well with other techniques.

  7. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seyong [ORNL; Vetter, Jeffrey S [ORNL

    2014-01-01

    Directive-based, accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in the directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.

  8. An Extensive Evaluation of Anaphora Resolution Based Abstract Generation System

    Directory of Open Access Journals (Sweden)

    Ayyalu Hariharan Nandhu Kishore

    2013-01-01

    Full Text Available This paper presents an extensive evaluation of the anaphora resolution (AR) algorithm proposed for an Abstract Generation System (AGS) and discusses the metrics used to measure its performance. The evaluation exercise is conducted with test sets chosen from different domains, and the results are compared with other existing open-source anaphora resolution systems (ARSs) such as Mitkov’s Anaphora Resolution System (MARS) and Java Resolution of Anaphora Procedure (JavaRAP). We selected news documents belonging to five different domains and trained the AGS with them; a subset of the documents and their corresponding extracts was taken as the test set and executed in the various ARSs for comparison. The proposed system is evaluated and its performance is studied with standard metrics of Information Extraction (IE) systems such as success rate, precision, recall and F-measure.
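The IE metrics named in the abstract are standard; a minimal sketch follows, with counts that are illustrative rather than taken from the evaluation:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall and F1 from true-positive, false-positive and
    false-negative counts (here: correctly resolved, wrongly resolved,
    and missed anaphors)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# e.g. 8 anaphors resolved correctly, 2 resolved wrongly, 2 missed
p, r, f = precision_recall_f1(8, 2, 2)
print(round(p, 2), round(r, 2), round(f, 2))  # -> 0.8 0.8 0.8
```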

  9. Extensive Characterisation of Copper-clad Plates, Bonded by the Explosive Technique, for ITER Electrical Joints

    CERN Document Server

    Langeslag, S A E; Libeyre, P; Gung, C Y

    2015-01-01

    Cable-in-conduit conductors will be extensively implemented in the large superconducting magnet coils foreseen to confine the plasma in the ITER experiment. The design of the various magnet systems imposes the use of electrical joints to connect unit lengths of superconducting coils by inter-pancake coupling. These twin-box lap type joints, produced by compacting each cable end into a copper-stainless steel bimetallic box, are required to be highly performing in terms of electrical and mechanical properties. To ascertain the suitability of the first copper-clad plates, recently produced, the performance of several plates is studied. Validation of the bonded interface is carried out by determining microstructural, tensile and shear characteristics. These measurements confirm the suitability of explosion-bonded copper-clad plates for an overall joint application. Additionally, an extensive study is conducted on the suitability of certain copper purity grades for the various joint types.

  10. A Shape Based Image Search Technique

    Directory of Open Access Journals (Sweden)

    Aratrika Sarkar

    2014-08-01

    Full Text Available This paper describes an interactive application we have developed based on a shape-based image retrieval technique. The key concepts described in the project are: (i) matching of images based on contour matching; (ii) matching of images based on edge matching; (iii) matching of images based on pixel matching of colours. Further, the application facilitates matching of images invariant to transformations like (i) translation; (ii) rotation; (iii) scaling. The key feature of the system is that it graphically shows the percentage unmatched of the uploaded image with respect to the images already existing in the database, whereas the integrity of the system lies in the unique matching techniques used for optimum results, which increases the accuracy of the system. For example, when a user uploads an image, say an image of a mango leaf, the application shows all mango leaves present in the database as well as other leaves matching the colour and shape of the uploaded mango leaf.
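The "percentage unmatched" score from the pixel-matching component might be computed along these lines; this is a simplified sketch, and the application's actual scoring also involves contour and edge matching, which are not shown:

```python
def percent_unmatched(img_a, img_b):
    """Percentage of pixel positions whose colour values differ.
    Images are same-sized 2-D lists of colour values."""
    total = mismatched = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            mismatched += (pa != pb)
    return 100.0 * mismatched / total

a = [[1, 1], [2, 3]]
b = [[1, 1], [2, 9]]
print(percent_unmatched(a, b))  # -> 25.0
```

In practice such a score would be computed after aligning the query against each database image under the translation, rotation and scaling invariances the abstract mentions.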

  11. Design and Implementation of Agro-technical Extension Information System Based on Cloud Storage

    OpenAIRE

    Guo, Leifeng; Wang, Wensheng; Yang, Yong; Sun,Zhiguo

    2013-01-01

    In order to solve the problems of low efficiency and backward methods in agro-technical extension activities, this paper designed an agro-technical extension information system based on cloud storage technology. The paper studied key technologies such as the cloud storage service engine, cloud storage management node and cloud storage data node, and designed the overall architecture of the agro-technical extension information system based on cloud storage techno...

  12. CUSTOMER BASED BRAND EQUITY DETERMINANTS ON BRAND EXTENSION IN TELEVISION INDUSTRY

    Directory of Open Access Journals (Sweden)

    V.Vetrivel

    2015-07-01

    Full Text Available The aim of this paper is to find out the relationship between customer based brand equity and brand extension, and also to compare the selected customer based brand equity determinants of brand awareness, brand association, brand trust, customer satisfaction and CBBE. The structured questionnaire was distributed among the respondents following the simple random sampling technique. Questionnaires were distributed among 550 respondents, out of which only 517 were properly filled. Data was analyzed through SPSS, and relationships between the proposed variables were checked through the Pearson correlation method. The findings of this study show a significant positive relationship among the five factors of customer based brand equity determinants. The relationships among CBBE, brand trust, brand awareness and customer satisfaction were found to be very strong, while the relationship between these four determinants and brand awareness was relatively weaker. This paper indicates the implications for brand managers to build and manage better strategies for these determinants of CBBE; it provides a clear path to successful brand extension opportunities.
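The Pearson correlation used in the analysis is the standard statistic (SPSS computes the same quantity); a minimal sketch with illustrative scores, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# e.g. hypothetical brand-trust vs. satisfaction scores from five respondents
print(round(pearson_r([1, 2, 3, 4, 5], [2, 4, 5, 4, 6]), 3))  # -> 0.853
```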

  13. An Extensible, Kinematically-Based Gesture Annotation Scheme

    OpenAIRE

    Martell, Craig H.

    2002-01-01

    Chapter 1 in the book: Advances in Natural Multimodal Dialogue Systems Annotated corpora have played a critical role in speech and natural language research; and, there is an increasing interest in corpora-based research in sign language and gesture as well. We present a non-semantic, geometrically-based annotation scheme, FORM, which allows an annotator to capture the kinematic information in a gesture just from videos of speakers. In addition, FORM stores this gestural in...

  14. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is used. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, giving the DIS map. This serves as prior knowledge about the likely region segmentation for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
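The first stage, the per-pixel edge-strength (DIS) map, can be sketched as below. The particular measure used here, the maximum absolute intensity difference to the 4-neighbours, is an assumption for illustration and not necessarily the paper's exact definition; the full method layers K-means clustering, the MRF model and watershed segmentation on top of such a map:

```python
def dis_map(img):
    """Per-pixel edge strength: maximum absolute intensity difference
    between each pixel and its 4-neighbours (border pixels use the
    neighbours that exist)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            diffs = [abs(img[i][j] - img[i + di][j + dj])
                     for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                     if 0 <= i + di < h and 0 <= j + dj < w]
            out[i][j] = max(diffs)
    return out

img = [[0, 0, 9],
       [0, 0, 9],
       [0, 0, 9]]
print(dis_map(img))  # -> [[0, 9, 9], [0, 9, 9], [0, 9, 9]]
```

High values in the map mark strong edges along the 0/9 boundary; thresholding it separates weak from strong edges, the distinction the abstract refers to.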

  15. Definition of data bases, codes, and technologies for cable life extension

    International Nuclear Information System (INIS)

    The substantial number of cables inside containment for a typical nuclear facility provides a strong motivation to extend cable life rather than replace cables. Hence, it is important to understand what information is necessary to accomplish life extension. This paper defines utility-specific as well as collective industry actions that would facilitate extending cable life. The focus of these recommendations is (1) to more realistically define the environmental profiles during which cables must function, (2) to better understand the validity of accelerated aging methodology through examination of naturally aged cables, (3) to better understand the validity of accelerated aging methodology via selected experimentation, (4) to support cable aging analysis by improving nonproprietary data bases, (5) to reduce the impact of the design basis accident assumptions on cable performance so additional cable aging can be accommodated during extended life, and (6) to complement life predictions with more powerful cable condition monitoring techniques than those currently available

  16. Extensive Taguchi's Quality Loss Function Based On Asymmetric tolerances

    Institute of Scientific and Technical Information of China (English)

    ZHU Wei; LI Yuan-sheng; LIU Feng

    2004-01-01

    When the specification interval is asymmetric, the basic specification is the target value of the quality characteristic. In this paper Taguchi's quality loss function is applied to describe quality loss based on asymmetric tolerances. A measure of the quality loss caused by the deviation of the quality characteristic from the basic specification is further presented.

  17. Rapid Tooling Technique Based on Stereolithograph Prototype

    Institute of Scientific and Technical Information of China (English)

    丁浩; 狄平; 顾伟生; 朱世根

    2001-01-01

    A rapid tooling technique based on the stereolithograph prototype is investigated. The epoxy tooling process is elucidated, with detailed analysis of the castable epoxy resin formulation, the curing process, and release agents. A transitional plaster model is also proposed. Using this technique, a mold for encapsulating mutual inductors in epoxy and a mold for injection-molding plastic soapboxes were made. The tooling requires very little time and cost, since the process only needs to achieve a faithful replica of the prototype, which benefits trial and small-batch production.

  18. A Theoretical Extension of the Consumption-based CAPM Model

    OpenAIRE

    Jingyuan Li; Georges Dionne

    2010-01-01

    We extend the Consumption-based CAPM (C-CAPM) model for representative agents with different risk attitudes. We introduce the concept of expectation dependence and show that for a risk averse representative agent, it is the first-degree expectation dependence rather than the covariance that determines C-CAPM’s riskiness. We extend the assumption of risk aversion to prudence and provide a weaker dependence condition than first-degree expectation dependence to obtain the values of asset price a...

  19. Definition of data base, code, and technologies for cable life extension

    International Nuclear Information System (INIS)

    The substantial number of cables inside containment for a typical nuclear facility provides a strong motivation to extend cable life rather than replace cables as part of an overall plant life extension strategy. Hence, it is important to understand what information is necessary to accomplish life extension. This paper defines utility-specific as well as collective-industry actions that would facilitate extending cable life. The focus of these recommendations is (1) to more realistically define the environmental profiles during which cables must function, (2) to define plant configuration and operational changes which may enhance cable life, (3) to better understand the validity of accelerated aging methodology through examination of naturally aged cables, (4) to better understand the validity of accelerated aging methodology via selected experimentation, (5) to support cable aging analysis by improving nonproprietary data bases, (6) to reduce the impact of the design basis accident assumptions on cable performance so additional cable aging can be accommodated during extended life, and (7) to complement life predictions with more effective cable condition monitoring techniques than those currently available

  20. The Liquisolid Technique: Based Drug Delivery System

    Directory of Open Access Journals (Sweden)

    Izhar Ahmed Syed

    2012-04-01

    The “Liquisolid” technique is a novel and promising approach to solubility enhancement and dissolution improvement, thereby increasing bioavailability. It holds liquid medications in powdered form and is an efficient method for formulating both water-insoluble and water-soluble drugs. The technique is based on admixing drug-loaded solutions with appropriate carrier and coating materials. The use of a non-volatile solvent improves wettability and ensures molecular dispersion of the drug in the formulation, leading to enhanced solubility. By using hydrophobic carriers (with non-volatile solvents), modified (sustained) release of drugs can be obtained with this technique. Liquisolid systems are characterized by flow behavior, wettability, powder-bed hydrophilicity, saturation solubility, drug content, differential scanning calorimetry, Fourier-transform infrared spectroscopy, powder X-ray diffraction, scanning electron microscopy, in-vitro release and in-vivo evaluation. Using this technique, solubility and dissolution rate can be improved, and sustained drug delivery systems can be developed for water-soluble drugs.

  1. Development of CDMS-II Surface Event Rejection Techniques and Their Extensions to Lower Energy Thresholds

    Energy Technology Data Exchange (ETDEWEB)

    Hofer, Thomas James [Univ. of Minnesota, Minneapolis, MN (United States)

    2014-12-01

    The CDMS-II phase of the Cryogenic Dark Matter Search, a dark matter direct-detection experiment, was operated at the Soudan Underground Laboratory from 2003 to 2008. The full payload consisted of 30 ZIP detectors, totaling approximately 1.1 kg of Si and 4.8 kg of Ge, operated at temperatures of 50 mK. The ZIP detectors read out both ionization and phonon pulses from scatters within the crystals; channel segmentation and analysis of pulse timing parameters allowed effective fiducialization of the crystal volumes and background rejection sufficient to set world-leading limits at the times of their publications. A full re-analysis of the CDMS-II data was motivated by an improvement in the event reconstruction algorithms which improved the resolution of ionization energy and timing information. The Ge data were re-analyzed using three distinct background-rejection techniques; the Si data from runs 125 - 128 were analyzed for the first time using the most successful of the techniques from the Ge re-analysis. The results of these analyses prompted a novel "mid-threshold" analysis, wherein energy thresholds were lowered but background rejection using phonon timing information was still maintained. This technique proved to have significant discrimination power, maintaining adequate signal acceptance and minimizing background leakage. The primary background for CDMS-II analyses comes from surface events, whose poor ionization collection makes them difficult to distinguish from true nuclear recoil events. The novel detector technology of SuperCDMS, the successor to CDMS-II, uses interleaved electrodes to achieve full ionization collection for events occurring at the top and bottom detector surfaces. This, along with dual-sided ionization and phonon instrumentation, allows for excellent fiducialization and relegates the surface-event rejection techniques of CDMS-II to a secondary level of background discrimination. 
Current and future SuperCDMS results hold great promise for mid- to low

  2. A Smalltalk-based extension to traditional Geographic Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    Korp, P.A.; Lurie, G.R.; Christiansen, J.H. [Argonne National Lab., IL (United States). Advanced Computer Applications Group

    1995-11-01

    The Dynamic Environmental Effects Model© (DEEM), under development at Argonne National Laboratory, is a fully object-based modeling software system that supports distributed, dynamic representation of the interlinked processes and behavior of the earth's surface and near-surface environment, at variable scales of resolution and aggregation. Many of these real-world objects are not stored in a format conducive to efficient GIS usage. Their dynamic nature, complexity and the number of possible DEEM entity classes precluded efficient integration with traditional GIS technologies due to the loosely coupled nature of their data representations. To address these shortcomings, an intelligent object-oriented GIS engine (OOGIS) was developed. This engine provides not only a spatially optimized object representation, but also direct linkages to the underlying object, its data and behaviors.

  3. Interactive early warning technique based on SVDD

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    After reviewing current research on early warning, it is found that "bad" data for some systems are not easy to obtain, which makes the methods proposed in that research unsuitable for the monitored systems. An interactive early warning technique based on SVDD (support vector data description) is proposed that adopts "good" data as samples to overcome the difficulty of obtaining "bad" data. The process consists of two parts: (1) a hypersphere is fitted to the "good" data using SVDD; if a data object falls outside the hypersphere, it is taken as "suspicious"; (2) a group of experts decides whether the suspicious data are "bad" or "good", and early warning messages are issued according to these decisions. The detailed implementation process is also presented. Finally, an experiment based on data from a macroeconomic system is conducted to verify the proposed technique.
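The hypersphere test in step (1) can be illustrated with a deliberately simplified stand-in for SVDD: the centre is the mean of the "good" samples and the radius is the largest training distance. A real SVDD solves a quadratic program and can use kernels; everything below, including the sample points, is illustrative.

```python
def fit_hypersphere(good_points):
    """Simplified stand-in for SVDD: centre = mean of the "good" samples,
    radius = largest distance from the centre to any training sample."""
    dim = len(good_points[0])
    center = [sum(p[i] for p in good_points) / len(good_points) for i in range(dim)]

    def dist(p):
        return sum((p[i] - center[i]) ** 2 for i in range(dim)) ** 0.5

    radius = max(dist(p) for p in good_points)
    return center, radius, dist

def is_suspicious(p, radius, dist):
    # points outside the fitted hypersphere are flagged for expert review
    return dist(p) > radius

good = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.0, 0.8)]
center, radius, dist = fit_hypersphere(good)
print(is_suspicious((5.0, 5.0), radius, dist))  # → True
print(is_suspicious((1.1, 1.0), radius, dist))  # → False
```

In the proposed process, the `True` cases would be passed to the expert group in step (2) rather than triggering warnings directly.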

  4. Performance Evaluation of Extension Education Centers in Universities Based on the Balanced Scorecard

    Science.gov (United States)

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-01-01

    This study aims at developing a set of appropriate performance evaluation indices mainly based on balanced scorecard (BSC) for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts who have real practical experiences in extension education, adequate performance…

  5. Laser Remote Sensing: Velocimetry Based Techniques

    Science.gov (United States)

    Molebny, Vasyl; Steinvall, Ove

    Laser-based velocity measurement is an area of the field of remote sensing where the coherent properties of laser radiation are the most exposed. Much of the published literature deals with the theory and techniques of remote sensing. We restrict our discussion to current trends in this area, gathered from recent conferences and professional journals. Remote wind sensing and vibrometry are promising in their new scientific, industrial, military, and biomedical applications, including improving flight safety, precise weapon correction, non-contact mine detection, optimization of wind farm operation, object identification based on its vibration signature, fluid flow studies, and vibrometry-associated diagnosis.

  6. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

    Tag clouds are one of the navigation aids for exploring documents; tag clouds also link documents through user-defined terms. We explore various graph-based techniques to improve tag cloud generation. Moreover, we introduce relevance measures based on underlying data, such as ratings or citation counts, for improved measurement of the relevance of tag clouds. We show that, on the given data sets, our approach outperforms the state-of-the-art baseline methods with respect to such relevance by 41% on the Movielens dataset and by 11% on the Bibsonomy dataset.
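A toy illustration of rating-weighted tag relevance in the spirit of the abstract (not the paper's actual measures): score each tag by usage count weighted by the average rating of the items it annotates, then keep the top-k tags for the cloud. The tag names and statistics are invented.

```python
def build_tag_cloud(tag_stats, k=3):
    """Rank tags by usage count weighted by the average rating of the
    tagged items; a stand-in for rating/citation-based relevance measures."""
    scored = {tag: count * avg_rating
              for tag, (count, avg_rating) in tag_stats.items()}
    return sorted(scored, key=scored.get, reverse=True)[:k]

# tag -> (usage count, average rating of tagged items)
stats = {"python": (120, 4.5), "java": (150, 3.0),
         "graphs": (40, 4.8), "misc": (300, 1.0)}
print(build_tag_cloud(stats))  # → ['python', 'java', 'misc']
```

Note that raw popularity alone would rank "misc" first; the rating weight demotes it, which is the point of relevance measures based on underlying data.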

  7. Resection of giant ethmoid osteoma with orbital and skull base extension followed by duraplasty

    Directory of Open Access Journals (Sweden)

    Ferekidou Eliza

    2008-10-01

    Abstract Background Osteomas of the ethmoid sinus are rare, especially when they involve the anterior skull base and orbit and lead to ophthalmologic and neurological symptoms. Case presentation The present case describes a giant ethmoid osteoma. The patient's symptoms and signs were exophthalmos and proptosis of the left eye, with progressive visual acuity impairment and visual field defects. CT/MRI scanning demonstrated a huge osseous lesion of the left ethmoid sinus (6.5 cm × 5 cm × 2.2 cm), extending laterally into the orbit and cranially up to the anterior skull base. Bilateral extensive polyposis was also found. Endoscopic and external techniques were combined to remove the lesion. Bilateral endoscopic polypectomy, anterior and posterior ethmoidectomy and middle meatus antrostomy were performed. Finally, the remaining part of the tumor was reached and dissected from the surrounding tissue via a minimally invasive Lynch incision around the left middle canthus. During surgery, CSF rhinorrhea was observed; the leak was grafted with fascia lata and sealed with bio-glue. Postoperatively, the symptoms disappeared. Eighteen months after surgery, the patient is still free of symptoms. Conclusion Before management of ethmoid osteomas with intraorbital and skull base extension, a thorough neurological, ophthalmological and imaging evaluation is required in order to define the borders of the tumor, carefully survey the severity of symptoms and signs, and precisely plan the optimal treatment. The endoscopic procedure can constitute an important part of surgery undertaken for giant ethmoidal osteomas. In addition, surgeons always have to take into account a possible CSF leak and be prepared to resolve it.

  8. Language Based Techniques for Systems Biology

    DEFF Research Database (Denmark)

    Pilegaard, Henrik

    Process calculus is the common denominator for a class of compact, idealised, domain-specific formalisms normally associated with the study of reactive concurrent systems within Computer Science. With the rise of the interaction-centred science of Systems Biology a number of bio-inspired process calculi have similarly been used for the study of bio-chemical reactive systems. In this dissertation it is argued that techniques rooted in the theory and practice of programming languages, language based techniques if you will, constitute a strong basis for the investigation of models of biological systems as formalised in a process calculus. In particular it is argued that Static Program Analysis provides a useful approach to the study of qualitative properties of such models. In support of this claim a number of static program analyses are developed for Regev’s BioAmbients – a bio-inspired variant...

  9. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    Science.gov (United States)

    Gaydos, Leonard

    1978-01-01

    Computer-aided techniques for interpreting multispectral data acquired by Landsat offer economies in the mapping of land cover. Even so, the actual establishment of the statistical classes, or "signatures," is one of the relatively more costly operations involved. Analysts have therefore been seeking cost-saving signature extension techniques that would accept training data acquired for one time or place and apply them to another. Opportunities to extend signatures occur in preprocessing steps and in the classification steps that follow. In the present example, land cover classes were derived by the simplest and most direct form of signature extension: Classes statistically derived from a Landsat scene for the Puget Sound area, Wash., were applied to the Portland area, Oreg., using data for the next Landsat scene acquired less than 25 seconds down orbit. Many features can be recognized on the reduced-scale version of the Portland land cover map shown in this report, although no statistical assessment of its accuracy is available.
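Signature extension in its simplest form, as described above, means deriving per-class statistics from one scene and applying them unchanged to another. A minimal sketch with a nearest-class-mean (minimum-distance) classifier follows; the class names and two-band spectral values are invented for illustration.

```python
def train_signatures(samples):
    """Derive per-class mean spectral signatures from one scene."""
    sums, counts = {}, {}
    for label, vec in samples:
        counts[label] = counts.get(label, 0) + 1
        sums[label] = [s + v for s, v in
                       zip(sums.get(label, [0.0] * len(vec)), vec)]
    return {c: [s / counts[c] for s in sums[c]] for c in sums}

def classify(vec, signatures):
    """Apply the signatures unchanged to pixels from an adjacent scene:
    the simplest and most direct form of signature extension."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(signatures, key=lambda c: d2(vec, signatures[c]))

# training pixels (class label, two-band radiance) from the "source" scene
train = [("water", [10, 12]), ("water", [12, 14]),
         ("forest", [40, 60]), ("forest", [44, 64])]
sigs = train_signatures(train)
# pixels from the adjacent "target" scene, classified with the same signatures
print(classify([11, 13], sigs))  # → water
print(classify([42, 61], sigs))  # → forest
```

The economy comes from reusing `sigs` across scenes; the accuracy question the report raises is whether those statistics remain valid off the training scene.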

  10. Biometric Iris Recognition Based on Hybrid Technique

    Directory of Open Access Journals (Sweden)

    Khattab M. Ali Alheeti

    2011-12-01

    Iris recognition is one of the important biometric recognition systems that identify people based on their eyes and iris. In this paper the iris recognition algorithm is implemented via histogram equalization and wavelet techniques. The approach is implemented in several steps, concentrated on image capturing, enhancement and identification. Different types of edge detection mechanisms (the Canny, Prewitt, Roberts and Sobel schemes) are used to detect iris boundaries in digital images of the eye. The implemented system gives adequate results for different types of iris images.
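Histogram equalization, the enhancement step mentioned above, can be sketched in a few lines on a flat list of grey levels, with no imaging library. This is the standard CDF-based mapping, not the paper's specific implementation.

```python
def equalize(pixels, levels=256):
    """Histogram equalization on a flat list of grey levels (0..levels-1)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:                      # cumulative distribution of grey levels
        total += h
        cdf.append(total)
    n = len(pixels)
    cdf_min = next(c for c in cdf if c > 0)
    # standard equalization mapping: spread the CDF over the full range
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
            for p in pixels]

# a low-contrast patch occupying only grey levels 100..103
low_contrast = [100, 100, 101, 101, 102, 102, 103, 103]
print(equalize(low_contrast))  # → [0, 0, 85, 85, 170, 170, 255, 255]
```

The narrow 100..103 band is stretched to the full 0..255 range, which is why equalization helps the subsequent edge detection stage find iris boundaries.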

  11. Combined surgical and catheter-based treatment of extensive thoracic aortic aneurysm and aortic valve stenosis

    DEFF Research Database (Denmark)

    De Backer, Ole; Lönn, Lars; Søndergaard, Lars

    2015-01-01

    An extensive thoracic aortic aneurysm (TAA) is a potentially life-threatening condition and remains a technical challenge to surgeons. Over the past decade, repair of aortic arch aneurysms has been accomplished using both hybrid (open and endovascular) and totally endovascular techniques. Thoraci...

  12. On HTML and XML based web design and implementation techniques

    International Nuclear Information System (INIS)

    Web implementation is truly a multidisciplinary field with influences from programming, choosing of scripting languages, graphic design, user interface design, and database design. The challenge of a Web designer/implementer is his ability to create an attractive and informative Web. To work with the universal framework and link diagrams from the design process as well as the Web specifications and domain information, it is essential to create Hypertext Markup Language (HTML) or other software and multimedia to accomplish the Web's objective. In this article we will discuss Web design standards and the techniques involved in Web implementation based on HTML and Extensible Markup Language (XML). We will also discuss the advantages and disadvantages of HTML over its successor XML in designing and implementing a Web. We have developed two Web pages, one utilizing the features of HTML and the other based on the features of XML to carry out the present investigation. (author)
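The well-formedness requirement that separates XML from browser-tolerated HTML can be demonstrated with Python's standard `xml.etree` parser; the markup below is a made-up example.

```python
import xml.etree.ElementTree as ET

# XML demands well-formed markup: every tag closed, names case-sensitive.
good = "<page><title>Demo</title><body>Hello</body></page>"
root = ET.fromstring(good)
print(root.find("title").text)  # → Demo

# The same content with an unclosed <title> tag, which a browser would
# tolerate as HTML, is rejected outright by an XML parser.
bad = "<page><title>Demo<body>Hello</body></page>"
try:
    ET.fromstring(bad)
except ET.ParseError:
    print("not well-formed")  # → not well-formed
```

This strictness is one of the trade-offs the article weighs: XML is harder to hand-author but far easier to process reliably.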

  13. Artificial Intelligence based technique for BTS placement

    International Nuclear Information System (INIS)

    The increase of base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners; this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique objectively takes neighbour and regulatory considerations into account while determining the cell site. Its application leads to a quantitatively unbiased decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm; results obtained show a 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a GA with the neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out
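A minimal genetic-algorithm sketch with a neighbour (minimum-separation) constraint, in the spirit of the abstract. All coordinates, radii, penalty values and GA parameters are illustrative assumptions, not the paper's settings.

```python
import random

def fitness(site, demand_points, neighbours, cover_radius=1.0, min_sep=0.5):
    """Coverage score with a hard penalty when the candidate site violates
    the neighbour (minimum-separation) constraint."""
    if any((site[0] - n[0]) ** 2 + (site[1] - n[1]) ** 2 < min_sep ** 2
           for n in neighbours):
        return -1  # neighbour/regulatory violation
    return sum(1 for d in demand_points
               if (site[0] - d[0]) ** 2 + (site[1] - d[1]) ** 2
               <= cover_radius ** 2)

def ga_place(demand_points, neighbours, gens=40, pop_size=20, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 3), rng.uniform(0, 3)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda s: fitness(s, demand_points, neighbours),
                 reverse=True)
        parents = pop[:pop_size // 2]      # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)  # midpoint crossover + mutation
            children.append(((a[0] + b[0]) / 2 + rng.gauss(0, 0.1),
                             (a[1] + b[1]) / 2 + rng.gauss(0, 0.1)))
        pop = parents + children
    return max(pop, key=lambda s: fitness(s, demand_points, neighbours))

demand = [(1.0, 1.0), (1.2, 1.1), (0.9, 1.3)]
existing = [(2.5, 2.5)]  # an already-licensed neighbour BTS
best = ga_place(demand, existing)
print(fitness(best, demand, existing))  # typically covers all three demand points
```

The hard penalty makes constraint-violating sites lose every selection round, which is one simple way to encode the neighbour constraint the paper describes.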

  14. "YFlag"--a single-base extension primer based method for gender determination.

    Science.gov (United States)

    Allwood, Julia S; Harbison, Sally Ann

    2015-01-01

    Assigning the gender of a DNA contributor in forensic analysis is typically achieved using the amelogenin test. Occasionally, this test produces false-positive results due to deletions occurring on the Y chromosome. Here, a four-marker "YFlag" method is presented to infer gender using single-base extension primers to flag the presence (or absence) of Y-chromosome DNA within a sample to supplement forensic STR profiling. This method offers built-in redundancy, with a single marker being sufficient to detect the presence of male DNA. In a study using 30 male and 30 female individuals, detection of male DNA was achieved with c. 0.03 ng of male DNA. All four markers were present in male/female mixture samples despite the presence of excessive female DNA. In summary, the YFlag system offers a method that is reproducible, specific, and sensitive, making it suitable for forensic use to detect male DNA. PMID:25354446
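The built-in redundancy of the four-marker design (any single marker suffices to flag male DNA) can be expressed as a simple presence test. The marker names and the signal threshold below are hypothetical, not the published assay values.

```python
def infer_gender(marker_peaks, threshold=50):
    """Infer the presence of male DNA from four Y-marker signal intensities.
    Any single marker above threshold suffices (built-in redundancy)."""
    present = [name for name, rfu in marker_peaks.items() if rfu >= threshold]
    if present:
        return ("male DNA detected", present)
    return ("no male DNA detected", [])

# hypothetical peak heights; two of four markers dropped out, yet the
# sample is still flagged as containing male DNA
sample = {"Y1": 820, "Y2": 0, "Y3": 640, "Y4": 30}
print(infer_gender(sample))  # → ('male DNA detected', ['Y1', 'Y3'])
```

This redundancy is what protects against the Y-deletion false negatives that affect the single-locus amelogenin test.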

  15. Extensions of linear regression models based on set arithmetic for interval data

    OpenAIRE

    Blanco-Fernández, Angela; García-Bárzana, Marta; Colubi, Ana; Kontoghiorghes, Erricos J.

    2012-01-01

    Extensions of previous linear regression models for interval data are presented. A more flexible simple linear model is formalized. The new model may express cross-relationships between mid-points and spreads of the interval data in a unique equation based on interval arithmetic. Moreover, extensions to the multiple case are addressed. The associated least-squares estimation problems are solved. Empirical results and a real-life application are presented in order to show the applicability ...
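A simplified sketch of the mid/spread idea: represent each interval as (mid-point, spread) and fit separate least-squares lines to the mids and to the spreads. The paper's model couples the two through interval arithmetic in a single equation; this separable version and its data are only illustrative.

```python
def ols(x, y):
    """Plain least-squares fit returning (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# intervals encoded as (mid, spread); fit mids and spreads separately
x_mid, x_spr = [1.0, 2.0, 3.0, 4.0], [0.1, 0.2, 0.1, 0.3]
y_mid, y_spr = [2.1, 4.0, 6.1, 7.9], [0.2, 0.4, 0.2, 0.6]
print(ols(x_mid, y_mid))  # slope ≈ 2 for the mid-points
print(ols(x_spr, y_spr))  # slope = 2 for the spreads (exact in this toy data)
```

A model in the paper's spirit would additionally let the response spread depend on the regressor mid-point (and vice versa), which is the "cross-relationship" the abstract refers to.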

  16. Performance Based Novel Techniques for Semantic Web Mining

    Directory of Open Access Journals (Sweden)

    Mahendra Thakur

    2012-01-01

    The explosive growth in the size and use of the World Wide Web continuously creates new challenges and needs. The need to predict users' preferences in order to expedite and improve browsing through a site can be met by personalizing the website. Most research efforts in web personalization correspond to the evolution of extensive research in web usage mining, i.e. the exploitation of the navigational patterns of the web site's visitors. When a personalization system relies solely on usage-based results, however, valuable information conceptually related to what is finally recommended may be missed. Moreover, the structural properties of the web site are often disregarded. In this paper, we propose novel techniques that use the content semantics and the structural properties of a web site in order to improve the effectiveness of web personalization. In the first part of our work we present a Semantic Web Personalization system that integrates usage data with content semantics, expressed in ontology terms, in order to compute semantically enhanced navigational patterns and effectively generate useful recommendations. To the best of our knowledge, our proposed technique is the only semantic web personalization system that may be used by non-semantic web sites. In the second part of our work, we present a novel approach for enhancing the quality of recommendations based on the underlying structure of a web site. We introduce UPR (Usage-based PageRank), a PageRank-style algorithm that relies on the recorded usage data and link analysis techniques. Overall, we demonstrate that our proposed hybrid personalization framework results in more objective and representative predictions than existing techniques.
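A PageRank-style iteration biased by recorded usage, in the spirit of UPR: the teleport distribution is weighted by page visits instead of being uniform. The exact UPR formulation is not reproduced here; the visit-biased teleport term, the link graph and the visit counts are assumptions for illustration.

```python
def usage_pagerank(links, visits, d=0.85, iters=50):
    """PageRank-style scoring where the teleport distribution is biased by
    recorded page visits (a sketch, not the exact UPR algorithm)."""
    pages = list(links)
    total_visits = sum(visits.values())
    prior = {p: visits.get(p, 0) / total_visits for p in pages}
    rank = dict(prior)
    for _ in range(iters):
        new = {}
        for p in pages:
            # rank flowing in along links, split evenly over each outlink set
            incoming = sum(rank[q] / len(links[q]) for q in pages
                           if p in links[q])
            new[p] = (1 - d) * prior[p] + d * incoming
        rank = new
    return rank

links = {"home": ["a", "b"], "a": ["b"], "b": ["home"]}
visits = {"home": 100, "a": 20, "b": 30}
r = usage_pagerank(links, visits)
print(max(r, key=r.get))  # → home
```

Pure link analysis on this tiny graph would favour "b" (two in-links); the usage bias shifts the ranking toward the heavily visited "home" page, which is the kind of hybrid signal the paper exploits.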

  17. The Knowledge Base as an Extension of Distance Learning Reference Service

    Science.gov (United States)

    Casey, Anne Marie

    2012-01-01

    This study explores knowledge bases as extension of reference services for distance learners. Through a survey and follow-up interviews with distance learning librarians, this paper discusses their interest in creating and maintaining a knowledge base as a resource for reference services to distance learners. It also investigates their perceptions…

  18. Process flow measurement based on tracer techniques

    International Nuclear Information System (INIS)

    Flow measurement methods based on tracer techniques are the transit-time method and methods based on tracer dilution. These methods can be applied to the on-site calibration of flowmeters and to measuring the flow rate where no flowmeter is installed. The accuracy of the tracer methods depends on the prevailing measuring conditions. In this report the accuracy of the transit-time method under field conditions is estimated to be 1-2% at the 99.7% confidence level. The accuracy of the isotope dilution method is estimated as slightly better, about 0.5% at its best. An even better accuracy, about 0.2%, could be achieved by developing the method and the measuring equipment. Tests were carried out with the transit-time method for water and steam flow. While measuring water flow, the effect of different measuring parameters upon the repeatability of the method was examined, such as the number of detectors and the distance between the measuring points. Different means of tracer injection were tested as well; these had less effect than expected. The accuracies achieved in steam flow measurements were of the same order of magnitude as in water flow measurements. The tracers used were 137mBa for water flow and 41Ar for steam flow measurements
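The two measurement principles reduce to simple mass-balance and volume/time relations. The sketch below uses the standard steady-state dilution balance q·C1 = (Q + q)·C2 and the transit-time relation Q = V/t; all numerical values are illustrative, not from the report.

```python
def dilution_flow(injection_rate, c_injected, c_downstream, c_background=0.0):
    """Steady-state tracer dilution: from the mass balance
    q*C1 = (Q + q)*C2, the process flow is Q = q*(C1 - C2)/(C2 - C0)."""
    return (injection_rate * (c_injected - c_downstream)
            / (c_downstream - c_background))

def transit_time_flow(pipe_volume, transit_time):
    """Transit-time method: Q = V / t for the known volume between
    two tracer detectors."""
    return pipe_volume / transit_time

# tracer injected at 0.01 L/s with concentration 1e6 units/L;
# fully mixed downstream concentration measured as 100 units/L
print(dilution_flow(0.01, 1e6, 100.0))  # → 99.99 (L/s)
# 50 L of pipe volume traversed by the tracer pulse in 2.5 s
print(transit_time_flow(50.0, 2.5))     # → 20.0 (L/s)
```

The dilution result shows why the method needs no flowmeter at all: only the injection rate and two concentrations enter the calculation.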

  19. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    Science.gov (United States)

    Vallianatos, Filippos

    2015-04-01

    Despite the extreme complexity that characterizes earthquake generation process, simple phenomenology seems to apply in the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder, how can the collective properties of a set formed by all earthquakes in a given region, be derived and how does the structure of seismicity depend on its elementary constituents - the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake making the use of statistical physics necessary to understand the collective properties of earthquakes. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the microscale and crack opening level to the level of large earthquakes? An answer to the previous question could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi) fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon introducing the new topic of non-extensive statistical seismology. 
This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project
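The central quantity of the non-extensive framework discussed above is the Tsallis entropy, which can be computed directly; this sketch sets the constant k to 1 and uses a toy uniform distribution.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k = 1.
    As q -> 1 it recovers the Boltzmann-Gibbs-Shannon entropy."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

uniform = [0.25] * 4
print(round(tsallis_entropy(uniform, 1.0), 4))  # → 1.3863 (= ln 4)
print(round(tsallis_entropy(uniform, 2.0), 4))  # → 0.75
```

The departure of the entropic index q from 1 is what encodes the long-range interactions and (multi)fractal structure invoked in the abstract.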

  20. A New Extension Theory-based Production Operation Method in Industrial Process

    Institute of Scientific and Technical Information of China (English)

    XU Yuan; ZHU Qunxiong

    2013-01-01

    To explore the problems of dynamic change in production demand and operating contradictions in the production process, a new extension theory-based production operation method is proposed. Its core is demand requisition, contradiction resolution and operation classification. For demand requisition, deep and comprehensive demand elements are collected by conjugating analysis. For contradiction resolution, conflicts between the demand and operating elements are solved by extension reasoning, extension transformation and consistency judgment. For operation classification, the operating importance among the operating elements is calculated by extension clustering so as to guide the production operation and ensure production safety. Through actual application in the cascade reaction process of high-density polyethylene (HDPE) at a chemical plant, case studies and comparison show that the proposed extension theory-based production operation method is significantly better than the traditional experience-based operation method in the actual production process, which opens a new way for research on production operating methods for industrial processes.

  1. Coherent Synchrotron Radiation A Simulation Code Based on the Non-Linear Extension of the Operator Splitting Method

    CERN Document Server

    Dattoli, Giuseppe

    2005-01-01

    Coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. A code devoted to the analysis of this type of problem should be fast and reliable: conditions that are usually hard to achieve at the same time. In the past, codes based on Lie algebraic techniques have been very efficient for treating transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treat CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of non-linear contributions due to wake field effects. The proposed solution method exploits an algebraic technique, using exponential operators implemented numerically in C++. We show that the integration procedure is capable of reproducing the onset of an instability and effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, parametric studies a...

  2. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim

    2015-05-21

    Base oils, blended for finished lubricant formulations, are classified by the American Petroleum Institute into five groups, viz., groups I-V. Groups I-III consist of petroleum-based hydrocarbons whereas groups IV and V are made of synthetic polymers. In the present study, five base oil samples belonging to groups I and III were extensively characterized using high performance liquid chromatography (HPLC), comprehensive two-dimensional gas chromatography (GC×GC), and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated, and then the information obtained was combined to reveal compositional details of the base oil samples studied. HPLC showed the overwhelming predominance of saturated over aromatic compounds in all five base oils. A similar trend was further corroborated using GC×GC, which yielded semiquantitative information on the compound classes present in the samples and provided further details on the carbon number distributions within these classes. In addition to the chromatography methods, FT-ICR MS supplemented the compositional information on the base oil samples by resolving the aromatic compounds into alkyl- and naphtheno-substituted families. APCI proved more effective than APPI for the ionization of the highly saturated base oil components. Furthermore, in addition to detailed information on hydrocarbon molecules, FT-ICR MS revealed the presence of saturated and aromatic sulfur species in all base oil samples. The results presented herein offer a unique perspective on the detailed molecular structure of base oils typically used to formulate lubricants. © 2015 American Chemical Society.

  3. Record extension for short-gauged water quality parameters using a newly proposed robust version of the line of organic correlation technique

    Directory of Open Access Journals (Sweden)

    B. Khalil

    2012-04-01

    Full Text Available In many situations the extension of hydrological or water quality time series at short-gauged stations is required. Ordinary least squares regression (OLS) of any hydrological or water quality variable is a traditional and commonly used record extension technique. However, OLS tends to underestimate the variance in the extended records, which leads to underestimation of high percentiles and overestimation of low percentiles, given that the data are normally distributed. The development of the line of organic correlation (LOC) technique is aimed at correcting this bias. On the other hand, the Kendall-Theil robust line (KTRL) method has been proposed as an analogue of OLS with the advantage of being robust in the presence of outliers. Given that water quality data are characterised by the presence of outliers, positive skewness and non-normal distributions, a robust record extension technique is more appropriate. In this paper, four record-extension techniques are described, and their properties are explored. These techniques are OLS, LOC, KTRL and a new technique proposed in this paper, the robust line of organic correlation technique (RLOC). RLOC retains the advantage of LOC in reducing the bias in estimating the variance, but at the same time it is also robust to the presence of outliers. A Monte Carlo study and an empirical experiment were conducted to examine the four techniques for the accuracy and precision of the estimates of statistical moments and over the full range of percentiles. Results of the Monte Carlo study showed that the OLS and KTRL techniques have serious deficiencies as record-extension techniques, while the LOC and RLOC techniques are nearly similar. However, RLOC outperforms OLS, KTRL and LOC when using real water quality records.
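The variance-preservation property that distinguishes LOC from OLS can be illustrated numerically. The sketch below uses synthetic data (all parameter values are illustrative assumptions, not taken from the paper): the LOC slope sign(r)·s_y/s_x avoids the shrinkage of the extended record's variance by |r| that the OLS slope r·s_y/s_x introduces.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(10, 2, 200)             # long-record (index) station
y = 1.5 * x + rng.normal(0, 1.5, 200)  # short-record station, overlap period

r = np.corrcoef(x, y)[0, 1]
b_ols = r * y.std() / x.std()            # OLS slope
b_loc = np.sign(r) * y.std() / x.std()   # LOC slope: no |r| shrinkage
a_ols = y.mean() - b_ols * x.mean()
a_loc = y.mean() - b_loc * x.mean()

x_new = rng.normal(10, 2, 1000)        # period to be filled in
y_ols = a_ols + b_ols * x_new
y_loc = a_loc + b_loc * x_new
# OLS shrinks the variance of the extension by |r|; LOC preserves it
print(y.std(), y_ols.std(), y_loc.std())
```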

  4. Record extension for short-gauged water quality parameters using a newly proposed robust version of the Line of Organic Correlation technique

    Directory of Open Access Journals (Sweden)

    B. Khalil

    2012-07-01

    Full Text Available In many situations the extension of hydrological or water quality time series at short-gauged stations is required. Ordinary least squares regression (OLS) of any hydrological or water quality variable is a traditional and commonly used record extension technique. However, OLS tends to underestimate the variance in the extended records, which leads to underestimation of high percentiles and overestimation of low percentiles, given that the data are normally distributed. The development of the line of organic correlation (LOC) technique is aimed at correcting this bias. On the other hand, the Kendall-Theil robust line (KTRL) method has been proposed as an analogue of OLS with the advantage of being robust in the presence of outliers. Given that water quality data are characterised by the presence of outliers, positive skewness and non-normal distributions, a robust record extension technique is more appropriate. In this paper, four record-extension techniques are described, and their properties are explored. These techniques are OLS, LOC, KTRL and a new technique proposed in this paper, the robust line of organic correlation technique (RLOC). RLOC retains the advantage of LOC in reducing the bias in estimating the variance, but at the same time it is also robust in the presence of outliers. A Monte Carlo study and an empirical experiment were conducted to examine the four techniques for the accuracy and precision of the estimates of statistical moments and over the full range of percentiles. Results of the Monte Carlo study showed that the OLS and KTRL techniques have serious deficiencies as record-extension techniques, while the LOC and RLOC techniques are nearly similar. However, RLOC outperforms OLS, KTRL and LOC when using real water quality records.

  5. Type extension trees

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We introduce type extension trees as a formal representation language for complex combinatorial features of relational data. Based on a very simple syntax, this language provides a unified framework for expressing features as diverse as embedded subgraphs on the one hand, and marginal counts of attribute values on the other. We show by various examples how many existing relational data mining techniques can be expressed as the problem of constructing a type extension tree and a discriminant function.

  6. Phase difference estimation method based on data extension and Hilbert transform

    International Nuclear Information System (INIS)

    To improve the precision and anti-interference performance of phase difference estimation for non-integer periods of sampling signals, a phase difference estimation method based on data extension and Hilbert transform is proposed. Estimated phase difference is obtained by means of data extension, Hilbert transform, cross-correlation, auto-correlation, and weighted phase average. Theoretical analysis shows that the proposed method suppresses the end effects of Hilbert transform effectively. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of phase difference estimation and has better performance of phase difference estimation than the correlation, Hilbert transform, and data extension-based correlation methods, which contribute to improving the measurement precision of the Coriolis mass flowmeter. (paper)
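A minimal sketch of the core idea, assuming an FFT-based Hilbert transform and a simple central average in place of the paper's weighted phase average (function names and test parameters are illustrative assumptions):

```python
import numpy as np

def analytic_signal(x):
    # FFT-based Hilbert transform: zero negative frequencies, double positives
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def phase_difference(x, y):
    # Per-sample phase difference from the analytic signals; averaging only
    # the central half suppresses the end effects of the Hilbert transform
    zx, zy = analytic_signal(x), analytic_signal(y)
    dphi = np.angle(zx * np.conj(zy))
    n = len(dphi)
    return dphi[n // 4: 3 * n // 4].mean()

fs, f = 1000.0, 50.0
t = np.arange(0, 0.47, 1 / fs)             # 23.5 periods: non-integer sampling
x = np.sin(2 * np.pi * f * t)
y = np.sin(2 * np.pi * f * t - np.pi / 6)  # y lags x by 30 degrees
print(np.degrees(phase_difference(x, y)))
```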

  7. Strategic Partnerships that Strengthen Extension's Community-Based Entrepreneurship Programs: An Example from Maine

    Science.gov (United States)

    Bassano, Louis V.; McConnon, James C., Jr.

    2011-01-01

    This article explains how Extension can enhance and expand its nationwide community-based entrepreneurship programs by developing strategic partnerships with other organizations to create highly effective educational programs for rural entrepreneurs. The activities and impacts of the Down East Micro-Enterprise Network (DEMN), an alliance of three…

  8. FLOWCER - a flowmeter based on radiotracer techniques

    International Nuclear Information System (INIS)

    One of the most difficult problems in the field of flow measurement is the lack of a portable, clamp-on type of flowmeter of good accuracy. This is a serious restriction in non-continuous flow measurements and on-site calibrations of flow meters. One possibility for constructing a meter capable of these measurements is to use tracer techniques, particularly radioisotope tracers. A flow measurement instrument, FLOWCER, has been developed in the Reactor Laboratory of the Technical Research Centre of Finland (VTT). The instrument is based on the radioisotope transit time method. The device can be used for the accurate instantaneous measurement of volume flow rate in ducts. The tracer used is 137mBa produced in a portable isotope generator. Because of the short half-life (2.6 min) of 137mBa, the measurement is radiologically very safe. The device consists of the isotope generator, an injection device for the tracer, radiation detectors, a data logger unit and a micro-computer. A transducer for quantities other than flow may also be connected to the analog input channels of the FLOWCER. The measurement program can be modified for measurements of different types. The FLOWCER has been used for the measurements of energy and material balances, for the on-site calibrations of flow meters and for pump efficiency analysis. The application most frequently used has been the on-site calibration of flow meters. According to present experience (over 100 calibrated flow meters), the accuracy level of flow measurements can be increased by a factor of ten or more by using the transit time method for on-site calibration
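The transit-time method boils down to timing the tracer cloud between two detectors a known distance apart and converting to a volume flow rate, Q = A·L/Δt. A hedged sketch with synthetic detector signals, where Δt is recovered from the cross-correlation peak (all geometry and signal parameters are assumptions for illustration, not FLOWCER specifics):

```python
import numpy as np

fs = 100.0   # sampling rate, Hz
L = 2.0      # detector spacing along the duct, m
A = 0.05     # duct cross-section, m^2

t = np.arange(0, 10, 1 / fs)
pulse1 = np.exp(-((t - 3.0) / 0.2) ** 2)           # tracer cloud at detector 1
delay = 1.25                                        # true transit time, s
pulse2 = np.exp(-((t - 3.0 - delay) / 0.2) ** 2)   # same cloud at detector 2

# Transit time from the cross-correlation peak of the two detector signals
lags = np.arange(-len(t) + 1, len(t)) / fs
xc = np.correlate(pulse2, pulse1, mode="full")
dt = lags[np.argmax(xc)]

Q = A * L / dt   # volume flow rate, m^3/s
print(dt, Q)
```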

  9. Light based techniques for improving health care: studies at RRCAT

    International Nuclear Information System (INIS)

    The invention of lasers in 1960, the phenomenal advances in photonics, and the information processing capability of computers have given a major boost to R and D activity on the use of light for high resolution biomedical imaging, sensitive non-invasive diagnosis and precision therapy. The effort has resulted in remarkable progress, and it is widely believed that light based techniques hold great potential to offer simpler, portable systems which can help provide diagnostics and therapy in a low resource setting. At Raja Ramanna Centre for Advanced Technology (RRCAT) extensive studies have been carried out on fluorescence spectroscopy of native tissue. This work led to two important outcomes: first, a better understanding of tissue fluorescence and insights on the possible use of fluorescence spectroscopy for screening of cancer, and second, the development of diagnostic systems that can serve as standalone tools for non-invasive screening of cancer of the oral cavity. Optical coherence tomography setups and their functional extensions (polarization sensitive, Doppler) have also been developed and used for high resolution (∼10 µm) biomedical imaging applications, in particular for non-invasive monitoring of the healing of wounds. Chlorophyll based photo-sensitisers and their derivatives have been synthesized in-house and used for photodynamic therapy of tumors in animal models and for antimicrobial applications. Various variants of optical tweezers (holographic, Raman etc.) have also been developed and utilised for different applications, notably Raman spectroscopy of optically trapped red blood cells. An overview of these activities carried out at RRCAT is presented in this article. (author)

  10. DCT-based cyber defense techniques

    Science.gov (United States)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content, in order to attack the end user. Most of the attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel, real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain, and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
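A rough sketch of a DCT-domain thresholding defense of the kind described, here in plain numpy on an 8×8 block (the threshold value and the policy of always keeping the DC term are illustrative assumptions, not the authors' exact algorithm):

```python
import numpy as np

def dct_matrix(N=8):
    # Orthonormal DCT-II basis: C[k, n] = s_k * cos(pi * (2n+1) * k / (2N))
    n = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)
    return C

def dct2(block, C=dct_matrix()):
    return C @ block @ C.T

def idct2(coeffs, C=dct_matrix()):
    return C.T @ coeffs @ C

def smart_threshold(block, tau=4.0):
    # Zero low-magnitude AC coefficients (a possible hiding place for an
    # embedded payload), always keeping the DC term; tau is hypothetical
    F = dct2(block.astype(float))
    mask = np.abs(F) >= tau
    mask[0, 0] = True
    return idct2(F * mask)

rng = np.random.default_rng(0)
blk = rng.integers(0, 256, (8, 8))
out = smart_threshold(blk)
print(out.shape)
```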

  11. Risk-Based Allowed Outage Time and Surveillance Test Interval Extensions for Angra 1

    Directory of Open Access Journals (Sweden)

    Sonia M. Orlando Gibelli

    2012-01-01

    Full Text Available In this work, Probabilistic Safety Assessment (PSA) is used to evaluate Allowed Outage Time (AOT) and Surveillance Test Interval (STI) extensions for three Angra 1 nuclear power plant safety systems. The interest in such an analysis lies in the fact that PSA comprises a risk-based tool for safety evaluation and has been increasingly applied to support both the regulatory and the operational decision-making processes. Regarding Angra 1, among other applications, PSA is meant to be an additional method that can be used by the utility to justify Technical Specification relaxation to the Brazilian regulatory body. The risk measure used in this work is the Core Damage Frequency, obtained from the Angra 1 Level 1 PSA study. AOT and STI extensions are evaluated for the Safety Injection, Service Water and Auxiliary Feedwater Systems using the SAPHIRE code. In order to compensate for the risk increase caused by the extensions, compensatory measures such as (1) testing the redundant train prior to entering maintenance and (2) a staggered test strategy are proposed. Results have shown that the proposed AOT extensions are acceptable for two of the systems with the implementation of compensatory measures, whereas STI extensions are acceptable for all three systems.
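The risk impact of an AOT extension is commonly quantified with the incremental conditional core damage probability (ICCDP), the increase in core damage probability accumulated while a train is out of service. The figures below are purely illustrative assumptions, not Angra 1 values:

```python
# ICCDP = (conditional CDF with train down - baseline CDF) * AOT
# All numbers are hypothetical, for illustration only.
cdf_baseline = 2.0e-5     # baseline core damage frequency, per year
ccdf_train_down = 8.0e-5  # conditional CDF with one train out of service
aot_hours = 72.0          # allowed outage time

iccdp = (ccdf_train_down - cdf_baseline) * aot_hours / 8760.0
print(f"{iccdp:.2e}")
```

A regulatory acceptance check would then compare this value against a threshold (e.g., a small fraction of the baseline yearly core damage probability).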

  12. Flood alert system based on bayesian techniques

    Science.gov (United States)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects on the city of Malaga (600000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision making process. The joint use of Artificial Neural Networks (ANN) and BN has enabled us to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit, and additionally contribute uncertainty estimates to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also

  13. Downward Price-Based Brand Line Extensions Effects on Luxury Brands

    Directory of Open Access Journals (Sweden)

    Marcelo Royo-Vela

    2015-07-01

    Full Text Available This study examines brand concept consistency, self-concept congruence and the resulting loyalty status of consumers in order to evaluate whether downward price-based line extensions in the luxury goods market have any negative or positive effect on them. By conducting focus groups and in-depth interviews, we examined how the brand concepts of luxury brands are perceived before and after a line extension. Results revealed that a crucial aspect for the evaluation of downward price-based line extensions is the exclusivity variable. Additionally, the research showed different modifications to brand concept consistency after an extension, depending on whether the brand is bought for purely hedonic or emotional reasons or for functional reasons. As a practical implication, brands appealing to hedonic/emotional motivations need to be clearly differentiated from those appealing to functional/rational motivations. In the case of a mixed concept, an in-depth segmentation of the target markets is needed in order to successfully reach the consumers’ needs.

  14. Association of Anterior and Lateral Extraprostatic Extensions with Base-Positive Resection Margins in Prostate Cancer

    Science.gov (United States)

    Abalajon, Mark Joseph; Jang, Won Sik; Kwon, Jong Kyou; Yoon, Cheol Yong; Lee, Joo Yong; Cho, Kang Su; Ham, Won Sik

    2016-01-01

    Introduction Positive surgical margins (PSM) detected in the radical prostatectomy specimen increase the risk of biochemical recurrence (BCR). Still, with a formidable number of patients never experiencing BCR in their lives, this inconsistency has been attributed to artifacts and to spontaneous regression of micrometastatic sites. To investigate the origin of margin-positive cancers, we examined the influence of extraprostatic extension location on the resection-margin-positive site and its implications for BCR risk. Materials & Methods The clinical information and follow-up data of 612 patients who had extraprostatic extension and a positive surgical margin at the time of robot-assisted radical prostatectomy (RARP) in a single center between 2005 and 2014 were modeled using Fine and Gray’s competing risk regression analysis for BCR. Extraprostatic extensions were divided into categories according to location as apex, base, anterior, posterior, lateral, and posterolateral. Extraprostatic extensions were defined as the presence of tumor beyond the borders of the gland in the posterior and posterolateral regions. Tumor admixed with periprostatic fat was additionally considered as extraprostatic extension if the capsule was vague in the anterior, apex, and base regions. Positive surgical margins were defined as the presence of tumor cells at the inked margin on inspection under microscopy. Association of these classifications with the site of PSM was evaluated by Cohen’s Kappa analysis for concordance and logistic regression for the odds of apical and base PSMs. Results Median follow-up duration was 36.5 months (interquartile range [IQR] 20.1–36.5). Apex involvement was found in 158 (25.8%) patients and base involvement in 110 (18.0%) patients. PSMs were generally found to be associated with increased risk of BCR regardless of location, with BCR risk highest for base PSM (HR 1.94, 95% CI 1.40–2.68, p<0.001) after adjusting for age, initial

  15. Physics based modeling of a series parallel battery pack for asymmetry analysis, predictive control and life extension

    Science.gov (United States)

    Ganesan, Nandhini; Basu, Suman; Hariharan, Krishnan S.; Kolake, Subramanya Mayya; Song, Taewon; Yeo, Taejung; Sohn, Dong Kee; Doo, Seokgwang

    2016-08-01

    Lithium-ion batteries used for electric vehicle applications are subject to large currents and various operating conditions, making battery pack design and life extension a challenging problem. With increase in complexity, modeling and simulation can lead to insights that ensure optimal performance and life extension. In this manuscript, an electrochemical-thermal (ECT) coupled model for a 6 series × 5 parallel pack is developed for Li ion cells with NCA/C electrodes and validated against experimental data. Contribution of the cathode to overall degradation at various operating conditions is assessed. Pack asymmetry is analyzed from a design and an operational perspective. Design-based asymmetry leads to a new approach for obtaining the individual cell responses of the pack from an average ECT output. Operational asymmetry is demonstrated in terms of the effects of thermal gradients on cycle life, and an efficient model predictive control technique is developed. The concept of a reconfigurable battery pack is studied using detailed simulations that can be used for effective monitoring and extension of battery pack life.

  16. Segmentation of Color Images Based on Different Segmentation Techniques

    OpenAIRE

    Purnashti Bhosale; Aniket Gokhale

    2013-01-01

    In this paper, we propose a color image segmentation algorithm based on different segmentation techniques. We recognize background objects such as the sky, ground, and trees based on color and texture information using various methods of segmentation. Threshold-based segmentation techniques, using both global and local threshold methods, are studied and compared with one another so as to choose the best technique for threshold segmentation. Further segmentation...
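Global thresholding of the kind compared above can be sketched with Otsu's method, which picks the threshold maximizing the between-class variance of the resulting two pixel classes (the bimodal test image below is a synthetic illustration):

```python
import numpy as np

def otsu_threshold(gray):
    # Global (Otsu) threshold for an 8-bit grayscale image, numpy only
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                # class-0 probability up to level k
    mu = np.cumsum(p * np.arange(256))  # class-0 cumulative mean
    mu_t = mu[-1]                       # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        # between-class variance for every candidate threshold k
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return int(np.nanargmax(sigma_b2))

# Bimodal test image: dark background, bright square
img = np.full((64, 64), 40, dtype=np.uint8)
img[16:48, 16:48] = 200
t = otsu_threshold(img)
print(t)
```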

  17. A giant vagal schwannoma with unusual extension from skull base to the mediastinum

    Directory of Open Access Journals (Sweden)

    Shenoy S Vijendra

    2015-01-01

    Full Text Available Cervical vagal schwannoma is an extremely rare neoplasm. Middle-aged people are usually affected. These tumors usually present as asymptomatic masses and are almost always benign. Preoperative diagnosis of these lesions is important due to the morbidity associated with their excision, yet preoperative tissue diagnosis is not accurate. Imaging can be used to assess the extent of the lesion and to plan treatment. Surgical excision with preservation of the nerve of origin is the treatment option. Giant vagal schwannomas are extremely rare; only one case has been reported in the literature to date, and no case of a vagal schwannoma extending from the skull base to the mediastinum has been reported. Here, we describe the asymptomatic presentation of an unusual-appearing giant cervical vagal schwannoma with an extension from the skull base to the mediastinum.

  18. Autofluorescence based diagnostic techniques for oral cancer

    OpenAIRE

    Balasubramaniam, A. Murali; Sriraman, Rajkumari; Sindhuja, P; Mohideen, Khadijah; Parameswar, R. Arjun; Muhamed Haris, K. T.

    2015-01-01

    Oral cancer is one of the most common cancers worldwide. Despite various advancements in treatment modalities, oral cancer mortality remains high, particularly in developing countries like India. This is mainly due to delay in the diagnosis of oral cancer. Delay in diagnosis greatly reduces the prognosis of treatment and also causes increased morbidity and mortality rates. Early diagnosis plays a key role in the effective management of oral cancer. A rapid diagnostic technique can greatly aid...

  19. Risk-Based Allowed Outage Time and Surveillance Test Interval Extensions for Angra 1

    OpenAIRE

    Orlando Gibelli, Sonia M.; e Melo, P. F. Frutuoso; Bogado Leite, Sérgio Q.

    2012-01-01

    In this work, Probabilistic Safety Assessment (PSA) is used to evaluate Allowed Outage Times (AOT) and Surveillance Test Intervals (STI) extensions for three Angra 1 nuclear power plant safety systems. The interest in such an analysis lies in the fact that PSA comprises a risk-based tool for safety evaluation and has been increasingly applied to support both the regulatory and the operational decision-making processes. Regarding Angra 1, among other applications, PSA is meant to be an additio...

  20. Extensible Markup Language (XML) based analysis and comparison of heterogeneous databases

    OpenAIRE

    Halle, Robert F.

    2001-01-01

    This thesis describes an Extensible Markup Language (XML) based analysis and comparison method that could be used to identify equivalent components of heterogeneous databases. In the Department of Defense there currently exist multiple databases required to support command and control of some portion of the battlefield force. Interoperability between forces will become crucial as the force structure continues to be reduced. This interoperability will be facilitated through the integration of ...

  1. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    OpenAIRE

    Taverner, Tom; Karpievitch, Yuliya V.; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-01-01

    Motivation: The size and complex nature of mass spectrometry-based proteomics datasets motivate development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing func...

  2. Service-Based Extensions to an OAIS Archive for Science Data Management

    Science.gov (United States)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, the architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released their first version of a Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: the purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  3. Model extension and improvement for simulator-based software safety analysis

    International Nuclear Information System (INIS)

    One of the major concerns when employing digital I and C systems in nuclear power plants is that a digital system may introduce new failure modes, which differ from those of previous analog I and C systems. Various techniques are under development to analyze the hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most extensively used techniques. However, these are static analysis methods that cannot capture dynamic behavior or the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in (IEEE Std 7-4.3.2-2003, 2003. IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I and C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. The benchmark against the ABWR SAR shows this modified model is capable of accomplishing dynamic system-level software safety analysis and performs better than the static methods. This improved plant simulation can then further be applied to hazard analysis for operator/digital I and C interface interaction failure studies, and to hardware-in-the-loop fault injection studies

  4. Path Based Mapping Technique for Robots

    Directory of Open Access Journals (Sweden)

    Amiraj Dhawan

    2013-05-01

    Full Text Available The purpose of this paper is to explore a new way of autonomous mapping. Current systems using perception techniques like LASER or SONAR use probabilistic methods and have the drawback of allowing considerable uncertainty in the mapping process. Our approach is to break down the environment, specifically indoor environments, into reachable areas and objects separated by boundaries, and to identify their shapes in order to render various navigable paths around them. This is a novel method that does away with uncertainties, as far as possible, at the cost of temporal efficiency. The system also demands only minimal and cheap hardware, as it relies only on infra-red sensors to do the job.

  5. Speech recognition based on pattern recognition techniques

    Science.gov (United States)

    Rabiner, Lawrence R.

    1990-05-01

    Algorithms for speech recognition can be characterized broadly as pattern recognition approaches and acoustic-phonetic approaches. To date, the greatest degree of success in speech recognition has been obtained using pattern recognition paradigms. Pattern recognition techniques were applied to the problems of isolated word (or discrete utterance) recognition, connected word recognition, and continuous speech recognition. It is shown that understanding (and consequently the resulting recognizer performance) is best for the simplest recognition tasks and is considerably less well developed for large-scale recognition systems.

  6. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    Science.gov (United States)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
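The lexical-approximation step can be sketched as a nearest-neighbor search over the training vocabulary under edit distance (the vocabulary, distance cutoff, and function names below are illustrative assumptions; the paper's actual matcher also exploits inflectional variants):

```python
def edit_distance(a, b):
    # Levenshtein distance with a rolling one-row DP table
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (ca != cb))  # substitution
    return dp[-1]

def approximate_oov(word, vocab, max_dist=2):
    # Replace an OOV word by its closest in-vocabulary spelling variant,
    # falling back to the original word if nothing is close enough
    best = min(vocab, key=lambda v: edit_distance(word, v))
    return best if edit_distance(word, best) <= max_dist else word

vocab = {"translate", "translated", "translation", "system"}
print(approximate_oov("translations", vocab))
```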

  7. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, the cokriging method, an extension of kriging, is proposed to calculate structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.

  8. Using Satellite Based Techniques to Combine Volcanic Ash Detection Methods

    Science.gov (United States)

    Hendrickson, B. T.; Kessinger, C.; Herzegh, P.; Blackburn, G.; Cowie, J.; Williams, E.

    2006-12-01

    Volcanic ash poses a serious threat to aircraft avionics due to the corrosive nature of the silicate particles. Aircraft encounters with ash have resulted in millions of dollars in damage and loss of power to aircraft engines. Accurate detection of volcanic ash for the purpose of avoiding these hazardous areas is of the utmost importance to ensure aviation safety as well as to minimize economic loss. Satellite-based detection of volcanic ash has been used extensively to warn the aviation community of its presence through the use of multi-band detection algorithms. However, these algorithms are generally used individually rather than in combination and require the intervention of a human analyst. Automation of the detection and warning of the presence of volcanic ash for the aviation community is a long term goal of the Federal Aviation Administration Oceanic Weather Product Development Team. We are exploring the use of data fusion techniques within a fuzzy logic framework to perform a weighted combination of several multi-band detection algorithms. Our purpose is to improve the overall performance of volcanic ash detection and to test whether automation is feasible. Our initial focus is on deep, stratospheric eruptions.
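A weighted fuzzy combination of several detection algorithms can be sketched as follows; the individual interest scores and weights below are invented for illustration and are not the team's actual algorithm outputs or tuning:

```python
def fuzzy_combine(scores, weights):
    # Weighted average of per-algorithm interest scores, each already
    # mapped to a fuzzy membership value in [0, 1]
    assert len(scores) == len(weights) and all(0.0 <= s <= 1.0 for s in scores)
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Hypothetical per-algorithm ash-detection scores for one pixel
split_window = 0.9   # brightness-temperature-difference test
three_band = 0.7     # hypothetical three-band test
visible = 0.4        # visible-channel test

conf = fuzzy_combine([split_window, three_band, visible], [0.5, 0.3, 0.2])
print(round(conf, 2))
```

A final alerting step would then compare the combined confidence against a decision threshold.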

  9. Feature-specific imaging: Extensions to adaptive object recognition and active illumination based scene reconstruction

    Science.gov (United States)

    Baheti, Pawan K.

    Computational imaging (CI) systems are hybrid imagers in which the optical and post-processing sub-systems are jointly optimized to maximize task-specific performance. In this dissertation we consider a form of CI system that measures the linear projections (i.e., features) of the scene optically; it is commonly referred to as feature-specific imaging (FSI). Most of the previous work on FSI has been concerned with image reconstruction. Previous FSI techniques have also been non-adaptive and restricted to the use of ambient illumination. We consider two novel extensions of the FSI system in this work. We first present an adaptive feature-specific imaging (AFSI) system and consider its application to a face-recognition task. The proposed system makes use of previous measurements to adapt the projection basis at each step. We present both statistical and information-theoretic adaptation mechanisms for the AFSI system. The sequential hypothesis testing framework is used to determine the number of measurements required for achieving a specified misclassification probability. We demonstrate that the AFSI system requires significantly fewer measurements than static FSI (SFSI) and conventional imaging at low signal-to-noise ratio (SNR). We also show a trade-off, in terms of average detection time, between measurement SNR and adaptation advantage. Experimental results validating the AFSI system are presented. Next we present an FSI system based on the use of structured light. Feature measurements are obtained by projecting spatially structured illumination onto an object and collecting all of the reflected light onto a single photodetector. We refer to this system as feature-specific structured imaging (FSSI). Principal component features are used to define the illumination patterns. The optimal LMMSE operator is used to generate object estimates from the measurements. We demonstrate that this new imaging approach reduces imager complexity and provides improved image
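The sequential hypothesis testing framework mentioned above, which decides when enough measurements have been taken for a target error probability, can be illustrated with a minimal Wald sequential probability ratio test (SPRT) for two Gaussian hypotheses; all parameters below are hypothetical.

```python
import math
import random

# Two hypotheses about the measurement distribution (hypothetical values):
# H0: N(mu0, sigma^2)  vs  H1: N(mu1, sigma^2).
mu0, mu1, sigma = 0.0, 1.0, 1.0
alpha = beta = 0.01                       # target error probabilities
A = math.log((1 - beta) / alpha)          # accept-H1 boundary
B = math.log(beta / (1 - alpha))          # accept-H0 boundary

def sprt(draw):
    """Accumulate log-likelihood ratios until a decision boundary is hit.

    Returns the decision and the number of measurements used, which is
    exactly the quantity the abstract's framework determines.
    """
    llr, n = 0.0, 0
    while B < llr < A:
        x = draw()
        n += 1
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
    return ("H1" if llr >= A else "H0"), n

random.seed(1)
decision, n_meas = sprt(lambda: random.gauss(mu1, sigma))
print(decision, n_meas)
```

At low SNR (small `mu1 - mu0` relative to `sigma`) the test automatically takes more measurements, mirroring the SNR trade-off discussed in the abstract.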

  10. Temporal Extension to Exemplar-based Inpainting Applied to Scratch Correction in Damaged Image Sequences

    Czech Academy of Sciences Publication Activity Database

    Forbin, G.; Besserer, B.; Boldyš, Jiří; Tschumperlé, D.

    Anaheim: ACTA Press, 2005, s. 1-5. ISBN 0-88986-528-0. [Visualization, Imaging, and Image Processing (VIIP 2005). Benidorm (ES), 07.09.2005-09.09.2005] R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : image sequences * digital restoration * exemplar-based inpainting method Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2009/ZOI/boldys-temporal extension to exemplar-based inpainting applied to scratch correction in damaged image sequences.pdf

  11. A Survey Based on Opinion Classification Techniques.

    Directory of Open Access Journals (Sweden)

    Poongodi S

    2013-10-01

    Full Text Available The essential task is to collect information on what people think. Nowadays people share everything on online social networks such as Twitter and Facebook, where they articulate their views and opinions regarding products and services. People can easily access and understand the opinions of others via web resources such as blogs, reviews, and forums. These opinions are individual information representing users' views. A precise way of predicting opinions is to extract sentiments from the web, which can be valuable for marketing research. Opinions are so vital that whenever we need to make a decision, we first want to know others' opinions. Opinion is not only important for a user but is also useful for an organization. This survey covers various methods and techniques used to classify user opinions as positive or negative.
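A minimal example of the kind of positive/negative opinion classifier the survey covers is a Naive Bayes model over word counts; the tiny labelled training set below is hypothetical.

```python
import math
from collections import Counter

# Hypothetical labelled reviews.
train = [
    ("great product works perfectly", "positive"),
    ("love it excellent quality", "positive"),
    ("terrible waste of money", "negative"),
    ("poor quality very disappointed", "negative"),
]

counts = {"positive": Counter(), "negative": Counter()}
docs = Counter()
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())

vocab = {w for c in counts.values() for w in c}

def classify(text):
    """Pick the label maximizing log P(label) + sum log P(word | label)."""
    best, best_score = None, -math.inf
    for label, c in counts.items():
        total = sum(c.values())
        score = math.log(docs[label] / len(train))
        for w in text.split():
            # Laplace smoothing keeps unseen words from zeroing the score.
            score += math.log((c[w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

print(classify("excellent product love it"))   # expected: positive
print(classify("very poor waste"))             # expected: negative
```

Real opinion classifiers add negation handling, n-grams, and much larger corpora, but the scoring structure is the same.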

  12. Comparison of Vibration-Based Damage Assessment Techniques

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Rytter, A.

    1995-01-01

    Three different vibration-based damage assessment techniques have been compared. One of the techniques uses the ratios between changes in experimentally and theoretically estimated natural frequencies, respectively, to locate a damage. The second technique relies on updating of a finite element...
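The first technique, locating damage by matching measured natural-frequency changes against model-predicted change patterns for candidate damage sites, can be sketched as follows; all frequency-shift values and site names are hypothetical.

```python
import numpy as np

# For each candidate damage site, a finite element model predicts a
# pattern of relative natural-frequency changes across the modes.
predicted = {
    "site_A": np.array([0.04, 0.01, 0.03]),
    "site_B": np.array([0.01, 0.05, 0.02]),
}

# Experimentally measured frequency shifts (hypothetical).
measured = np.array([0.02, 0.005, 0.015])

def correlation(a, b):
    """Cosine similarity: 1.0 when the shift patterns are proportional."""
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    return float(a @ b)

scores = {site: correlation(p, measured) for site, p in predicted.items()}
located = max(scores, key=scores.get)
print(scores, located)
```

Only the ratios between modal shifts matter, so the comparison is insensitive to the (unknown) damage severity, which scales all shifts together.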

  13. Work-based learning and role extension: A match made in heaven?

    International Nuclear Information System (INIS)

    This paper presents, and discusses the findings from an exploratory study which examined a cohort of postgraduate therapeutic radiographer students' experiences of undertaking work-based learning to support role extension. The findings showed that three themes emerged which impacted on individual experiences: organisational issues, role and practice issues related to competence development and the individual's background and experience. The conclusions are that new models must emerge, and be evaluated, to offer appropriate support to those individuals who demonstrate the skills and ability to progress to advanced and consultant levels. Departments need to deliberate how they can effectively introduce and support role extension, giving specific consideration to study time, the number of higher level practitioners in training, as well as how to offer effective clinical supervision. Collaboration between higher education institutes and departments should enable the development of tripartite agreements to facilitate effective support for the learners.

  14. An Extensible Dialogue Script for a Robot Based on Unification of State-Transition Models

    Directory of Open Access Journals (Sweden)

    Yosuke Matsusaka

    2010-01-01

    development of the communication function of the robot. Compared to the extension-by-connection method used in previous behavior-based communication robot developments, the extension-by-unification method has the ability to decompose the script into components. The decomposed components can be recomposed to build a new application easily. In this paper, we first explain a reformulation we have applied to the conventional state-transition model. Second, we explain a set of algorithms to decompose and recompose components and to detect conflicts between them. Third, we explain a dialogue engine and a script management server we have developed. The script management server can propose reusable components to the developer in real time by implementing the conflict detection algorithm. The dialogue engine SEAT (Speech Event-Action Translator) has a flexible adapter mechanism that enables quick integration into robotic systems. Through application to three robots, we have confirmed that development efficiency improved by 30%.

  15. Extension Activity Support System (EASY: A Web-Based Prototype for Facilitating Farm Management

    Directory of Open Access Journals (Sweden)

    Christopher Pettit

    2012-01-01

    Full Text Available In response to disparate advances in delivering spatial information to support agricultural extension activities, the Extension Activity Support System (EASY project was established to develop a vision statement and conceptual design for such a system based on a national needs assessment. Personnel from across Australia were consulted and a review of existing farm information/management software undertaken to ensure that any system that is eventually produced from the EASY vision will build on the strengths of existing efforts. This paper reports on the collaborative consultative process undertaken to create the EASY vision as well as the conceptual technical design and business models that could support a fully functional spatially enabled online system.

  16. Key Point Based Data Analysis Technique

    Science.gov (United States)

    Yang, Su; Zhang, Yong

    In this paper, a new framework for data analysis based on the "key points" in data distribution is proposed. Here, the key points contain three types of data points: bridge points, border points, and skeleton points, where our main contribution is the bridge points. For each type of key points, we have developed the corresponding detection algorithm and tested its effectiveness with several synthetic data sets. Meanwhile, we further developed a new hierarchical clustering algorithm SPHC (Skeleton Point based Hierarchical Clustering) to demonstrate the possible applications of the key points acquired. Based on some real-world data sets, we experimentally show that SPHC performs better compared with several classical clustering algorithms including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.

  17. Field of view extension and truncation correction for MR-based human attenuation correction in simultaneous MR/PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Blumhagen, Jan O., E-mail: janole.blumhagen@siemens.com; Ladebeck, Ralf; Fenchel, Matthias [Magnetic Resonance, Siemens AG Healthcare Sector, Erlangen 91052 (Germany); Braun, Harald; Quick, Harald H. [Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nürnberg, Erlangen 91052 (Germany); Faul, David [Siemens Medical Solutions, New York, New York 10015 (United States); Scheffler, Klaus [MRC Department, Max Planck Institute for Biological Cybernetics, Tübingen 72076, Germany and Department of Biomedical Magnetic Resonance, University Hospital Tübingen, Tübingen 72076 (Germany)

    2014-02-15

    Purpose: In quantitative PET imaging, it is critical to accurately measure and compensate for the attenuation of the photons absorbed in the tissue. While in PET/CT the linear attenuation coefficients can be easily determined from a low-dose CT-based transmission scan, in whole-body MR/PET the computation of the linear attenuation coefficients is based on the MR data. However, a constraint of the MR-based attenuation correction (AC) is the MR-inherent field-of-view (FoV) limitation due to static magnetic field (B{sub 0}) inhomogeneities and gradient nonlinearities. Therefore, the MR-based human AC map may be truncated or geometrically distorted toward the edges of the FoV and, consequently, the PET reconstruction with MR-based AC may be biased. This is especially of impact laterally where the patient arms rest beside the body and are not fully considered. Methods: A method is proposed to extend the MR FoV by determining an optimal readout gradient field which locally compensates B{sub 0} inhomogeneities and gradient nonlinearities. This technique was used to reduce truncation in AC maps of 12 patients, and the impact on the PET quantification was analyzed and compared to truncated data without applying the FoV extension and additionally to an established approach of PET-based FoV extension. Results: The truncation artifacts in the MR-based AC maps were successfully reduced in all patients, and the mean body volume was thereby increased by 5.4%. In some cases large patient-dependent changes in SUV of up to 30% were observed in individual lesions when compared to the standard truncated attenuation map. Conclusions: The proposed technique successfully extends the MR FoV in MR-based attenuation correction and shows an improvement of PET quantification in whole-body MR/PET hybrid imaging. In comparison to the PET-based completion of the truncated body contour, the proposed method is also applicable to specialized PET tracers with little uptake in the arms and might

  18. Activity Based Costing versus Traditional Technique

    OpenAIRE

    Dragomirescu Simona Elena; Solomon Daniela Cristina

    2011-01-01

    One of the current methods of management is Activity-Based Costing (ABC), a method that allows the company to understand more clearly how and on which activities/products profit is achieved. In essence, the method involves identifying all activities specific to a product or service and distributing the expenses needed to achieve them with greater accuracy than traditional accounting methods. This yields not only costs that are closer to reality, but also a better understanding of the factors that determi...

  19. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

    We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark differs from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are treated as classes, and a classification algorithm is used to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.

  20. Survey of image-based representations and compression techniques

    OpenAIRE

    Shum, HY; Kang, SB; Chan, SC

    2003-01-01

    In this paper, we survey the techniques for image-based rendering (IBR) and for compressing image-based representations. Unlike traditional three-dimensional (3-D) computer graphics, in which 3-D geometry of the scene is known, IBR techniques render novel views directly from input images. IBR techniques can be classified into three categories according to how much geometric information is used: rendering without geometry, rendering with implicit geometry (i.e., correspondence), and rendering ...

  1. FDI and Accommodation Using NN Based Techniques

    Science.gov (United States)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by a trained backpropagation NN, which is then applied to residual generation. Process supervision is then applied to discriminate faults in process sensors and process plant parameters. A rule-based expert system implements the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system which could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA when applied to a heat exchanger.
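The residual-generation idea, comparing measured outputs against a model of the nominal process, can be sketched with a simple first-order plant standing in for the trained NN; all parameters and data below are hypothetical.

```python
# Hypothetical nominal first-order plant: y[k] = a*y[k-1] + b*u[k].
# In the paper this role is played by a trained backpropagation NN.
def model(u, y_prev, a=0.9, b=0.1):
    return a * y_prev + b * u

def detect_fault(inputs, outputs, threshold=0.05):
    """Flag samples where the residual |y - y_hat| exceeds a threshold."""
    y_hat, flags = outputs[0], []
    for u, y in zip(inputs[1:], outputs[1:]):
        y_hat = model(u, y_hat)          # open-loop model prediction
        flags.append(abs(y - y_hat) > threshold)
    return flags

u = [1.0] * 6
healthy = [0.0, 0.1, 0.19, 0.271, 0.3439, 0.40951]   # matches the model
faulty = healthy[:3] + [0.5, 0.6, 0.7]               # sensor bias from k=3
print(detect_fault(u, healthy))
print(detect_fault(u, faulty))
```

A rule base (as in the abstract's expert system) would then map which residuals fire, and when, to a specific sensor or parameter fault.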

  2. Medical image analysis of 3D CT images based on extensions of Haralick texture features

    Czech Academy of Sciences Publication Activity Database

    Tesař, Ludvík; Shimizu, A.; Smutek, D.; Kobatake, H.; Nawano, S.

    2008-01-01

    Roč. 32, č. 6 (2008), s. 513-520. ISSN 0895-6111 R&D Projects: GA AV ČR 1ET101050403; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : image segmentation * Gaussian mixture model * 3D image analysis Subject RIV: IN - Informatics, Computer Science Impact factor: 1.192, year: 2008 http://library.utia.cas.cz/separaty/2008/AS/tesar-medical image analysis of 3d ct image s based on extensions of haralick texture features.pdf

  3. A novel protein complex identification algorithm based on Connected Affinity Clique Extension (CACE).

    Science.gov (United States)

    Li, Peng; He, Tingting; Hu, Xiaohua; Zhao, Junmin; Shen, Xianjun; Zhang, Ming; Wang, Yan

    2014-06-01

    A novel algorithm based on Connected Affinity Clique Extension (CACE) for mining overlapping functional modules in protein interaction networks is proposed in this paper. In this approach, the value of protein connected affinity, which is inferred from protein complexes, is interpreted as the reliability and possibility of interaction. The protein interaction network is constructed as a weighted graph, and the weight depends on the connected affinity coefficient. Experimental results on two test data sets show that CACE can detect functional modules much more effectively and accurately than the state-of-the-art algorithms CPM and IPC-MCE. PMID:24803142

  4. Risk evaluation of bogie system based on extension theory and entropy weight method.

    Science.gov (United States)

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge, and in practice there is overreliance on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159
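The entropy weight method mentioned above assigns larger weights to indicators that vary more across the evaluated samples (and therefore carry more discriminating information). A minimal sketch with a hypothetical decision matrix:

```python
import numpy as np

# Rows are evaluated samples, columns are risk indicators (hypothetical).
# The third indicator is constant, so it should receive ~zero weight.
X = np.array([[0.9, 0.2, 0.5],
              [0.4, 0.8, 0.5],
              [0.6, 0.5, 0.5],
              [0.1, 0.9, 0.5]])

P = X / X.sum(axis=0)                      # normalise each indicator column
eps = 1e-12                                # avoid log(0)
E = -(P * np.log(P + eps)).sum(axis=0) / np.log(len(X))  # entropy per column
weights = (1 - E) / (1 - E).sum()          # low entropy -> high weight
print(weights)
```

The resulting weights would then multiply the per-indicator scores (here, the extension-theory dependence degrees) to produce an overall risk grade.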

  5. Two-Stage Fault Diagnosis Method Based on the Extension Theory for PV Power Systems

    OpenAIRE

    Meng-Hui Wang; Mu-Jia Chen

    2012-01-01

    In order to shorten maintenance time and keep a photovoltaic (PV) power generation system operating steadily, a fault diagnosis system for PV power generation systems is proposed in this paper. First, PSIM software is used to simulate a 2.2 kW PV system and obtain operating data for the PV system under different sunlight intensity and temperature conditions. In this paper, a two-stage diagnosis system based on the extension theory for PV power systems is prop...

  6. Countermeasures to Push on Reform of the Grass-roots Agricultural Technique Extension System

    Institute of Scientific and Technical Information of China (English)

    杨忠娜; 陈曦; 张淑云; 陶佩君

    2009-01-01

    The grass-roots agricultural technique extension system, based at the county and township levels, is an organization that provides farmers with research achievements and technical services in crop farming, animal husbandry, fishery, forestry, agricultural machinery, water conservancy and other areas, and an important carrier for implementing the strategy of agricultural revitalization through science and education. Based on an analysis of the current status of the grass-roots agricultural extension system in China and the bottleneck problems in the construction of grass-roots agricultural technical extension organizations, the paper puts forward countermeasures of strengthening horizontal connection and vertical improvement to push on the reform of the grass-roots agricultural technique extension system.

  7. Personnel neutron monitoring based on albedo technique

    International Nuclear Information System (INIS)

    This work deals with the study, design and testing of a personal neutron monitor based on the detection of albedo neutrons from the body and their relation to the incident flux. By this method, neutrons with energies below about 100 keV can be efficiently detected, providing good information in the region where the biological effectiveness of neutron radiation starts to rise. The system consists of a pair of thermoluminescent detectors (6 LiF - 7 LiF) inside a polyethylene moderating body, in order to increase the sensitivity. The surface of the dosimeter facing away from the body is covered by a layer of borated resin to assure appropriate shielding against incident low-energy neutrons. The response of the dosimeter to monoenergetic neutrons from a 3 MeV Van de Graaff accelerator, to Am-Be neutrons and to neutrons from a thermal column was investigated. The directional sensitivity, the effect of beam divergence, as well as the effect of changes in dosimeter-to-body distance were also studied. (author)

  8. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit the required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes up a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of the decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all overhead incurred). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively.
In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures
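The core Huffman step, assigning short codewords to frequent instruction patterns so that the encoded stream shrinks, can be sketched as follows; the pattern set and frequencies are hypothetical.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a prefix code; returns {symbol: bitstring}."""
    # Heap entries carry (frequency, tie-breaker, partial code map).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)      # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

# Hypothetical instruction patterns and their occurrence counts.
patterns = Counter({"0x00FF": 40, "0x1A2B": 30, "0xFFFF": 20, "0x0001": 10})
code = huffman_code(patterns)
bits = sum(patterns[s] * len(code[s]) for s in patterns)
print(code, bits)   # frequent patterns get shorter codewords
```

The paper's contribution sits around this step: splitting instructions into patterns before coding, and re-encoding unused bits, both shrink the decoding table that the hardware decompressor must store.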

  9. Fault Diagnosis in Industrial Systems Based on Blind Source Separation Techniques Using One Single Vibration Sensor

    OpenAIRE

    Nguyen, V. H.; C. Rutten; J.-C. Golinval

    2012-01-01

    In the field of structural health monitoring or machine condition monitoring, most vibration-based methods reported in the literature require measuring responses at several locations on the structure. In machine condition monitoring, the number of available vibration sensors is often small, and it is not unusual that only one single sensor is used to monitor a machine. The aim of this paper is to propose an extension of fault detection techniques that may be used when a reduced set of sensors...

  10. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with a brief review of other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are described in brief. Simple case s...

  11. Non-Destructive Techniques Based on Eddy Current Testing

    Directory of Open Access Journals (Sweden)

    Ernesto Vázquez-Sánchez

    2011-02-01

    Full Text Available Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future.

  12. Life extension techniques for aircraft structures-Extending durability and promoting damage tolerance through bonded crack retarders

    OpenAIRE

    Irving, Phil E.; Zhang, Xiang; Doucet, J; Figueroa-Gordon, Douglas J.; Boscolo, M.; Heinimann, M.; Shepherd, G.; Fitzpatrick, M. E.; D. Liljedahl

    2011-01-01

    This paper explores the viability of the bonded crack retarder concept as a device for life extension of damage tolerant aircraft structures. Fatigue crack growth behaviour in metallic substrates with bonded straps has been determined. SENT and M(T) test coupons and large scale skin-stringer panels were tested at constant and variable amplitude loads. The strap materials were glass fibre polymer composites, GLARE, AA7085 and Ti-6Al-4V. Comprehensive measurements were made of...

  13. Array-based techniques for fingerprinting medicinal herbs

    Directory of Open Access Journals (Sweden)

    Xue Charlie

    2011-05-01

    Full Text Available Abstract Poor quality control of medicinal herbs has led to instances of toxicity, poisoning and even deaths. The fundamental step in quality control of herbal medicine is accurate identification of herbs. Array-based techniques have recently been adapted to authenticate or identify herbal plants. This article reviews the current array-based techniques, e.g. oligonucleotide microarrays, gene-based probe microarrays, Suppression Subtractive Hybridization (SSH-based arrays, Diversity Array Technology (DArT and Subtracted Diversity Array (SDA. We further compare these techniques according to important parameters such as markers, polymorphism rates, restriction enzymes and sample type. The applicability of the array-based methods for fingerprinting depends on the availability of genomic and genetic information for the species to be fingerprinted. For species with little genome sequence information but high polymorphism rates, SDA techniques are particularly recommended because they require less labour and lower material cost.

  14. Development of web based offsite consequence analysis program and periodic risk assessments for ILRT extension

    International Nuclear Information System (INIS)

    PSA is generally used to assess the relative risks posed by various types of operations and facilities, to understand the relative importance of the risk contributors, and to obtain insights on potential safety improvements. There is usually no pass/fail criterion that must be met to allow operations to continue. A major goal of PSA is to gain insights that can be used either to minimize the probability of accidents or to minimize the consequences of accidents that might occur. Following NUREG 1493, which applied the PSA concept to ILRT interval extension, the ILRT interval was extended from three tests in ten years to one in ten years in almost all US nuclear power plants. In addition, NUREG 1493 stated that there is an imperceptible increase in risk associated with ILRT intervals of up to twenty years. Since then, many licensees began to submit requests for one-time ILRT interval extensions to 15 years. To permit permanent 15-year ILRT intervals under the existing NRC guidance, it was necessary to develop a standard method for supporting the risk impact assessment. Thus, NEI carried out a project to develop a generic methodology for the risk impact assessment of ILRT interval extensions to 15 years using current performance data and risk-informed guidance. The risk impact assessment is generally performed with the MACCS II code, which was included in an international collaborative effort to compare predictions obtained from seven consequence codes: ARANO (Finland), CONDOR (UK), COSYMA (EU), LENA (Sweden), MACCS (United States), MECA2 (Spain), and OSCAAR (Japan). However, using the MACCS II code requires considerable manpower and effort, especially for collecting raw data for the input files and converting the raw data format. Accordingly, the web-based OSCAP, based on the MACCS II code, was developed to focus on an automatic processing algorithm for handling the main input files with meteorological data, population distribution data and source term data.
It is considered that the web-based

  15. Classification of acute pancreatitis based on retroperitoneal extension: Application of the concept of interfascial planes

    Energy Technology Data Exchange (ETDEWEB)

    Ishikawa, Kazuo [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: ishikawa@sccmc.izumisano.osaka.jp; Idoguchi, Koji [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: idoguchi@sccmc.izumisano.osaka.jp; Tanaka, Hiroshi [Department of Traumatology and Acute Critical Care Medicine, Osaka University Hospital, 2-15 Yamada-Oka, Suita-shi, Osaka 565-0871 (Japan)]. E-mail: tanaka@hp-emerg.med.osaka-u.ac.jp; Tohma, Yoshiki [Osaka Prefectural Nakakawachi Medical Center of Acute Medicine, 3-4-13 Nishi-Iwata, Higashiosaka-shi, Osaka 578-0947 (Japan)]. E-mail: tohma@nmcam.jp; Ukai, Isao [Department of Traumatology and Acute Critical Care Medicine, Osaka University Hospital, 2-15 Yamada-Oka, Suita-shi, Osaka 565-0871 (Japan)]. E-mail: isaoukai@nifty.com; Watanabe, Hiroaki [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: hiwatana@sccmc.izumisano.osaka.jp; Matsuoka, Tetsuya [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: matsuoka@sccmc.izumisano.osaka.jp; Yokota, Jyunichiro [Osaka Prefectural Senshu Critical Care Medical Center, 2-24 Rinku-Ourai-Kita, Izumisano-shi, Osaka 598-0048 (Japan)]. E-mail: jyokota@sccmc.izumisano.osaka.jp; Sugimoto, Tsuyoshi [Ryokufukai Hospital, 1-16-13 Setoguchi, Hirano-ku, Osaka-shi, Osaka 547-0034 (Japan)]. E-mail: ts-sugi@ryokufukai.or.jp

    2006-12-15

    Objective: This study aimed to provide a classification system for acute pancreatitis by applying the principle that the disease spreads along the retroperitoneal interfascial planes. Materials and methods: Medical records and computed tomography (CT) images of 58 patients with acute pancreatitis treated between 2000 and 2005 were reviewed. The retroperitoneum was subdivided into 10 components according to the concept of interfascial planes. Severity of acute pancreatitis was graded according to retroperitoneal extension into these components. Clinical courses and outcomes were compared with the grades. The prognostic value of our classification system was compared with that of Balthazar's CT severity index (CTSI). Results: Retroperitoneal extension of acute fluid collection was classified into five grades: Grade I, fluid confined to the anterior pararenal space or retromesenteric plane (8 patients); Grade II, fluid spreading into the lateroconal or retrorenal plane (16 patients); Grade III, fluid spreading into the combined interfascial plane (8 patients); Grade IV, fluid spreading into the subfascial plane beyond the interfascial planes (15 patients); and Grade V, fluid intruding into the posterior pararenal space (11 patients). Morbidity and mortality were 92.3% and 38.5% in the 26 patients with Grade IV or V disease, and 21.9% and 0% in the 32 patients with Grade I, II, or III disease. Morbidity and mortality were 86.7% and 33.3% in patients with disease classified 'severe' according to the CTSI, and 37.5% and 9.4% in patients with disease classified 'mild' or 'moderate'. Conclusion: Classification of acute pancreatitis based on CT-determined retroperitoneal extension is a useful indicator of the disease severity and prognosis without the need for contrast-medium enhanced CT.

  16. Classification of acute pancreatitis based on retroperitoneal extension: Application of the concept of interfascial planes

    International Nuclear Information System (INIS)


  17. An extension theory-based maximum power tracker using a particle swarm optimization algorithm

    International Nuclear Information System (INIS)

    Highlights: • We propose an adaptive maximum power point tracking (MPPT) approach for PV systems. • Transient and steady-state performance in the tracking process is improved. • The proposed MPPT can automatically tune the tracking step size along a P–V curve. • A PSO algorithm is used to determine the weighting values of extension theory. - Abstract: The aim of this work is to present an adaptive maximum power point tracking (MPPT) approach for photovoltaic (PV) power generation systems. Integrating extension theory with the conventional perturb and observe method, a maximum power point (MPP) tracker is made able to automatically tune the tracking step size by way of category recognition along a P–V characteristic curve. Accordingly, the transient and steady-state performances in the tracking process are improved. Furthermore, an optimization approach based on a particle swarm optimization (PSO) algorithm is proposed to reduce the complexity of determining the weighting values. At the end of this work, the simulated improvement in tracking performance is experimentally validated by an MPP tracker with a programmable system-on-chip (PSoC) based controller.
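
The PSO step in this record can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the toy objective function, swarm size, and coefficient values below are illustrative assumptions; in the paper the objective would score extension-theory weighting values against tracking performance.

```python
import random

def pso_minimize(f, dim, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer; returns (best_position, best_value)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]                          # inertia
                             + c1 * r1 * (pbest[i][d] - pos[i][d])  # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))    # social pull
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in objective: squared distance from a known optimum at (0.5, 0.5, 0.5).
best, val = pso_minimize(lambda x: sum((xi - 0.5) ** 2 for xi in x),
                         dim=3, bounds=(0.0, 1.0))
```

The three velocity terms (inertia, cognitive, social) are the standard PSO update; the clamping keeps particles inside the search bounds.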

  18. Block Copolymer-Based Supramolecular Elastomers with High Extensibility and Large Stress Generation Capability

    Science.gov (United States)

    Noro, Atsushi; Hayashi, Mikihiro

    We prepared block copolymer-based supramolecular elastomers with high extensibility and large stress generation capability. Reversible addition-fragmentation chain transfer polymerizations were conducted under normal pressure and high pressure to synthesize several large molecular weight polystyrene-b-[poly(butyl acrylate)-co-polyacrylamide]-b-polystyrene (S-Ba-S) block copolymers. Tensile tests revealed that the largest S-Ba-S, with a middle-block molecular weight of 3140k, achieved a breaking elongation of over 2000% with a maximum tensile stress of 3.6 MPa and a toughness of 28 MJ/m3, while the reference sample without any middle-block hydrogen bonds, polystyrene-b-poly(butyl acrylate)-b-polystyrene of almost the same molecular weight, was merely viscous and not self-standing. Hence, incorporation of hydrogen bonds into a long soft middle block was found to be beneficial for attaining high extensibility and large stress generation capability, probably due to the concerted combination of entropic changes and internal potential energy changes originating from the dissociation of multiple hydrogen bonds by elongation. This work was supported by JSPS KAKENHI Grant Numbers 13J02357, 24685035, 15K13785, and 23655213 for M.H. and A.N. A.N. also expresses his gratitude for the Tanaka Rubber Science & Technology Award by the Enokagaku-Shinko Foundation, Japan.

  19. Fusion Based Neutron Sources for Security Applications: Neutron Techniques

    OpenAIRE

    Albright, S.; Seviour, Rebecca

    2014-01-01

    The current reliance on X-rays and intelligence for national security is insufficient to combat the current risks of smuggling and terrorism seen on an international level. There are a range of neutron-based security techniques which have the potential to dramatically improve national security. Neutron techniques can be broadly grouped into neutron in/neutron out and neutron in/photon out techniques. The use of accelerator-based fusion devices will potentially enable the widespread applic...

  20. Base Oils Biodegradability Prediction with Data Mining Techniques

    OpenAIRE

    Malika Trabelsi; Saloua Saidane; Sihem Ben Abdelmelek

    2010-01-01

    In this paper, we apply various data mining techniques, including continuous numeric and discrete classification prediction models of base oils biodegradability, with emphasis on improving prediction accuracy. The results show that highly biodegradable oils can be better predicted through numeric models. In contrast, classification models did not uncover a similar dichotomy. With the exception of Memory Based Reasoning and Decision Trees, the tested classification techniques achieved high classifi...

  1. The detection of bulk explosives using nuclear-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  2. Application of glyph-based techniques for multivariate engineering visualization

    Science.gov (United States)

    Glazar, Vladimir; Marunic, Gordana; Percic, Marko; Butkovic, Zlatko

    2016-01-01

    This article presents a review of glyph-based techniques for engineering visualization as well as practical application for the multivariate visualization process. Two glyph techniques, Chernoff faces and star glyphs, uncommonly used in engineering practice, are described, applied to the selected data set, run through the chosen optimization methods and user evaluated. As an example of how these techniques function, a set of data for the optimization of a heat exchanger with a microchannel coil is adopted for visualization. The results acquired by the chosen visualization techniques are related to the results of optimization carried out by the response surface method and compared with the results of user evaluation. Based on the data set from engineering research and practice, the advantages and disadvantages of these techniques for engineering visualization are identified and discussed.
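
As a rough illustration of how a star glyph encodes a multivariate record, the sketch below computes glyph vertex coordinates: one ray per variable, equally spaced in angle, with ray length proportional to the min-max normalized value. The function name and normalization scheme are illustrative assumptions, not taken from the article.

```python
import math

def star_glyph_vertices(values, vmin, vmax, radius=1.0):
    """One ray per variable, equally spaced in angle; ray length is the
    min-max normalized value scaled by the glyph radius."""
    n = len(values)
    pts = []
    for i, v in enumerate(values):
        t = (v - vmin) / (vmax - vmin) if vmax > vmin else 0.0
        a = 2 * math.pi * i / n  # angle of the i-th ray
        pts.append((radius * t * math.cos(a), radius * t * math.sin(a)))
    return pts

# Four variables: full, half, zero, half -> a lopsided four-pointed star.
pts = star_glyph_vertices([1.0, 0.5, 0.0, 0.5], vmin=0.0, vmax=1.0)
```

Connecting the returned vertices in order (and closing the polygon) draws the glyph; each record in a data set gets its own small star, which is what makes side-by-side comparison possible.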

  3. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  4. Structural Fatigue Reliability Based on Extension of Random Loads into Interval Variables

    Directory of Open Access Journals (Sweden)

    Qiangfeng Wang

    2013-01-01

    Full Text Available For a structure under random loads, the structural fatigue life cannot be calculated directly from S-N curves and the linear Miner cumulative damage rule. Owing to the uncertainty of the loads, and to the deviation between data measured in projects and real data, the calculated structural reliability index is inaccurate. A research method for structural fatigue reliability based on the extension of random loads into interval variables is therefore proposed. The innovation is that the interval of the structural fatigue life and the reliability index of a structure can be accurately calculated from the probability density function of the stress level of the random loads and the coefficient of variation of the measured loads. A practical calculation example proves that this method is more suitable for practical engineering than traditional methods. It will provide a solid research approach for reliability analysis of structures under random loads.

  5. Typing of 49 autosomal SNPs by single base extension and capillary electrophoresis for forensic genetic testing

    DEFF Research Database (Denmark)

    Børsting, Claus; Tomas Mas, Carmen; Morling, Niels

    2012-01-01

    We describe a method for simultaneous amplification of 49 autosomal single nucleotide polymorphisms (SNPs) by multiplex PCR and detection of the SNP alleles by single base extension (SBE) and capillary electrophoresis. All the SNPs may be amplified from only 100 pg of genomic DNA, and the length of the amplicons ranges from 65 to 115 bp. The high sensitivity and the short amplicon sizes make the assay very suitable for typing of degraded DNA samples, and the low mutation rate of SNPs makes the assay very useful for relationship testing. Combined, these advantages make the assay well suited for disaster victim identifications, where the DNA from the victims may be highly degraded and the victims are identified via investigation of their relatives. The assay was validated according to the ISO 17025 standard and used for routine case work in our laboratory.
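
SBE genotyping ultimately reduces to calling an allele from the relative signal of the two possible extended bases at each SNP. The sketch below shows one simple way such a call could be made from two signal intensities; the thresholds (allele-balance band, minimum total signal) are invented for illustration and are not from the validated assay described above.

```python
def call_sbe_genotype(signal_a, signal_b, allele_a, allele_b,
                      het_band=(0.3, 0.7), min_total=50):
    """Call a SNP genotype from the two SBE signal intensities.
    The allele balance signal_a / (signal_a + signal_b) decides between
    homozygous A, heterozygous, and homozygous B; weak signals get no call."""
    total = signal_a + signal_b
    if total < min_total:
        return None  # too little signal to call safely
    balance = signal_a / total
    lo, hi = het_band
    if balance > hi:
        return allele_a + allele_a
    if balance < lo:
        return allele_b + allele_b
    return allele_a + allele_b

print(call_sbe_genotype(900, 40, "C", "T"))   # → CC
print(call_sbe_genotype(480, 520, "C", "T"))  # → CT
```

A real caller would also model dye-specific signal strengths and flag ambiguous balances near the band edges for manual review.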

  6. Activities of colistin- and minocycline-based combinations against extensive drug resistant Acinetobacter baumannii isolates from intensive care unit patients

    OpenAIRE

    Li Jian; Zhu De-mei; Huang Jun; Liu Xiao-fang; Liang Wang; Zhang Jing

    2011-01-01

    Abstract Background Extensive drug resistance of Acinetobacter baumannii is a serious problem in the clinical setting. It is therefore important to find active antibiotic combinations that could be effective in the treatment of infections caused by this problematic 'superbug'. In this study, we analyzed the in vitro activities of three colistin-based combinations and a minocycline-based combination against clinically isolated extensive drug resistant Acinetobacter baumannii (XDR-AB) strains. ...

  7. An Extension on Logic of Semantic Web Based on OIL and RDFS

    Institute of Scientific and Technical Information of China (English)

    姚绍文; 宗勇; 刘爱莲; 周明天

    2002-01-01

    RDF(S) is the Web metadata standard that aims to turn the current Web into the foundation of a machine-understandable knowledge system. By employing the full-blown techniques of knowledge engineering (KE), the Web-NG is targeted to provide semantic interoperability for data and knowledge exchange. As a hot spot in KE, ontology can be well integrated with the Web, and such integration will support knowledge modeling and description in Web-based applications. This paper introduces RDF(S), Web-NG and the Web-based ontology modeling language OIL. Based on this introduction, the paper defines an extension of OIL/RDFS to propositional logic and presents an approach for modeling propositional formulas and inference rules.

  8. Network Lifetime Extension Based On Network Coding Technique In Underwater Acoustic Sensor Networks

    OpenAIRE

    Padmavathy.T.V; T.V, Gayathri.V; Indumathi V; Karthika.G

    2012-01-01

    Underwater acoustic sensor networks (UWASNs) are attracting a lot of interest for ocean applications, such as ocean pollution monitoring, ocean animal surveillance, oceanographic data collection, assisted navigation, and offshore exploration. A UWASN is composed of underwater sensors that employ sound to transmit information collected in the ocean. The reason to utilize sound is that radio frequency (RF) signals used by terrestrial sensor networks (TWSNs) can merely transmit a few meters in the wa...

  9. Community-based Ontology Development, Annotation and Discussion with MediaWiki extension Ontokiwi and Ontokiwi-based Ontobedia

    Science.gov (United States)

    Ong, Edison; He, Yongqun

    2016-01-01

    Hundreds of biological and biomedical ontologies have been developed to support data standardization, integration and analysis. Although ontologies are typically developed for community usage, community efforts in ontology development are limited. To support ontology visualization, distribution, and community-based annotation and development, we have developed Ontokiwi, an ontology extension to the MediaWiki software. Ontokiwi displays hierarchical classes and ontological axioms. Ontology classes and axioms can be edited and added using the Ontokiwi form or the MediaWiki source editor. Ontokiwi also inherits MediaWiki features such as Wikitext editing and version control. Based on the Ontokiwi/MediaWiki software package, we have developed Ontobedia, which aims to support community-based development and annotation of biological and biomedical ontologies. As demonstrations, we have loaded the Ontology of Adverse Events (OAE) and the Cell Line Ontology (CLO) into Ontobedia. Our studies showed that Ontobedia was able to achieve the expected Ontokiwi features.

  10. Protein-protein interactions prediction based on iterative clique extension with gene ontology filtering.

    Science.gov (United States)

    Yang, Lei; Tang, Xianglong

    2014-01-01

    Cliques (maximal complete subnets) in a protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPIs compensate for data deficiencies in biological experiments. However, clique-based prediction methods depend only on the topology of the network, and the false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based prediction and gene ontology (GO) annotations to overcome this shortcoming and improve the accuracy of predictions. According to different GO correcting rules, we generate two predicted interaction sets which guarantee the quality and quantity of predicted protein interactions. The proposed method is applied to the PPI network from the Database of Interacting Proteins (DIP), and most of the predicted interactions are verified by another biological database, BioGRID. The predicted protein interactions are appended to the original protein network, which leads to clique extension and shows the significance of biological meaning. PMID:24578640
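
A minimal sketch of the idea, combining clique-style topological prediction with a GO-overlap filter. The clique rule, the Jaccard cutoff, and the toy data below are illustrative assumptions, not the paper's exact correcting rules: a non-adjacent pair is predicted only if its common neighbours already form a clique (so the new edge extends it) and the two proteins share enough GO terms.

```python
from itertools import combinations

def predict_interactions(adj, go, min_common=2, min_go_overlap=0.3):
    """Predict missing PPI edges: a non-adjacent pair is kept when its common
    neighbours form a clique (so the new edge extends it) and the two
    proteins' GO annotation sets overlap enough (Jaccard index)."""
    preds = []
    for u, v in combinations(sorted(adj), 2):
        if v in adj[u]:
            continue                      # interaction already known
        common = adj[u] & adj[v]
        if len(common) < min_common:
            continue
        if any(b not in adj[a] for a, b in combinations(common, 2)):
            continue                      # shared neighbourhood is not a clique
        terms_u, terms_v = go.get(u, set()), go.get(v, set())
        union = terms_u | terms_v
        if union and len(terms_u & terms_v) / len(union) >= min_go_overlap:
            preds.append((u, v))          # passes the GO filter
    return preds

# Toy network: A, B, C form a triangle; D interacts with B and C but not A.
adj = {"A": {"B", "C"}, "B": {"A", "C", "D"},
       "C": {"A", "B", "D"}, "D": {"B", "C"}}
go = {"A": {"GO:1", "GO:2"}, "D": {"GO:1", "GO:2"},
      "B": {"GO:3"}, "C": {"GO:3"}}
print(predict_interactions(adj, go))  # → [('A', 'D')]
```

In the toy data the missing A-D edge would complete the 4-clique {A, B, C, D}, and A and D share both GO terms, so it is the only prediction.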

  11. Protein-Protein Interactions Prediction Based on Iterative Clique Extension with Gene Ontology Filtering

    Directory of Open Access Journals (Sweden)

    Lei Yang

    2014-01-01

    Full Text Available Cliques (maximal complete subnets) in a protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPIs compensate for data deficiencies in biological experiments. However, clique-based prediction methods depend only on the topology of the network, and the false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based prediction and gene ontology (GO) annotations to overcome this shortcoming and improve the accuracy of predictions. According to different GO correcting rules, we generate two predicted interaction sets which guarantee the quality and quantity of predicted protein interactions. The proposed method is applied to the PPI network from the Database of Interacting Proteins (DIP), and most of the predicted interactions are verified by another biological database, BioGRID. The predicted protein interactions are appended to the original protein network, which leads to clique extension and shows the significance of biological meaning.

  12. Data Mining and Neural Network Techniques in Case Based System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper first puts forward a case-based system framework based on data mining techniques. Then the paper examines the possibility of using neural networks as a method of retrieval in such a case-based system. In this system we propose data mining algorithms to discover case knowledge, together with other algorithms.

  13. An Agent Communication Framework Based on XML and SOAP Technique

    Institute of Scientific and Technical Information of China (English)

    李晓瑜

    2009-01-01

    This thesis introduces XML technology and SOAP technology, presents an agent communication framework based on XML and SOAP techniques, and analyzes its principle, architecture, function and benefits. At the end, it is based on KQML communication primitive languages.

  14. Power system stabilizers based on modern control techniques

    Energy Technology Data Exchange (ETDEWEB)

    Malik, O.P.; Chen, G.P.; Zhang, Y.; El-Metwally, K. [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering

    1994-12-31

    Developments in digital technology have made it feasible to develop and implement improved controllers based on sophisticated control techniques. Power system stabilizers based on adaptive control, fuzzy logic and artificial neural networks are being developed. Each of these control techniques possesses unique features and strengths. In this paper, the relative performance of power system stabilizers based on adaptive control, fuzzy logic and neural networks, both in simulation studies and in real-time tests on a physical model of a power system, is presented and compared to that of a fixed-parameter conventional power system stabilizer. (author) 16 refs., 45 figs., 3 tabs.

  15. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    Directory of Open Access Journals (Sweden)

    Aitman T

    2008-11-01

    Full Text Available Abstract Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data

  16. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  17. Extension of car-to-X-communication by radiolocation techniques; Erweiterung der Fahrzeug-zu-Fahrzeug-Kommunikation mit Funkortungstechniken

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, Daniel [BMW Group, Muenchen (Germany). Bereich Konzepte ' ' Aktive und Integrale Sicherheit' '

    2012-10-15

    A transponder, carried by vulnerable road users, e.g. pedestrians or bicyclists, and integrated in a smartphone, enables localisation from the vehicle side. A connected driver assistant system can detect movements, predict a possible collision and take preventive actions like informing the driver, braking or steering. In the research project Ko-TAG (Cooperative Transponder) within the research initiative Ko-FAS (Cooperative Vehicle Safety) new sensor technologies are developed. BMW, as Ko-TAG project manager, explains the cooperative sensor technology and the automatic control technique of the preventive safety system. (orig.)

  18. A Hough Transform based Technique for Text Segmentation

    CERN Document Server

    Saha, Satadal; Nasipuri, Mita; Basu, Dipak Kr

    2010-01-01

    Text segmentation is an inherent part of an OCR system irrespective of its domain of application. The OCR system contains a segmentation module where the text lines, words and ultimately the characters must be segmented properly for successful recognition. The present work implements a Hough transform based technique for line and word segmentation from digitized images. The proposed technique is applied not only to the document image dataset but also to datasets for a business card reader system and a license plate recognition system. For standardization of the performance of the system, the technique is also applied to the public domain dataset published on the website of CMATER, Jadavpur University. The document images consist of multi-script printed and handwritten text lines with variety in script and line spacing in a single document image. The technique performs quite satisfactorily when applied to mobile camera captured business card images with low resolution. The usefulness of the technique is verifie...
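
The core of a Hough transform based segmentation is voting in (theta, rho) space: every foreground pixel votes for all lines passing through it, and the strongest accumulator cell gives the dominant line, e.g. a text baseline. The sketch below is a simplified stand-alone illustration; the bin sizes and the toy point set are assumptions, not the paper's parameters.

```python
import math

def hough_peak(points, n_theta=180, rho_res=1.0):
    """Vote in (theta, rho) space; return the strongest line as
    (theta_degrees, rho, votes)."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            key = (t, round(rho / rho_res))   # quantize into accumulator cells
            acc[key] = acc.get(key, 0) + 1
    (t, r), votes = max(acc.items(), key=lambda kv: kv[1])
    return 180.0 * t / n_theta, r * rho_res, votes

# Foreground pixels lying on the horizontal line y = 5 (a toy "text baseline").
pts = [(x, 5) for x in range(20)]
theta, rho, votes = hough_peak(pts)  # peak near theta = 90 degrees, rho = 5
```

In a full segmentation pipeline the accumulator would be scanned for several peaks (one per text line), and points would then be grouped by their nearest detected line.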

  19. Product and process effectiveness using performance-based auditing techniques

    International Nuclear Information System (INIS)

    Focus is the backbone of genius. Focus is the lifeblood of adequate products and effective processes. Focus is the theme of Performance-Based Audits (PBA). The Civilian Radioactive Waste Management (CRWM) Program is using the PBA tool extensively to focus on the evaluation of product adequacy and process effectiveness. The term Performance-Based Audit has been around for several years. However, the approach presented here for the systematic end-product selection, planning, and measurement of adequacy and effectiveness is new and innovative.

  20. Face Recognition Approach Based on Wavelet - Curvelet Technique

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2012-04-01

    Full Text Available In this paper, a novel face recognition approach based on the wavelet-curvelet technique is proposed. The algorithm is based on the similarities embedded in the images and utilizes the wavelet-curvelet technique to extract facial features. The implemented technique can outperform other mathematical image analysis approaches, which may suffer from a potentially high-dimensional feature space; it therefore aims to reduce the dimensionality, which reduces the required computational power and memory size. The Nearest Mean Classifier (NMC) is then adopted to recognize different faces. In this work, three major experiments were done on two face databases (MAFD & ORL), and a high recognition rate is obtained by the implementation of these techniques.
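
The Nearest Mean Classifier used in the final step is simple enough to sketch in a few lines: store one mean feature vector per person, then assign a new feature vector to the class with the nearest mean. The toy 2-D vectors below stand in for the wavelet-curvelet features; the names and data are illustrative, not from the paper.

```python
def nmc_fit(X, y):
    """Nearest Mean Classifier training: one mean feature vector per class."""
    means = {}
    for label in set(y):
        rows = [x for x, lab in zip(X, y) if lab == label]
        means[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return means

def nmc_predict(means, x):
    """Assign x to the class whose mean is nearest (squared Euclidean)."""
    return min(means, key=lambda lab: sum((a - b) ** 2
                                          for a, b in zip(means[lab], x)))

# Toy 2-D feature vectors standing in for wavelet-curvelet coefficients.
X = [[0.0, 0.1], [0.1, 0.0], [1.0, 1.1], [1.1, 1.0]]
y = ["alice", "alice", "bob", "bob"]
means = nmc_fit(X, y)
print(nmc_predict(means, [0.05, 0.05]))  # → alice
```

Because NMC stores only one vector per class, it is cheap in both memory and prediction time, which is why dimensionality-reduced features pair well with it.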

  1. Intramuscular injection technique: an evidence-based approach.

    Science.gov (United States)

    Ogston-Tuck, Sherri

    2014-09-30

    Intramuscular injections require a thorough and meticulous approach to patient assessment and injection technique. This article, the second in a series of two, reviews the evidence base to inform safer practice and to consider the evidence for nursing practice in this area. A framework for safe practice is included, identifying important points for safe technique, patient care and clinical decision making. It also highlights the ongoing debate in selection of intramuscular injection sites, predominantly the ventrogluteal and dorsogluteal muscles. PMID:25249123

  2. A Hough Transform based Technique for Text Segmentation

    OpenAIRE

    Saha, Satadal; Basu, Subhadip; Nasipuri, Mita; Basu, Dipak Kr.

    2010-01-01

    Text segmentation is an inherent part of an OCR system irrespective of its domain of application. The OCR system contains a segmentation module where the text lines, words and ultimately the characters must be segmented properly for successful recognition. The present work implements a Hough transform based technique for line and word segmentation from digitized images. The proposed technique is applied not only to the document image dataset but also to the dataset for a business card rea...

  3. Combining Speedup techniques based on Landmarks and Containers

    Directory of Open Access Journals (Sweden)

    R. KALPANA

    2010-09-01

    Full Text Available Dijkstra's algorithm [1], which is applied in many real-world problems like mobile routing, road maps, railway networks, etc., is used to find the shortest path between source and destination. There are many techniques available to speed up the algorithm while guaranteeing the optimality of the solution. The main focus of the work is to implement the landmark technique and containers separately and compare the results on random graphs and planar graphs. The combined speedup technique based on landmarks and containers was also experimented with on random graphs and planar graphs to improve the speedup of the shortest path queries.
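
The landmark technique (often called ALT) speeds up shortest-path queries by precomputing distances from a few landmark nodes and using the triangle inequality |d(L, t) - d(L, v)| as an admissible A* heuristic. A minimal sketch follows, with a toy graph and a single hand-picked landmark; the article's graphs and landmark selection strategy are not reproduced here.

```python
import heapq

def dijkstra(adj, src):
    """Plain Dijkstra from src; returns dict of shortest distances."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def alt_shortest_path(adj, src, dst, landmark_dists):
    """A* search with the ALT heuristic: for each landmark L with distance
    table d, |d[dst] - d[v]| is a lower bound on dist(v, dst)."""
    def h(v):
        return max((abs(d[dst] - d[v]) for d in landmark_dists), default=0)
    dist = {src: 0}
    pq = [(h(src), src)]
    while pq:
        f, u = heapq.heappop(pq)
        if u == dst:
            return dist[u]                # goal reached with optimal distance
        for v, w in adj[u]:
            nd = dist[u] + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd + h(v), v))
    return float("inf")

# Small undirected graph given as adjacency lists of (neighbour, weight).
adj = {
    "a": [("b", 1), ("c", 4)],
    "b": [("a", 1), ("c", 2), ("d", 5)],
    "c": [("a", 4), ("b", 2), ("d", 1)],
    "d": [("b", 5), ("c", 1)],
}
landmarks = [dijkstra(adj, "d")]  # precomputed once, reused for many queries
print(alt_shortest_path(adj, "a", "d", landmarks))  # → 4
```

The landmark tables are built once per graph; each query then explores far fewer nodes than plain Dijkstra because the heuristic steers the search toward the target.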

  4. Finding Within Cluster Dense Regions Using Distance Based Technique

    OpenAIRE

    Wesam Ashour; Motaz Murtaja

    2012-01-01

    One of the main categories in data clustering is density-based clustering. Density-based clustering techniques like DBSCAN are attractive because they can find arbitrarily shaped clusters along with noisy outliers. The main weakness of traditional density-based algorithms like DBSCAN is clustering data sets with different density levels: DBSCAN calculations are done according to given parameters applied to all points in a data set, while the densities of the data set clusters may be totally different. ...
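
For reference, the DBSCAN behaviour the abstract criticizes, one global eps and min_pts applied to every point, can be seen in a minimal implementation. The parameter values and the toy data set are illustrative.

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN on 2-D points; returns labels (cluster id, or -1 for noise)."""
    def neighbours(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if (xi - xj) ** 2 + (yi - yj) ** 2 <= eps ** 2]
    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1        # noise (may later be absorbed as a border point)
            continue
        labels[i] = cid           # i is a core point: start a new cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cid   # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            jn = neighbours(j)
            if len(jn) >= min_pts:
                seeds.extend(jn)  # j is also a core point: keep expanding
        cid += 1
    return labels

# Two tight groups plus one far-away outlier.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (50, 50)]
labels = dbscan(pts, eps=1.5, min_pts=2)
```

Note that eps and min_pts are applied uniformly: if the two groups had very different internal spacings, no single eps would separate both cleanly, which is exactly the weakness the abstract addresses.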

  5. Efficiency of Integrated Geophysical techniques in delineating the extension of Bauxites ore in north Riyadh, Saudi Arabia

    Science.gov (United States)

    Almutairi, Yasir; Alanazi, Abdulrahman; Almutairi, Muteb; Alsama, Ali; Alhenaki, Bander; Almalki, Awadh

    2014-05-01

    We exploit the integration of Ground Penetrating Radar (GPR) techniques, magnetic gradiometry, resistivity measurements and seismic tomography for a high-resolution, non-invasive study delineating the subsurface bauxite layer in the Zabira locality, north of Riyadh. Integrated GPR, magnetic gradiometry, resistivity and seismic refraction are used in the case of high-contrast targets and provide an accurate subsurface reconstruction of foundations in sediments. Resistivity pseudo-sections are particularly useful for the areal identification of contacts between soils and foundations, while GPR and magnetic gradiometry provide detailed information about the location and depth of the structures. Results obtained by GPR, magnetics and resistivity show very good agreement in mapping the bauxite layer depth at a range of 5 m to 10 m, while the depth obtained by seismic refraction was 10 m to 15 m due to a lack of velocity information.

  6. Runtime Monitoring Technique to handle Tautology based SQL Injection Attacks

    Directory of Open Access Journals (Sweden)

    Ramya Dharam

    2015-05-01

    Full Text Available Software systems, like web applications, are often used to provide reliable online services such as banking, shopping, social networking, etc., to users. The increasing use of such systems has led to a high need for assuring confidentiality, integrity, and availability of user data. SQL Injection Attacks (SQLIAs) are one of the major security threats to web applications. They allow attackers to gain unauthorized access to the back-end database consisting of confidential user information. In this paper we present and evaluate a Runtime Monitoring Technique to detect and prevent tautology-based SQLIAs in web applications. Our technique monitors the behavior of the application during its post-deployment to identify all the tautology-based SQLIAs. A framework called the Runtime Monitoring Framework, which implements our technique, is used in the development of runtime monitors. The framework uses two pre-deployment testing techniques, basis-path and data-flow, to identify a minimal set of all legal/valid execution paths of the application. Runtime monitors are then developed and integrated to perform runtime monitoring of the application during its post-deployment for the identified valid/legal execution paths. For evaluation we targeted a subject application with a large number of both legitimate inputs and illegitimate tautology-based inputs, and measured the performance of the proposed technique. The results of our study show that the runtime monitor developed for the application was successfully able to detect all the tautology-based attacks without generating any false positives.
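
Tautology-based SQLIAs append an always-true condition such as ' OR '1'='1 so the WHERE clause matches every row. As a contrast to the paper's runtime-monitoring approach, the sketch below shows a much cruder input-side check: a regular expression flagging OR-ed equalities whose two sides are identical literals. The pattern is an illustrative assumption and is easy to evade; it is not the paper's detection mechanism.

```python
import re

# Naive tautology pattern: an OR-ed equality whose two sides are identical
# tokens, as in the classic payloads  ' OR '1'='1  and  admin' OR 1=1 --
TAUTOLOGY = re.compile(r"(?i)\bor\b\s*'?(\w+)'?\s*=\s*'?(\w+)'?")

def is_tautology_attack(user_input):
    """Return True when the input injects an always-true OR condition."""
    return any(lhs.lower() == rhs.lower()
               for lhs, rhs in TAUTOLOGY.findall(user_input))

print(is_tautology_attack("' OR '1'='1"))  # → True
print(is_tautology_attack("O'Brien"))      # → False
```

A pattern check like this inspects only the input string; the paper's technique instead compares the application's observed execution path against the legal paths identified before deployment, which is what lets it catch variants a signature would miss.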

  7. Laser-based direct-write techniques for cell printing

    Science.gov (United States)

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications, including tissue engineering, stem cell research, and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. Most work to date has focused not on applications of the technique but on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing and highlights the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research. Particular attention is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  8. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    Science.gov (United States)

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  9. R2O, an extensible and semantically based database-to-ontology mapping language

    OpenAIRE

    Barrasa Rodríguez, Jesús; Corcho, Oscar; A. GÓMEZ-PÉREZ

    2004-01-01

    We present R2O, an extensible and declarative language for describing mappings between relational DB schemas and ontologies implemented in RDF(S) or OWL. R2O provides an extensible set of primitives with well-defined semantics. The language is conceived to be expressive enough to cope with complex mapping cases arising from situations of low similarity between the ontology and the DB models.

  10. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  11. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework

    Directory of Open Access Journals (Sweden)

    Shengjing Wei

    2016-04-01

    Full Text Available Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of five components. Especially, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study and recognition experiments under different size of training sets were implemented on a target gesture set consisting of 110 frequently-used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third gestures of the target gesture set) suggested by two reference subjects, (82.6 ± 13.2)% and (79.7 ± 13.4)% average recognition accuracy were obtained for 110 words respectively, and the average recognition accuracy climbed up to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user’s training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system.

  12. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.

    Science.gov (United States)

    Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu

    2016-01-01

    Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of five components. Especially, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study and recognition experiments under different size of training sets were implemented on a target gesture set consisting of 110 frequently-used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third gestures of the target gesture set) suggested by two reference subjects, (82.6 ± 13.2)% and (79.7 ± 13.4)% average recognition accuracy were obtained for 110 words respectively, and the average recognition accuracy climbed up to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system. PMID:27104534
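    The code-matching step of the second part can be sketched as follows; the component codes and three-word vocabulary below are invented stand-ins for the paper's 110-word CSL code table.

```python
# Hypothetical code table: each sign word is a 5-tuple of component codes
# (hand shape, axis, orientation, rotation, trajectory). Entries invented.
CODE_TABLE = {
    "hello":  (2, 0, 1, 3, 4),
    "thanks": (2, 1, 1, 0, 2),
    "friend": (5, 0, 3, 3, 1),
}

def classify(predicted):
    """Code matching: pick the word whose stored code agrees with the
    predicted component codes in the most positions."""
    return max(CODE_TABLE, key=lambda w: sum(
        p == c for p, c in zip(predicted, CODE_TABLE[w])))

# Even if one of the five component classifiers errs (trajectory 4 -> 0),
# the remaining components still select the right word.
print(classify((2, 0, 1, 3, 0)))  # hello
```

    Decomposing words into shared components is what makes the vocabulary extensible: a new word only needs a new code table row, not new classifiers.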

  13. PCA Based Rapid and Real Time Face Recognition Technique

    Directory of Open Access Journals (Sweden)

    T R Chandrashekar

    2013-12-01

    Full Text Available Face biometrics, being economical and efficient, is a popular form of biometric system used in various applications. Face recognition has been a topic of research for the last few decades, and several techniques have been proposed to improve the performance of face recognition systems. Accuracy is tested against intensity, distance from the camera, and pose variance. Multiple face recognition is another subtopic currently under research. The speed at which a technique works is a further parameter used to evaluate it. As an example, a support vector machine performs really well for face recognition, but its computational efficiency degrades significantly as the number of classes increases. The Eigenface technique produces quality features for face recognition, but its accuracy has proved comparatively lower than that of many other techniques. With the increasing use of multi-core processors in personal computers and applications demanding fast processing and multiple face detection and recognition (for example, an entry detection system in a shopping mall or an industry), demand for such systems is growing, as there is a worldwide need for automated systems. In this paper we propose a novel face recognition system, developed with C# .Net, that can detect multiple faces and recognize them in parallel by utilizing the system resources and the processor cores. The system is built around Haar Cascade based face detection and PCA based face recognition with C#.Net. A parallel library designed for .Net is used to aid high-speed detection and recognition of faces in real time. Analysis of the performance of the proposed technique against some conventional techniques reveals that the proposed technique is not only accurate, but also fast in comparison to other techniques.
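    The PCA (eigenface) recognition stage can be sketched with NumPy; random vectors stand in for the flattened grayscale face crops that would come out of the Haar-cascade detection stage, and all sizes and labels are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: 20 flattened 8x8 "faces" belonging to 5 hypothetical people.
faces = rng.normal(size=(20, 64))
labels = np.arange(20) % 5

mean = faces.mean(axis=0)
centered = faces - mean
# Principal components via SVD; the top rows of vt are the eigenfaces.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:10]
train_proj = centered @ eigenfaces.T

def recognize(face):
    """Project onto the eigenfaces and return the label of the nearest
    training projection (1-NN in eigenface space)."""
    proj = (face - mean) @ eigenfaces.T
    return labels[np.argmin(np.linalg.norm(train_proj - proj, axis=1))]

# A training face maps back to its own label.
assert recognize(faces[7]) == labels[7]
```

    In a multi-face, multi-core setting as in the paper, each detected face crop could be passed to `recognize` on a separate worker, since the projection step is independent per face.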

  14. Extension of the COSYMA-ECONOMICS module - cost calculations based on different economic sectors

    International Nuclear Information System (INIS)

    The COSYMA program system for evaluating the off-site consequences of accidental releases of radioactive material to the atmosphere includes an ECONOMICS module for assessing economic consequences. The aim of this module is to convert various consequences (radiation-induced health effects and impacts resulting from countermeasures) caused by an accident into the common framework of economic costs; this allows different effects to be expressed in the same terms and thus to make these effects comparable. With respect to the countermeasure 'movement of people', the dominant cost categories are 'loss-of-income costs' and 'costs of lost capital services'. In the original version of the ECONOMICS module these costs are calculated on the basis of the total number of people moved. In order to take into account also regional or local economic peculiarities of a nuclear site, the ECONOMICS module has been extended: Calculation of the above mentioned cost categories is now based on the number of employees in different economic sectors in the affected area. This extension of the COSYMA ECONOMICS module is described in more detail. (orig.)

  15. CDAPubMed: a browser extension to retrieve EHR-based biomedical literature

    Directory of Open Access Journals (Sweden)

    Perez-Rey David

    2012-04-01

    Full Text Available Abstract Background Over the last few decades, the ever-increasing output of scientific publications has led to new challenges to keep up to date with the literature. In the biomedical area, this growth has introduced new requirements for professionals, e.g., physicians, who have to locate the exact papers that they need for their clinical and research work amongst a huge number of publications. Against this backdrop, novel information retrieval methods are even more necessary. While web search engines are widespread in many areas, facilitating access to all kinds of information, additional tools are required to automatically link information retrieved from these engines to specific biomedical applications. In the case of clinical environments, this also means considering aspects such as patient data security and confidentiality or structured contents, e.g., electronic health records (EHRs). In this scenario, we have developed a new tool to facilitate query building to retrieve scientific literature related to EHRs. Results We have developed CDAPubMed, an open-source web browser extension to integrate EHR features in biomedical literature retrieval approaches. Clinical users can use CDAPubMed to: (i) load patient clinical documents, i.e., EHRs based on the Health Level 7 Clinical Document Architecture standard (HL7-CDA), (ii) identify relevant terms for scientific literature search in these documents, i.e., Medical Subject Headings (MeSH), automatically driven by the CDAPubMed configuration, which advanced users can optimize to adapt to each specific situation, and (iii) generate and launch literature search queries to a major search engine, i.e., PubMed, to retrieve citations related to the EHR under examination. Conclusions CDAPubMed is a platform-independent tool designed to facilitate literature searching using keywords contained in specific EHRs. CDAPubMed is visually integrated, as an extension of a widespread web browser, within the standard

  16. GIS-based assessment of groundwater level on extensive karst areas

    Science.gov (United States)

    Kopecskó, Zsanett; Józsa, Edina

    2016-04-01

    Karst topographies represent unique geographical regions containing caves and extensive underground water systems developed especially on soluble rocks such as limestone, marble and gypsum. The significance of these areas is evident considering that 12% of the ice-free continental area consists of landscapes developed on carbonate rocks and 20-25% of the global population depends mostly on groundwater obtained from these systems. Karst water reservoirs already provide 25% of freshwater resources globally. Comprehensive studies of these regions are the key to exploring the potential for exploitation and to analyzing the consequences of contamination, anthropogenic effects and natural processes within these specific hydro-geological settings. For the proposed work we chose several of the largest karst regions over the ice-free part of the continents, representing diverse climatic and topographic characteristics. An important aspect of the study is that there are no in situ hydrologic measurements available over the entire research area that would provide discrete sampling of soil, ground and surface water. As a replacement for detailed surveys, multiple remote sensing datasets (Gravity Recovery and Climate Experiment (GRACE) derived products, Moderate Resolution Imaging Spectroradiometer (MODIS) products and Tropical Rainfall Measuring Mission (TRMM) monthly rainfall datasets) are used along with model reanalysis data (Global Precipitation Climate Center (GPCC) data and the Global Land Data Assimilation System (GLDAS)) to study the variation on extensive karst areas in response to the changing climate and anthropogenic effects. The analyses are carried out within an open source software environment to enable sharing of the proposed algorithm. The GRASS GIS geoinformatic software and the R statistical program proved to be an adequate choice for collecting and analyzing the above-mentioned datasets by taking advantage of their interoperability.

  17. Wavelet transformation based watermarking technique for human electrocardiogram (ECG).

    Science.gov (United States)

    Engin, Mehmet; Cidam, Oğuz; Engin, Erkan Zeki

    2005-12-01

    Nowadays, watermarking has become a technology of choice for a broad range of multimedia copyright protection applications. Watermarks have also been used to embed prespecified data in biomedical signals, so that watermarked biomedical signals transmitted over communication channels are resistant to some attacks. This paper investigates a discrete wavelet transform based watermarking technique for signal integrity verification of electrocardiograms (ECG) drawn from four ECG classes, for cardiovascular disease monitoring applications. The proposed technique is evaluated under different noise conditions for different wavelet functions. The technique based on the Daubechies (db2) wavelet function performs better than that based on the Biorthogonal (bior5.5) wavelet function. For beat-to-beat applications, the performance results for all four ECG classes are moderate. PMID:16235811
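    A minimal sketch of the embedding idea follows, using a hand-rolled single-level orthonormal Haar DWT for brevity (the record's technique uses db2 and bior5.5 wavelets) and non-blind extraction; the signal and payload are synthetic.

```python
import numpy as np

def haar_dwt(x):
    """Single-level orthonormal Haar DWT (stands in for db2 here)."""
    pairs = x.reshape(-1, 2)
    return ((pairs[:, 0] + pairs[:, 1]) / np.sqrt(2),
            (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))

def haar_idwt(ca, cd):
    out = np.empty(ca.size * 2)
    out[0::2] = (ca + cd) / np.sqrt(2)
    out[1::2] = (ca - cd) / np.sqrt(2)
    return out

rng = np.random.default_rng(1)
ecg = rng.normal(size=256)            # synthetic stand-in for an ECG segment
bits = rng.integers(0, 2, size=128)   # watermark payload, one bit per detail coeff

ca, cd = haar_dwt(ecg)
alpha = 0.01                          # embedding strength (imperceptibility trade-off)
marked = haar_idwt(ca, cd + alpha * (2 * bits - 1))

# Non-blind extraction: the change in detail coefficients gives the bits back.
_, cd_marked = haar_dwt(marked)
recovered = (cd_marked - cd > 0).astype(int)
assert np.array_equal(recovered, bits)
```

    Embedding in the detail band keeps the low-frequency morphology of the ECG (the approximation coefficients) untouched, which is why the distortion can be held below clinical significance.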

  18. Extension of DQE to include scatter, grid, magnification, and focal spot blur: a new experimental technique and metric

    Science.gov (United States)

    Ranger, N. T.; Mackenzie, A.; Honey, I. D.; Dobbins, J. T., III; Ravin, C. E.; Samei, E.

    2009-02-01

    In digital radiography, conventional DQE evaluations are performed under idealized conditions that do not reflect typical clinical operating conditions. For this reason, we have developed and evaluated an experimental methodology for measuring the effective detective quantum efficiency (eDQE) of digital radiographic systems and its utility in chest imaging applications. To emulate the attenuation and scatter properties of the human thorax across a range of sizes, the study employed pediatric and adult geometric chest imaging phantoms designed for use in the FDA/CDRH Nationwide Evaluation of X-Ray Trends (NEXT) program and a third phantom configuration designed to represent the bariatric population. The MTF for each phantom configuration was measured using images of an opaque edge device placed at the nominal surface of each phantom and at a common reference point. For each phantom, the NNPS was measured in a uniform region within the phantom image acquired at an exposure level determined from a prior phototimed acquisition. Scatter measurements were made using a beam-stop technique. These quantities were used, along with measures of phantom attenuation and estimates of x-ray flux, to compute the eDQE at the beam-entrance surface of the phantoms, reflecting the presence of scatter, grid, magnification, and focal spot blur. The MTF results showed notable degradation due to focal spot blurring enhanced by geometric magnification, with increasing phantom size. Measured scatter fractions were 33%, 34% and 46% for the pediatric, adult, and bariatric phantoms, respectively. Correspondingly, the measured narrow beam transmission fractions were 16%, 9%, and 3%. The eDQE results for the pediatric and adult phantoms correlate well at low spatial frequencies but show degradation in the eDQE at increasing spatial frequencies for the adult phantom in comparison to the pediatric phantom. The results for the bariatric configuration showed a marked decrease in eDQE in comparison to

  19. "Ayeli": Centering Technique Based on Cherokee Spiritual Traditions.

    Science.gov (United States)

    Garrett, Michael Tlanusta; Garrett, J. T.

    2002-01-01

    Presents a centering technique called "Ayeli," based on Cherokee spiritual traditions as a way of incorporating spirituality into counseling by helping clients identify where they are in their journey, where they want to be, and how they can get there. Relevant Native cultural traditions and meanings are explored. (Contains 25 references.) (GCP)

  20. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme Dynamics of Structures, sponsored by the Danish Technical Research Council. The planned...
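    The core of the random decrement technique can be sketched in a few lines: segments of the measured response that start at a trigger condition are averaged, and the average (the RD signature) estimates the free decay from which modal frequency and damping are fitted. The signal below is a synthetic one-mode response, and the level trigger is the simplest of several possible trigger conditions.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0.0, 20.0, 0.01)
# Synthetic one-mode response: lightly damped 1.5 Hz oscillation in noise.
y = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.05 * t) + 0.3 * rng.normal(size=t.size)

def random_decrement(y, level, seg_len):
    """Average the segments that start wherever the response exceeds
    `level`; the average (RD signature) estimates the free decay, from
    which modal frequency and damping can be fitted."""
    triggers = np.where(y[: y.size - seg_len] >= level)[0]
    segs = np.stack([y[i:i + seg_len] for i in triggers])
    return segs.mean(axis=0)

rd = random_decrement(y, level=y.std(), seg_len=200)
# Every averaged segment starts at or above the trigger level.
assert rd[0] >= y.std()
```

    Averaging cancels the zero-mean random part of the response while the deterministic decay from the common initial condition survives, which is the intuition behind the technique.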

  1. A systematic approach to multiphysics extensions of finite-element-based micromagnetic simulations: Nmag

    OpenAIRE

    Fischbacher, T.; Franchin, M.; Bordignon, G.; Fangohr, H.

    2007-01-01

    Extensions of the basic micromagnetic model that include effects such as spin-current interaction, diffusion of thermal energy or anisotropic magnetoresistance are often studied by performing simulations that use case-specific ad-hoc extensions of widely used software packages such as OOMMF or Magpar. We present the novel software framework 'Nmag' that handles specifications of micromagnetic systems at a sufficiently abstract level to enable users with little programming experience to automa...

  2. Case-based reasoning diagnostic technique based on multi-attribute similarity

    Energy Technology Data Exchange (ETDEWEB)

    Makoto, Takahashi [Tohoku University, Miyagi (Japan); Akio, Gofuku [Okayama University, Okayamaa (Japan)

    2014-08-15

    A case-based diagnostic technique has been developed based on multi-attribute similarity. A specific feature of the developed system is the use of multiple attributes of process signals for similarity evaluation, in order to retrieve a similar case stored in a case base. The present technique has been applied to measurement data from Monju with some simulated anomalies. The results of numerical experiments showed that the present technique can be utilized as one of the methods in a hybrid-type diagnosis system.
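    The retrieval step of such a system can be sketched as a weighted multi-attribute similarity search; the case base, attribute names, values, and weights below are invented for illustration and are not taken from the Monju application.

```python
# Hypothetical case base: each stored anomaly case keeps a few attributes
# extracted from process signals (names, values, and weights are invented).
CASES = {
    "pump cavitation":    {"mean": 0.2, "variance": 0.9, "trend": -0.1},
    "sensor drift":       {"mean": 0.7, "variance": 0.1, "trend": 0.8},
    "heater degradation": {"mean": 0.9, "variance": 0.3, "trend": 0.4},
}
WEIGHTS = {"mean": 1.0, "variance": 2.0, "trend": 1.5}

def similarity(a, b):
    """Multi-attribute similarity: inverse of the weighted Euclidean
    distance over all signal attributes."""
    d = sum(WEIGHTS[k] * (a[k] - b[k]) ** 2 for k in WEIGHTS) ** 0.5
    return 1.0 / (1.0 + d)

def retrieve(query):
    """Return the stored case most similar to the observed attributes."""
    return max(CASES, key=lambda name: similarity(query, CASES[name]))

observed = {"mean": 0.65, "variance": 0.15, "trend": 0.7}
assert retrieve(observed) == "sensor drift"
```

    Using several attributes at once is what distinguishes this from single-signal matching: two anomalies with similar mean behaviour can still be separated by their variance or trend.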

  3. The Real-Time Image Processing Technique Based on DSP

    Institute of Scientific and Technical Information of China (English)

    QI Chang; CHEN Yue-hua; HUANG Tian-shu

    2005-01-01

    This paper proposes a novel real-time image processing technique based on a digital signal processor (DSP). On the algorithmic side, the technique uses the second-generation wavelet transform (lifting-scheme WT), which has low computational complexity, for 2-D image data processing. Since the lifting-scheme WT processes 1-D data markedly better than 2-D data, this paper proposes a reformed processing method: transform the 2-D image data into a 1-D data sequence by a linearization method, then process the 1-D sequence with the lifting-scheme WT. The method changes the image convolution mode, which was based on the cross filtering of rows and columns. On the hardware side, the technique optimizes the DSP program structure to exploit the processing power of the on-chip memory. The experimental results show that the real-time image processing technique proposed in this paper can meet the real-time requirements of video-image transmission in the video surveillance system of an electric power utility, so the technique is a feasible and efficient DSP solution.
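    The lifting scheme can be illustrated with its simplest case, the Haar wavelet: a predict step forms detail coefficients from the odd samples and an update step forms the approximation from the even ones, all in place and with only additions and shifts. The 2-D-to-1-D linearization the paper proposes is shown here as a plain row-major ravel, which is one possible choice.

```python
import numpy as np

def lifting_haar_forward(x):
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    d = odd - even        # predict: detail = odd minus prediction from even
    s = even + d / 2      # update: approximation keeps the pairwise mean
    return s, d

def lifting_haar_inverse(s, d):
    even = s - d / 2      # undo the update step
    odd = even + d        # undo the predict step
    out = np.empty(s.size + d.size)
    out[0::2], out[1::2] = even, odd
    return out

image = np.arange(16.0).reshape(4, 4)
line = image.ravel()                  # 2-D -> 1-D linearization (row-major)
s, d = lifting_haar_forward(line)
assert np.allclose(lifting_haar_inverse(s, d), line)   # perfect reconstruction
```

    The in-place structure and integer-friendly arithmetic are what make lifting attractive on a DSP with limited on-chip memory.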

  4. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    Science.gov (United States)

    Ibrahim, Mohamed M.; Abdel Kader, Neamat S.; Zorkany, M.

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communication bandwidth. In this paper, a robust video multiple-watermarking technique is proposed to solve this problem. The technique is based on image interlacing: a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extraction domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of the technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communication bandwidth. PMID:25587570
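    The Arnold transform used here for watermark encryption/decryption is the cat map (x, y) -> (x + y, x + 2y) mod N on an N x N image; applying the inverse map the same number of times (or completing the map's period) restores the original. A toy 8 x 8 "watermark" illustrates this:

```python
import numpy as np

def arnold(img, iterations=1):
    """Arnold cat map scrambling: (x, y) -> (x + y, x + 2y) mod N."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        scr = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scr[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scr
    return out

def arnold_inverse(img, iterations=1):
    """Inverse map: (x, y) -> (2x - y, y - x) mod N."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        scr = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scr[(2 * x - y) % n, (y - x) % n] = out[x, y]
        out = scr
    return out

wm = np.arange(64).reshape(8, 8)        # toy 8x8 watermark
assert np.array_equal(arnold_inverse(arnold(wm, 3), 3), wm)
```

    The iteration count acts as the key: without it, the scrambled watermark extracted from the DWT domain stays unintelligible.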

  5. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    International Nuclear Information System (INIS)

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment

  6. An Observed Voting System Based On Biometric Technique

    Directory of Open Access Journals (Sweden)

    B. Devikiruba

    2015-08-01

    Full Text Available ABSTRACT This article describes a computational framework for studying biometric techniques that can run on almost any computer connected to an IP-based network. The paper discusses a system for protecting confidential information, which puts strong security demands on identification. Biometry provides a user-friendly method for this identification and is becoming a competitor to current identification mechanisms. The experimentation section focuses on biometric verification, specifically based on fingerprints. This article should be read as a warning to those thinking of using identification methods without first examining the technical opportunities for compromising the mechanisms and the associated legal consequences. The development is based on the Java language, which easily extends software packages useful for testing new control techniques.

  7. Knowledge based systems advanced concepts, techniques and applications

    CERN Document Server

    1997-01-01

    The field of knowledge-based systems (KBS) has expanded enormously during the last years, and many important techniques and tools are currently available. Applications of KBS range from medicine to engineering and aerospace.This book provides a selected set of state-of-the-art contributions that present advanced techniques, tools and applications. These contributions have been prepared by a group of eminent researchers and professionals in the field.The theoretical topics covered include: knowledge acquisition, machine learning, genetic algorithms, knowledge management and processing under unc

  8. Proposing a Wiki-Based Technique for Collaborative Essay Writing

    Directory of Open Access Journals (Sweden)

    Mabel Ortiz Navarrete

    2014-10-01

    Full Text Available This paper proposes a technique for students learning English as a foreign language to collaboratively write an argumentative essay in a wiki environment. A wiki environment and collaborative work both play an important role in academic writing tasks, but an appropriate and systematic work assignment is required to make good use of them. The technique proposed here for writing a collaborative essay mainly attempts to provide an effective way to enhance equal participation among group members, taking computer-mediated collaboration as its basis. Within this context, the students’ role is clearly defined and individual and collaborative tasks are explained.

  9. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    Science.gov (United States)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate the formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian motion gives P^2 = a^3, where P is the orbital period in years and a is the semi-major axis in astronomical units (AU). Keplerian motion holds on small scales such as the Solar System but not on large scales such as the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze the Keplerian motion of systems by stellar age to see whether Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods, such as probabilistic, linear, and proximity-based models. In probabilistic and statistical models of outliers, the parameters of a closed-form probability distribution are learned in order to detect the outliers. Linear models use regression-analysis-based techniques for detecting outliers. Proximity-based models use distance-based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density-based algorithms such as kernel density estimation. In this work, we use unsupervised learning algorithms with only the proximity-based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is that the ratio of planetary mass to stellar mass is less than 0.001. We present our statistical analysis of the outliers thus detected.
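    Of the proximity-based models mentioned, the k-nearest-neighbour score is the simplest: a point's outlier score is its distance to its k-th nearest neighbour. The two-dimensional synthetic data below (not real exoplanet parameters) shows an injected outlier receiving the largest score.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic stand-in for (log period, log mass-ratio) points:
# a tight cluster plus one injected outlier at index 50.
X = rng.normal(loc=[0.0, -3.5], scale=0.2, size=(50, 2))
X = np.vstack([X, [3.0, -0.5]])

def knn_outlier_scores(X, k=5):
    """Proximity-based score: distance to the k-th nearest neighbour."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    d.sort(axis=1)
    return d[:, k]  # column 0 is the zero self-distance

scores = knn_outlier_scores(X)
assert scores.argmax() == 50   # the injected point has the largest score
```

    Unlike the probabilistic models described above, this score needs no distributional assumption, which suits exploratory use on heterogeneous exoplanet catalogues.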

  10. An Efficient Image Compression Technique Based on Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Prof. Rajendra Kumar Patel

    2012-12-01

    Full Text Available The rapid growth of digital imaging applications, including desktop publishing, multimedia, teleconferencing, and high-definition visual media, has increased the need for effective and standardized image compression techniques. Digital images play a very important role in conveying detailed information. The key obstacle for many applications is the vast amount of data required to represent a digital image directly. The various processes of digitizing images to obtain the best quality, for clearer and more accurate information, lead to a requirement for more storage space and better storage and access mechanisms, whether in hardware or software. In this paper we concentrate mainly on this problem, reducing storage space while preserving the best possible image quality. State-of-the-art techniques can compress typical images to between 1/10 and 1/50 of their uncompressed size without visibly affecting image quality. From our study we observe that there is a need for a good image compression technique which provides better reduction in terms of storage and quality. Arithmetic coding is the best way of reducing encoded data, so in this paper we propose an arithmetic coding with Walsh transformation based image compression technique, which is an efficient means of reduction.
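    The arithmetic coding stage can be sketched with exact fractions for a toy three-symbol alphabet; the Walsh transform stage and the integer-renormalising coders used in practice are omitted, and the symbol probabilities are invented.

```python
from fractions import Fraction

# Toy static model: symbol probabilities for a three-letter alphabet.
PROBS = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 4)}

def cumulative(symbol):
    """Return the [low, high) probability subinterval of `symbol`."""
    lo = Fraction(0)
    for s, p in PROBS.items():
        if s == symbol:
            return lo, lo + p
        lo += p

def encode(msg):
    """Narrow [0, 1) by each symbol's subinterval; any number inside
    the final interval encodes the whole message."""
    lo, hi = Fraction(0), Fraction(1)
    for s in msg:
        s_lo, s_hi = cumulative(s)
        lo, hi = lo + (hi - lo) * s_lo, lo + (hi - lo) * s_hi
    return (lo + hi) / 2

def decode(code, length):
    out, lo, hi = [], Fraction(0), Fraction(1)
    for _ in range(length):
        for s in PROBS:
            s_lo, s_hi = cumulative(s)
            if lo + (hi - lo) * s_lo <= code < lo + (hi - lo) * s_hi:
                out.append(s)
                lo, hi = lo + (hi - lo) * s_lo, lo + (hi - lo) * s_hi
                break
    return "".join(out)

msg = "abcab"
assert decode(encode(msg), len(msg)) == msg
```

    The final interval width equals the product of the symbol probabilities, so frequent symbols cost fewer bits, which is why arithmetic coding approaches the entropy of the source.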

  11. Finding Within Cluster Dense Regions Using Distance Based Technique

    Directory of Open Access Journals (Sweden)

    Wesam Ashour

    2012-03-01

    Full Text Available One of the main categories in data clustering is density-based clustering. Density-based clustering techniques like DBSCAN are attractive because they can find arbitrarily shaped clusters along with noisy outliers. The main weakness of traditional density-based algorithms like DBSCAN is clustering data sets with different density levels. DBSCAN performs its calculations according to given parameters applied to all points in a data set, while the densities of the clusters in the data set may be totally different. The proposed algorithm overcomes this weakness of the traditional density-based algorithms. The algorithm starts by partitioning the data within a cluster into units based on a user parameter and computes the density of each unit separately. The algorithm then compares the results and merges neighboring units with close approximate density values into a new cluster. The experimental results of the simulation show that the proposed algorithm gives good results in finding clusters in data sets with clusters of different densities.
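    The algorithm's first step, partitioning a cluster into units and computing each unit's density separately, can be sketched with a histogram grid; the data, grid size, and cell layout below are invented, and the subsequent merge step is only described, not implemented.

```python
import numpy as np

rng = np.random.default_rng(4)
# One "cluster" containing a dense core and a sparse halo around it.
points = np.vstack([rng.normal(0.0, 0.3, size=(300, 2)),   # dense region
                    rng.uniform(-2.0, 2.0, size=(60, 2))])  # sparse region

# Step 1: partition into units (grid cells) via a user parameter
# (here 4 bins per axis) and compute each unit's density separately.
bins = 4
counts, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=bins,
                              range=[[-2, 2], [-2, 2]])
cell_area = (4.0 / bins) ** 2
density = counts / cell_area

# The four central units holding the dense core dominate the density map;
# a merge step would then join neighbouring units of similar density.
assert density.max() == density[1:3, 1:3].max()
```

    Comparing per-unit densities, rather than applying one global parameter as DBSCAN does, is what lets the approach separate regions of different density inside the same cluster.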

  12. Vibration based fault detection techniques for mechanical structures

    International Nuclear Information System (INIS)

    Fault detection techniques for mechanical structures and their applications have become more important in recent years in the field of structural health monitoring. The intention of this paper is to present available state-of-the-art methods that can be implemented on mechanical structures. Global methods that contribute to the detection, isolation and analysis of faults from changes in the vibration characteristics of the structure are presented. The techniques are based on the idea that modal frequencies, mode shapes and modal damping, as modal properties of the structure, can be determined as functions of its physical properties. Consequently, if a fault appears in a mechanical structure, it changes the physical properties, which in turn changes the modal properties of the structure. (Author)
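
The premise that changed physical properties shift modal properties can be illustrated with a single-degree-of-freedom oscillator; this is a textbook sketch with made-up values, not taken from the paper:

```python
import math

def natural_frequency_hz(stiffness, mass):
    """Undamped natural frequency f = sqrt(k/m) / (2*pi) of a 1-DOF oscillator."""
    return math.sqrt(stiffness / mass) / (2.0 * math.pi)

# A fault that reduces stiffness by 19% shifts the modal frequency down by 10%;
# such a shift is exactly what a vibration-based monitor looks for.
f_healthy = natural_frequency_hz(1.0e6, 10.0)   # hypothetical k [N/m], m [kg]
f_damaged = natural_frequency_hz(0.81e6, 10.0)
```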

  13. WAVELET-BASED WARPING TECHNIQUE FOR MOBILE DEVICES

    Directory of Open Access Journals (Sweden)

    Ekta Walia

    2014-07-01

    Full Text Available The role of digital images is increasing rapidly in mobile devices. They are used in many applications, including virtual tours, virtual reality and e-commerce. Such applications synthesize realistic-looking novel views of reference images on mobile devices using techniques like image-based rendering (IBR). However, with this increasing role of digital images comes the serious issue of processing large images, which requires considerable time. Hence, methods to compress these large images are very important. Wavelets are excellent data compression tools that can be used with IBR algorithms to generate novel views from compressed image data. This paper proposes a framework that uses a wavelet-based warping technique to render novel views of compressed images on mobile/handheld devices. The experiments, performed using the Android Development Tools (ADT), show that the proposed framework gives better results for large images in terms of rendering time.
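
A wavelet compression step of the kind such a framework relies on can be sketched with a one-level Haar transform. This is a generic illustration; the paper's actual wavelet basis and warping code are not shown:

```python
import numpy as np

def haar_step(x):
    """One level of the 1-D Haar transform: scaled averages, then differences."""
    pairs = x.reshape(-1, 2)
    avg = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    diff = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
    return np.concatenate([avg, diff])

def haar_inverse_step(y):
    """Invert one Haar level, interleaving reconstructed samples."""
    n = len(y) // 2
    avg, diff = y[:n], y[n:]
    out = np.empty(2 * n)
    out[0::2] = (avg + diff) / np.sqrt(2.0)
    out[1::2] = (avg - diff) / np.sqrt(2.0)
    return out

def compress(signal, keep=0.25):
    """Drop the smallest wavelet coefficients, keeping a `keep` fraction."""
    y = haar_step(np.asarray(signal, dtype=float))
    thresh = np.quantile(np.abs(y), 1.0 - keep)
    return haar_inverse_step(np.where(np.abs(y) >= thresh, y, 0.0))
```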

  14. A thermopneumatic micropump based on micro-engineering techniques

    OpenAIRE

    Pol, van der, P.; Lintel, van, H.T.G.; Elwenspoek, M; Fluitman, J.H.J.

    1990-01-01

    The design, working principle and realization of an electro-thermopneumatic liquid pump based on micro-engineering techniques are described. The pump, which is of the reciprocating displacement type, comprises a pump chamber, a thin silicon pump membrane and two silicon check valves to direct the flow. The dynamic pressure of an amount of gas contained in a cavity, controlled by resistive heating, actuates the pump membrane. The cavity, chambers, channels and valves are realized in silicon wa...

  15. Microanalysis of clay-based pigments by XRD techniques

    Czech Academy of Sciences Publication Activity Database

    Hradil, David; Bezdička, Petr; Hradilová, J.

    Catania : Technart, 2015. "O-48". [Technart 2015 : non-destructive and microanalytical techniques in art and cultural heritage. 27.04.2015-30.04.2015, Catania] R&D Projects: GA ČR GA14-22984S Keywords : micro-XRD * clay-based pigments * paintings Subject RIV: CA - Inorganic Chemistry http://technart2015.lns.infn.it/images/BoA.pdf

  16. WAVELET-BASED WARPING TECHNIQUE FOR MOBILE DEVICES

    OpenAIRE

    Ekta Walia; Vishal Verma,

    2014-01-01

    The role of digital images is increasing rapidly in mobile devices. They are used in many applications including virtual tours, virtual reality, e-commerce etc. Such applications synthesize realistic looking novel views of the reference images on mobile devices using the techniques like image-based rendering (IBR). However, with this increasing role of digital images comes the serious issue of processing large images which requires considerable time. Hence, methods to compress ...

  17. A microscopy technique based on bio-impedance sensors

    OpenAIRE

    Yúfera, A.; Huertas, Gloria; Olmo, Alberto

    2012-01-01

    A microscopy technique for cell-culture applications based on impedance sensors is proposed. The imaged signals are measured with the Electrical Cell-Substrate Impedance Spectroscopy (ECIS) technique, by identifying the cell-covered area. The proposed microscopy allows real-time monitoring inside the incubator, reducing the contamination risk from human manipulation. It requires specific circuits for impedance measurements, a two-dimensional sensor array (pixels), and electrical models to decode efficiently...

  18. Designing a Competency-Based New County Extension Personnel Training Program: A Novel Approach

    Science.gov (United States)

    Brodeur, Cheri Winton; Higgins, Cynthia; Galindo-Gonzalez, Sebastian; Craig, Diane D.; Haile, Tyann

    2011-01-01

    Voluntary county personnel turnover occurs for a multitude of reasons, including the lack of job satisfaction, organizational commitment, and job embeddedness and lack of proper training. Loss of personnel can be costly both economically and in terms of human capital. Retention of Extension professionals can be improved through proper training or…

  19. Quantum state tomography of orbital angular momentum photonics qubits via a projection-based technique

    CERN Document Server

    Nicolas, Adrien; Giacobino, Elisabeth; Maxein, Dominik; Laurat, Julien

    2014-01-01

    While measuring the orbital angular momentum state of bright light beams can be performed using imaging techniques, a full characterization at the single-photon level is challenging. For applications to quantum optics and quantum information science, such characterization is an essential capability. Here, we present a setup to perform the quantum state tomography of photonic qubits encoded in this degree of freedom. The method is based on a projective technique using spatial mode projection via fork holograms and single-mode fibers inserted into an interferometer. The alignment and calibration of the device is detailed as well as the measurement sequence to reconstruct the associated density matrix. Possible extensions to higher-dimensional spaces are discussed.
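
For the single-qubit case, reconstructing the density matrix from projective measurements reduces to linear inversion over Pauli expectation values. The sketch below shows this textbook version, not the authors' OAM-specific interferometric setup:

```python
import numpy as np

# Pauli basis for a single qubit
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def density_matrix(ex, ey, ez):
    """Linear-inversion tomography: rho = (I + <X>X + <Y>Y + <Z>Z) / 2."""
    return (I + ex * X + ey * Y + ez * Z) / 2.0

def expectations(rho):
    """Pauli expectation values, as estimated from projective measurements."""
    return tuple(np.real(np.trace(rho @ P)) for P in (X, Y, Z))
```

With `ex, ey, ez = 1, 0, 0` this reconstructs the |+> state, and `expectations` inverts the map, which verifies the round trip.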

  20. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    Science.gov (United States)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is one of the most commonly used spatial analyses. Many online map providers, such as Google Maps, Bing Maps and Yahoo Maps, offer geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face limitations when using them. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and the services search only by matching addresses against descriptive data. There are also limitations in how search results are displayed. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes integrating fuzzy techniques into the geocoding process to resolve them. To implement the proposed method, a web-based system was designed. In the proposed method, nearness to places is defined by fuzzy membership functions, and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides capabilities such as searching multi-part addresses, searching for places by their location, non-point representation of results, and displaying search results ranked by priority.
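
The fuzzy distance maps and an AND-type fuzzy overlay can be sketched as follows; the membership function, parameter values and place locations are illustrative assumptions, not the paper's:

```python
import numpy as np

def fuzzy_nearness(dist, d_half):
    """Fuzzy 'near' membership: 1 at distance 0, 0.5 at d_half, falling toward 0."""
    return 1.0 / (1.0 + (dist / d_half) ** 2)

def fuzzy_overlay(*maps):
    """AND-type fuzzy overlay: take the minimum membership at every cell."""
    return np.minimum.reduce(maps)

# Two hypothetical reference places on a small grid; the best cell is the one
# that is simultaneously near both of them.
yy, xx = np.mgrid[0:20, 0:20]
d1 = np.hypot(xx - 5, yy - 5)      # distance map to place 1
d2 = np.hypot(xx - 9, yy - 9)      # distance map to place 2
score = fuzzy_overlay(fuzzy_nearness(d1, 5.0), fuzzy_nearness(d2, 5.0))
best = np.unravel_index(np.argmax(score), score.shape)
```

The maximum of the overlay falls midway between the two places, which is the "simultaneously near both" behaviour the paper aims for.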

  1. Hydrocarbon microseepage mapping using signature based target detection techniques

    Science.gov (United States)

    Soydan, Hilal; Koz, Alper; Şebnem Düzgün, H.; Aydin Alatan, A.

    2015-10-01

    In this paper, we compare conventional methods for detecting hydrocarbon seepage anomalies with signature-based detection algorithms. The Crosta technique [1] is selected as the baseline for the conventional approach in the experimental comparisons. The Crosta technique utilizes the characteristic bands of the searched target for principal component transformation in order to determine the components characterizing the target of interest. The Desired Target Detection and Classification Algorithm (DTDCA), Spectral Matched Filter (SMF), and Normalized Correlation (NC) are employed for signature-based target detection. Signature-based target detection algorithms are applied to the whole spectrum, benefiting from the information stored in all spectral bands. The selected methods are applied to a multispectral Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image of the study region, with an atmospheric correction applied prior to running the algorithms. ASTER provides multispectral bands covering the visible, shortwave, and thermal infrared regions, which makes it a useful tool for the interpretation of areas with hydrocarbon anomalies. The exploration area is the Gemrik Anticline, located in South East Anatolia, Adıyaman, in the Bozova Oil Field, where microseeps can be observed with almost no vegetation cover. The spectral signatures collected with an Analytical Spectral Devices Inc. (ASD) spectrometer from the reference valley [2] have been utilized as input to the signature-based detection algorithms. The experiments have indicated that DTDCA and SMF outperform the Crosta technique by locating the microseepage patterns along the migration pathways with better contrast. On the other hand, NC has not been able to map the searched target with a visible distinction. It is concluded that signature-based algorithms can be more effective than conventional methods for the detection of microseepage-induced anomalies.
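
A spectral matched filter of this kind scores each pixel spectrum against the target signature under the background statistics. A minimal sketch follows; the regularisation constant and the normalisation (unity score for a pixel equal to the target) are our own choices:

```python
import numpy as np

def spectral_matched_filter(cube, target):
    """Score every pixel spectrum against a target signature.

    cube:   (num_pixels, num_bands) array of spectra
    target: (num_bands,) reference signature
    Scores are normalised so that a pixel equal to the target scores 1.
    """
    mu = cube.mean(axis=0)
    cov = np.cov(cube, rowvar=False) + 1e-6 * np.eye(cube.shape[1])
    w = np.linalg.solve(cov, target - mu)      # Sigma^-1 (s - mu)
    denom = (target - mu) @ w
    return (cube - mu) @ w / denom

rng = np.random.default_rng(0)
bg = rng.normal(0.0, 1.0, (200, 5))            # synthetic background spectra
target = np.full(5, 5.0)                       # synthetic target signature
cube = np.vstack([bg, target])                 # last pixel contains the target
scores = spectral_matched_filter(cube, target)
```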

  2. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    Science.gov (United States)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

    With the upsurge in financial accounting fraud experienced in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research and industry. The failure of organizations' internal auditing systems to identify accounting fraud has led to the use of specialized detection procedures, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since dealing with the large volumes and complexities of financial data poses a big challenge for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques to the detection of financial accounting fraud and proposes a framework for data-mining-based accounting fraud detection. This systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of the review show that data mining techniques such as logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.

  3. Regression based peak load forecasting using a transformation technique

    International Nuclear Information System (INIS)

    This paper presents a regression-based daily peak load forecasting method that uses a transformation technique. In order to forecast the load precisely throughout a year, seasonal load change, annual load growth and the latest daily load change must all be considered. To deal with these characteristics, a transformation technique is presented. The technique consists of a transformation function together with translation and reflection methods. The transformation function is estimated from the previous year's data points so that it converts them into a new set of data points while preserving the shape of the previous year's temperature-load relationship. The function is then slightly translated so that the transformed data points fit the temperature-load relationship of the current year. Finally, multivariate regression analysis on the latest daily loads and weather observations estimates the forecasting model. Large forecasting errors caused by the nonlinear weather-load characteristic in transitional seasons such as spring and fall are reduced. The performance of the technique, verified with simulations on actual load data of the Tokyo Electric Power Company, is also described.
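
A strongly simplified version of the transformation step (vertical translation only, estimated at matched temperatures) followed by the regression fit might look like this. All curves, units and parameters below are fabricated for illustration and do not reproduce the paper's transformation function:

```python
import numpy as np

def translate_to_current(prev_temp, prev_load, cur_temp, cur_load):
    """Translate last year's temperature-load points to this year's level.

    The shape of the temperature-load curve is preserved; only a vertical
    offset (annual load growth) is estimated at matched temperatures.
    """
    offset = (cur_load - np.interp(cur_temp, prev_temp, prev_load)).mean()
    return prev_load + offset

def fit_peak_load_model(temp, load, degree=2):
    """Plain polynomial regression of daily peak load on temperature."""
    return np.polyfit(temp, load, degree)

# Fabricated curves: last year's loads plus 3 units of annual growth.
prev_temp = np.linspace(0.0, 35.0, 50)
prev_load = 50.0 + 0.04 * (prev_temp - 18.0) ** 2
cur_temp = np.linspace(5.0, 30.0, 10)
cur_load = 53.0 + 0.04 * (cur_temp - 18.0) ** 2
translated = translate_to_current(prev_temp, prev_load, cur_temp, cur_load)
coef = fit_peak_load_model(np.concatenate([prev_temp, cur_temp]),
                           np.concatenate([translated, cur_load]))
```

The translated points let the model train on a full year's temperature range while matching the current year's load level.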

  4. The Roland Maze Project - school-based extensive air shower network

    International Nuclear Information System (INIS)

    We plan to construct a large-area network of extensive air shower detectors placed on the roofs of high school buildings in the city of Lodz. Detection points will be connected via the Internet to a central server, and their operation will be synchronized by GPS. The main scientific goal of the project is the study of ultra-high-energy cosmic rays. Using the existing town infrastructure (Internet, power supply, etc.) will significantly reduce the cost of the experiment. Engaging high school students in the research program should significantly increase their knowledge of science and modern technologies, and can be a very efficient way of popularising science. We performed simulations of the projected network's capability to register extensive air showers and reconstruct the energies of primary particles. Results of the simulations and the current status of the project's realisation are presented.

  5. Noninvasive in vivo glucose sensing using an iris based technique

    Science.gov (United States)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane, and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, a custom ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations fall within 20% of the reference values.

  6. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that incorporates Markov Random Fields (MRF), watershed segmentation and merging techniques for performing image segmentation and edge detection. MRF is used to obtain an initial estimate of the regions in the image under processing, where in the MRF model the gray level x at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmented result: an initial segmentation is obtained with the K-means clustering technique and the minimum-distance rule, after which the region process is modeled by MRF to obtain an image containing regions of different intensity. Starting from this, the gradient values of the image are calculated and a watershed technique is employed. The MRF step yields an image with distinct intensity regions that contains all the edge and region information; the watershed algorithm then improves the segmentation by superimposing a closed, accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merging process based on averaged mean values is employed. The final segmentation and edge detection result is one closed boundary per actual region in the image.
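
The K-means initialisation on gray levels can be sketched as follows. This is a plain 1-D K-means on pixel intensities producing the initial label field; the MRF refinement and watershed steps are not shown:

```python
import numpy as np

def kmeans_gray(image, k, iters=20):
    """1-D K-means on pixel intensities, giving the initial label field
    that the MRF region process would then refine."""
    pixels = image.ravel().astype(float)
    centers = np.linspace(pixels.min(), pixels.max(), k)  # spread initial centers
    for _ in range(iters):
        # assign each pixel to the nearest center (minimum-distance rule)
        labels = np.argmin(np.abs(pixels[:, None] - centers[None, :]), axis=1)
        # move each center to the mean of its assigned pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean()
    return labels.reshape(image.shape), centers

image = np.array([[10, 10, 200], [12, 11, 205], [10, 12, 199]])
labels, centers = kmeans_gray(image, 2)
```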

  7. Extensible Component Based Architecture for FLASH, A Massively Parallel, Multiphysics Simulation Code

    OpenAIRE

    Dubey, A.; Reid, L. B.; Weide, K.; Antypas, K.; Ganapathy, M. K.; Riley, K.; Sheeler, D.; Siegal, A

    2009-01-01

    FLASH is a publicly available high performance application code which has evolved into a modular, extensible software system from a collection of unconnected legacy codes. FLASH has been successful because its capabilities have been driven by the needs of scientific applications, without compromising maintainability, performance, and usability. In its newest incarnation, FLASH3 consists of inter-operable modules that can be combined to generate different applications. The FLASH architecture a...

  8. Human resource development for a community-based health extension program: a case study from Ethiopia

    OpenAIRE

    Teklehaimanot, Hailay D; Teklehaimanot, Awash

    2013-01-01

    Introduction Ethiopia is one of the sub-Saharan countries most affected by high disease burden, aggravated by a shortage and imbalance of human resources, geographical distance, and socioeconomic factors. In 2004, the government introduced the Health Extension Program (HEP), a primary care delivery strategy, to address the challenges and achieve the World Health Organization Millennium Development Goals (MDGs) within a context of limited resources. Case description The health system was refor...

  9. Architecture of an Open-Sourced, Extensible Data Warehouse Builder: InterBase 6 Data Warehouse Builder (IB-DWB)

    OpenAIRE

    Ling, Maurice HT; So, Chi Wai

    2003-01-01

    We report the development of an open-sourced data warehouse builder, InterBase Data Warehouse Builder (IB-DWB), based on Borland InterBase 6 Open Edition Database Server. InterBase 6 is used for its low maintenance and small footprint. IB-DWB is designed modularly and consists of 5 main components, Data Plug Platform, Discoverer Platform, Multi-Dimensional Cube Builder, and Query Supporter, bounded together by a Kernel. It is also an extensible system, made possible by the Data Plug Platform ...

  10. Research on Deep Joints and Lode Extension Based on Digital Borehole Camera Technology

    Directory of Open Access Journals (Sweden)

    Han Zengqiang

    2015-09-01

    Full Text Available Structural characteristics of the rock and orebody in deep boreholes are obtained by borehole camera technology. By investigating the joints and fissures in the Shapinggou molybdenum mine, the dominant orientations of joint fissures in the surrounding rock and orebody were statistically analyzed. Applying the theory of metallogeny and geostatistics, the relationship between joint fissures and the lode's extension direction is explored. The results indicate that joints in the orebody of borehole ZK61 have only one dominant orientation, SE126° ∠68°, whereas the dominant orientations of joints in the surrounding rock were SE118° ∠73°, SW225° ∠70°, SE122° ∠65° and NE79° ∠63°. A preliminary conclusion is that the lode's extension direction is specific and is influenced by the joints of the surrounding rock. Results from other boreholes generally agree well with those from ZK61, suggesting that the analysis reliably reflects the lode's extension properties and provides an important reference for deep ore prospecting.

  11. New modulation-based watermarking technique for video

    Science.gov (United States)

    Lemma, Aweke; van der Veen, Michiel; Celik, Mehmet

    2006-02-01

    Successful watermarking algorithms have already been developed for various applications, ranging from meta-data tagging to forensic tracking. Nevertheless, it is worthwhile to develop alternative watermarking techniques that provide a broader basis for meeting emerging services, usage models and security threats. To this end, we propose a new multiplicative watermarking technique for video, which is based on the principles of our successful MASK audio watermark. Audio-MASK embeds the watermark by modulating the short-time envelope of the audio signal and performs detection using a simple envelope detector followed by a SPOMF (symmetrical phase-only matched filter). Video-MASK takes a similar approach and modulates the image luminance envelope. In addition, it incorporates a simple model to account for the luminance sensitivity of the HVS (human visual system). Preliminary tests show the algorithm's transparency and robustness to lossy compression.
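
The multiplicative envelope-modulation principle can be sketched as follows. This sketch uses plain correlation for detection rather than the SPOMF detector described above, and all parameters and signals are illustrative:

```python
import numpy as np

def embed(signal, watermark, alpha):
    """Multiplicative watermark: modulate the signal envelope, y = x * (1 + a*w)."""
    return signal * (1.0 + alpha * watermark)

def detect(signal, watermark):
    """Correlate the magnitude envelope with the candidate watermark sequence."""
    env = np.abs(signal)
    env = env - env.mean()
    w = watermark - watermark.mean()
    return float(env @ w / (np.linalg.norm(env) * np.linalg.norm(w)))

rng = np.random.default_rng(1)
host = rng.normal(0.0, 1.0, 4096)            # stand-in for a media signal
wm = rng.choice([-1.0, 1.0], 4096)           # binary watermark sequence
wrong = rng.choice([-1.0, 1.0], 4096)        # an unrelated sequence
marked = embed(host, wm, alpha=0.2)
```

The correct watermark yields a clearly positive correlation on the marked signal, while an unrelated sequence correlates near zero.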

  12. An Improved Face Recognition Technique Based on Modular LPCA Approach

    Directory of Open Access Journals (Sweden)

    Mathu S.S. Kumar

    2011-01-01

    Full Text Available Problem statement: A face identification algorithm based on modular localized variation by the eigen-subspace technique, also called modular localized principal component analysis, is presented in this study. Approach: The face imagery was partitioned into smaller sub-divisions from a predefined neighborhood, and these were ultimately fused to acquire many sets of features. Since some facial features of an individual do not change even when pose and illumination vary, the proposed method handles these variations. Results: The proposed feature selection module significantly enhanced identification precision on standard face databases when compared to conventional and modular PCA techniques. Conclusion: The proposed algorithm, compared with the conventional PCA and modular PCA algorithms, has enhanced recognition accuracy for face imagery with illumination, expression and pose variations.

  13. Studying Satellite Image Quality Based on the Fusion Techniques

    CERN Document Server

    Al-Wassai, Firouz Abdullah; Al-Zaky, Ali A

    2011-01-01

    Various methods can be used to produce high-resolution multispectral images from a high-resolution panchromatic image (PAN) and low-resolution multispectral images (MS), mostly at the pixel level. However, the jury is still out on the benefits of a fused image compared to its original images. There is also a lack of measures for assessing the objective quality of the spatial resolution of the fusion methods. Therefore, an objective assessment of spatial resolution quality for fused images is required. This study attempts to develop a new qualitative assessment to evaluate the spatial quality of pan-sharpened images using several spatial quality metrics. The paper also compares various image fusion techniques based on pixel- and feature-level fusion.

  14. Feature-based multiresolution techniques for product design

    Institute of Scientific and Technical Information of China (English)

    LEE Sang Hun; LEE Kunwoo

    2006-01-01

    3D computer-aided design (CAD) systems based on feature-based solid modelling technique have been widely spread and used for product design. However, when part models associated with features are used in various downstream applications,simplified models in various levels of detail (LODs) are frequently more desirable than the full details of the parts. In particular,the need for feature-based multiresolution representation of a solid model representing an object at multiple LODs in the feature unit is increasing for engineering tasks. One challenge is to generate valid models at various LODs after an arbitrary rearrangement of features using a certain LOD criterion, because composite Boolean operations consisting of union and subtraction are not commutative. The other challenges are to devise proper topological framework for multiresolution representation, to suggest more reasonable LOD criteria, and to extend applications. This paper surveys the recent research on these issues.

  15. Extensive analysis of potentialities and limitations of a maximum cross-correlation technique for surface circulation by using realistic ocean model simulations

    Science.gov (United States)

    Doronzo, Bartolomeo; Taddei, Stefano; Brandini, Carlo; Fattorini, Maria

    2015-08-01

    As shown in the literature, ocean surface circulation can be estimated from sequential satellite imagery by using the maximum cross-correlation (MCC) technique. This approach is very promising since it offers the potential to acquire synoptic-scale coverage of the surface currents on a quasi-continuous temporal basis. However, MCC has also many limits due, for example, to cloud cover or the assumption that Sea Surface Temperature (SST) or other surface parameters from satellite imagery are considered as conservative passive tracers. Also, since MCC can detect only advective flows, it might not work properly in shallow water, where local heating and cooling, upwelling and other small-scale processes have a strong influence. Another limitation of the MCC technique is the impossibility of detecting currents moving along surface temperature fronts. The accuracy and reliability of MCC can be analysed by comparing the estimated velocities with those measured by in situ instrumentation, but the low number of experimental measurements does not allow a systematic statistical study of the potentials and limitations of the method. Instead, an extensive analysis of these features can be done by applying the MCC to synthetic imagery obtained from a realistic numerical ocean model that takes into account most physical phenomena. In this paper a multi-window (MW-) MCC technique is proposed, and its application to synthetic imagery obtained by a regional high-resolution implementation of the Regional Ocean Modeling System (ROMS) is discussed. An application of the MW-MCC algorithm to a real case and a comparison with experimental measurements are then shown.
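
The core MCC step, finding the shift of a sub-window that maximises the normalised cross-correlation between two sequential fields, can be sketched as follows; the data here are synthetic, not satellite imagery:

```python
import numpy as np

def mcc_displacement(img0, img1, win, max_shift):
    """Estimate the displacement of window `win` (a pair of slices) between
    two sequential fields by maximising normalised cross-correlation."""
    ys, xs = win
    patch = img0[ys, xs]
    p = patch - patch.mean()
    best, best_shift = -2.0, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            cand = img1[ys.start + dy:ys.stop + dy, xs.start + dx:xs.stop + dx]
            c = cand - cand.mean()
            r = (p * c).sum() / np.sqrt((p * p).sum() * (c * c).sum())
            if r > best:
                best, best_shift = r, (dy, dx)
    return best_shift, best

# Synthetic SST-like field advected by a known (dy, dx) = (2, 3) "current".
rng = np.random.default_rng(2)
field = rng.normal(0.0, 1.0, (64, 64))
moved = np.roll(np.roll(field, 2, axis=0), 3, axis=1)
shift, score = mcc_displacement(field, moved, (slice(20, 36), slice(20, 36)), 5)
```

Recovering the imposed shift with a correlation near 1 is exactly the behaviour that degrades when the tracer is not conservative, as the paper discusses.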

  16. Herd-scale measurements of methane emissions from cattle grazing extensive sub-tropical grasslands using the open-path laser technique.

    Science.gov (United States)

    Tomkins, N W; Charmley, E

    2015-12-01

    Methane (CH4) emissions associated with beef production systems in northern Australia are yet to be quantified. Methodologies are available to measure emissions, but their application in extensive grazing environments is challenging. A micrometeorological methodology for estimating herd-scale emissions using an indirect open-path spectroscopic technique and an atmospheric dispersion model is described. The methodology was deployed on five cattle properties across Queensland and the Northern Territory, with measurements conducted on two occasions at one site. On each deployment, data were collected every 10 min for up to 7 h a day over 4 to 16 days. To increase the atmospheric concentration of CH4 to measurable levels, cattle were confined to a known area around water points from ~0800 to 1600 h, during which time measurements of wind statistics and line-averaged CH4 concentration were taken. Filtering to remove erroneous data accounted for 35% of total observations. For five of the six deployments, CH4 emissions were within the expected range of 0.4 to 0.6 g/kg BW; at one site, emissions were ~2 times the expected values. There was small but consistent variation with time of day, although for some deployments measurements taken early in the day tended to be higher than at other times. There was a weak linear relationship (R² = 0.47) between animal BW and CH4 emission per kg BW. Where it was possible to compare emissions in the early and late dry season at one site, it was speculated that the higher emissions in the late dry season may be attributable to poorer diet quality. It is concluded that the micrometeorological methodology using open-path lasers can be successfully deployed in extensive grazing conditions to directly measure CH4 emissions from cattle at a herd scale. PMID:26290115

  17. Transformer-based design techniques for oscillators and frequency dividers

    CERN Document Server

    Luong, Howard Cam

    2016-01-01

    This book provides in-depth coverage of transformer-based design techniques that enable CMOS oscillators and frequency dividers to achieve state-of-the-art performance.  Design, optimization, and measured performance of oscillators and frequency dividers for different applications are discussed in detail, focusing on not only ultra-low supply voltage but also ultra-wide frequency tuning range and locking range.  This book will be an invaluable reference for anyone working or interested in CMOS radio-frequency or mm-Wave integrated circuits and systems.

  18. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    into the operation of the grid-connected power converters. This paper describes a quasi-passive method for estimating the line impedance of the electricity distribution network. The method uses a model-based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi-passive behaviour of the proposed method comes from combining the non-intrusive behaviour of passive methods with the better accuracy of active methods. The simulation results reveal the good accuracy of the proposed method.

  19. Coupled cavity model based on the mode matching technique

    CERN Document Server

    Ayzatsky, M I

    2015-01-01

    We have developed a mode matching technique based on using the eigenmodes of circular cavities and the eigenwaves of circular waveguides as the basis functions for calculating the properties of nonuniform disc-loaded waveguides. We obtained exact infinite systems of coupled equations, which can be reduced by making some assumptions. Under this procedure we can obtain more exact parameters of nonuniform equivalent circuits by solving the appropriate algebraic systems. These parameters of the equivalent circuits are functions of both the geometric sizes and the frequency. Moreover, under this approach all the quantities used have a physical interpretation. We call this approach the coupled cavity model.

  20. Simultaneous algebraic reconstruction technique based on guided image filtering.

    Science.gov (United States)

    Ji, Dongjiang; Qu, Gangrong; Liu, Baodong

    2016-07-11

    The challenge of computed tomography is to reconstruct high-quality images from few-view projections. Using a prior guidance image, guided image filtering smooths images while preserving edge features. The prior guidance image can be incorporated into the image reconstruction process to improve image quality. We propose a new simultaneous algebraic reconstruction technique based on guided image filtering. Specifically, the prior guidance image is updated during the reconstruction process, merging information iteratively. To validate the algorithm's practicality and efficiency, experiments were performed with numerical phantom projection data and real projection data. The results demonstrate that the proposed method is effective and efficient for nondestructive testing and rock mechanics. PMID:27410859
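
The SART core (without the guided-image-filtering prior, which the paper adds on top of it) can be sketched for a generic nonnegative system matrix as follows:

```python
import numpy as np

def sart(A, b, iters=500, lam=1.0):
    """Simultaneous Algebraic Reconstruction Technique for A @ x = b.

    Each sweep updates all rays at once:
        x += lam * (A.T @ ((b - A @ x) / row_sums)) / col_sums
    The row and column sums of the nonnegative system matrix A act as the
    usual SART normalisation weights.
    """
    row = A.sum(axis=1)
    col = A.sum(axis=0)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x += lam * (A.T @ ((b - A @ x) / row)) / col
    return x

# Tiny consistent system standing in for a projection geometry.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
x_rec = sart(A, A @ x_true)
```

In the paper's method, a guided-filter step would be interleaved with these sweeps to regularise the intermediate image.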

  1. A Comparative Analysis of Exemplar Based and Wavelet Based Inpainting Technique

    Directory of Open Access Journals (Sweden)

    Vaibhav V Nalawade

    2012-06-01

    Full Text Available Image inpainting is the process of filling in missing regions of an image so as to preserve its overall continuity; it is the manipulation and modification of an image in a form that is not easily detected. Digital image inpainting is a relatively new area of research, but numerous different approaches to the inpainting problem have been proposed since the concept was first introduced. This paper compares two separate techniques, exemplar-based inpainting and wavelet-based inpainting, each with a different set of characteristics. The algorithms analyzed under the exemplar technique are large-object removal by exemplar-based inpainting (Criminisi's) and modified exemplar inpainting (Cheng's). The algorithm analyzed under the wavelet technique is Chen's visual image inpainting method. A number of examples on real and synthetic images are demonstrated to compare the results of the different algorithms using both qualitative and quantitative parameters.

  2. Radiation synthesized protein-based nanoparticles: A technique overview

    International Nuclear Information System (INIS)

    Seeking alternative routes for protein engineering, a novel technique – radiation-induced synthesis of protein nanoparticles – that achieves size-controlled particles with preserved bioactivity has recently been reported. This work aimed to evaluate different process conditions to optimize the technique and provide an overview of it using γ-irradiation. Papain was used as a model protease, and the samples were irradiated in a gamma cell irradiator in phosphate buffer (pH=7.0) containing ethanol (0–35%). The dose effect was evaluated by exposure to distinct γ-irradiation doses (2.5, 5, 7.5 and 10 kGy), and scale-up experiments involving distinct protein concentrations (12.5–50 mg mL−1) were also performed. Characterization involved size monitoring using dynamic light scattering. Bityrosine detection was performed using fluorescence measurements in order to provide experimental evidence of the mechanism involved. The best dose effects with regard to size were achieved at 10 kGy, and no relevant changes were observed as a function of papain concentration, highlighting a very broad operational concentration range. Bityrosine changes were identified in the samples as a function of the process, confirming that such linkages play an important role in nanoparticle formation. - Highlights: • Synthesis of protein-based nanoparticles by γ-irradiation. • Optimization of the technique. • Overview of the mechanism involved in nanoparticle formation. • Engineered papain nanoparticles for biomedical applications

  3. Extension of theoretically based DNB prediction to high void fraction, intermediate mass flow rate, and rod bundles

    International Nuclear Information System (INIS)

    The present work extends a theoretically based DNB predictive procedure so that it is applicable at high void fractions and low velocities. This theoretically based DNB predictive procedure was originally developed for round tubes by Weisman and Pei and in its original range of applicability, is as accurate as most empirical correlations. Highly favorable comparisons of the present extension with experimental data were obtained. The present work also develops a procedure for applying this DNB prediction procedure to rod bundles. The results of this study show that application of this theoretically based DNB prediction procedure to rod bundles with and without mixing vane grids leads to good agreement with experimental observations

  4. Risk-based maintenance-Techniques and applications

    International Nuclear Information System (INIS)

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to make use of knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more rigorously, in order to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce the probability of equipment failure and the consequences of failure. In this paper, risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions

  5. DIDACTIC BASES OF EXPLICIT AND IMPLICIT TRAINING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Nail Iskanderov

    2015-08-01

    Full Text Available The article examines the evolution of scientific and pedagogical thought in recent decades, a trend moving from issues of improving the organization of the learning process through increased efficiency of the educational process toward the problem of understanding the nature of mental processes. It analyzes the state of psychological and pedagogical teaching methods and summarizes instructional methods associated with the implicit and explicit formation of knowledge. It develops a scheme, based on the internal logic of the relationships of neural ensembles, that allows the stage-by-stage formation and development of a frame-based and conceptual organization of knowledge in the process of learning new information. It considers the features and classification of explicit and implicit training methods and techniques and the possibility of using these methods in teaching, shows the similarities and differences between the terms "concept", "frame" and "cognitom", and explains the insight into implicit and explicit teaching methodology.

  6. Designing on ICT reconstruction software based on DSP techniques

    International Nuclear Information System (INIS)

    The convolution back projection (CBP) algorithm is generally used to perform CT image reconstruction in ICT, typically on a PC or workstation. To give CT reconstruction software multi-platform capability, a CT reconstruction method based on modern digital signal processor (DSP) techniques is proposed and realized in this paper. A hardware system based on TI's C6701 DSP processor was selected to support the software. The CT reconstruction software was written entirely in assembly language specific to the DSP hardware. The software runs on TI's C6701 EVM board with CT data as input, and produces CT images that satisfy practical demands. (authors)

  7. Evolutionary Based Techniques for Fault Tolerant Field Programmable Gate Arrays

    Science.gov (United States)

    Larchev, Gregory V.; Lohn, Jason D.

    2006-01-01

    The use of SRAM-based Field Programmable Gate Arrays (FPGAs) is becoming more and more prevalent in space applications. Commercial-grade FPGAs are potentially susceptible to permanently debilitating Single-Event Latchups (SELs). Repair methods based on Evolutionary Algorithms may be applied to FPGA circuits to enable successful fault recovery. This paper presents the experimental results of applying such methods to repair four commonly used circuits (quadrature decoder, 3-by-3-bit multiplier, 3-by-3-bit adder, 440-7 decoder) into which a number of simulated faults have been introduced. The results suggest that evolutionary repair techniques can improve the process of fault recovery when used instead of or as a supplement to Triple Modular Redundancy (TMR), which is currently the predominant method for mitigating FPGA faults.
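As a toy illustration of the evolutionary-repair idea (not the authors' actual FPGA tooling), a (1+1) evolutionary algorithm can mutate a bitstring "configuration" until it matches a target specification; the bit-matching fitness here is a stand-in for counting passed circuit test vectors:

```python
import random

def evolve_repair(target, generations=5000, seed=0):
    """(1+1) evolutionary algorithm: mutate a random bitstring toward `target`.

    `target` stands in for the configuration of a correctly functioning
    circuit; fitness counts matching bits (a proxy for passed test vectors).
    """
    rng = random.Random(seed)
    n = len(target)
    genome = [rng.randint(0, 1) for _ in range(n)]

    def fitness(g):
        return sum(a == b for a, b in zip(g, target))

    best = fitness(genome)
    for _ in range(generations):
        # flip each bit independently with probability 1/n
        child = [b ^ (rng.random() < 1.0 / n) for b in genome]
        f = fitness(child)
        if f >= best:      # accept equal-or-better offspring
            genome, best = child, f
        if best == n:      # all test vectors pass: "repaired"
            break
    return genome, best
```

In the real setting, the genome would encode the FPGA configuration and fitness would come from exercising the damaged circuit, but the accept/mutate loop has the same shape.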

  8. Principles and applications of neutron based inspection techniques

    International Nuclear Information System (INIS)

    Neutron based explosive inspection systems can detect a wide variety of substances of importance, for a variety of purposes from national security threats (e.g., nuclear materials, explosives, narcotics) to customs duties, shipment control and validation, and for protection of the environment. The inspection is generally founded on the nuclear interactions of the neutrons with the various nuclides present and the detection of resultant characteristic emissions. These can be discrete gamma lines resulting from the thermal (n,γ) neutron capture process or inelastic neutron scattering (n,n'γ) occurring with fast neutrons. The two types of reactions are generally complementary. The capture process provides energetic and highly penetrating gamma rays in most inorganic substances and in hydrogen, while fast neutron inelastic scattering provides relatively strong gamma-ray signatures in light elements such as carbon and oxygen. In some specific important cases, though, unique signatures are provided by the neutron capture process in light elements such as nitrogen, where unusually high energy gamma rays are produced. This forms the basis for key explosive detection techniques. The detection of nuclear materials, both fissionable (e.g., 238U) and fissile (e.g., 235U), are generally based on the fissions induced by the probing neutrons and detecting one or more of the unique signatures of the fission process. These include prompt and delayed neutrons and prompt and delayed gamma rays. These signatures are not discrete in energy (typically they are continua) but temporally and energetically significantly different from the background, thus making them readily distinguishable. The penetrability of fast neutrons as probes and the gamma rays and fission neutrons as signatures make neutron interrogation applicable for large conveyances such as cars, trucks and marine containers. The neutron-based techniques can be used in a variety of scenarios and operational modes. They can

  9. Principles and applications of neutron based inspection techniques

    International Nuclear Information System (INIS)

    Neutron based explosive inspection systems can detect a wide variety of substances of importance for a variety of purposes, from national security threats (e.g., nuclear materials, explosives, narcotics) to customs dutiable goods and hazardous substances affecting the environment. The inspection is generally founded on the nuclear interactions of the neutrons with the various nuclides present and the detection of the resultant characteristic emissions. These can be discrete γ lines resulting from the thermal (n, γ) neutron capture process or inelastic neutron scattering (n, n'γ) occurring with fast neutrons. The two types of reactions are generally complementary. The capture process provides energetic and highly penetrating γ rays in most inorganic substances and in hydrogen. Fast-neutron inelastic scattering provides relatively strong γ-ray signatures in light elements such as carbon and oxygen. In some specific important cases, unique signatures are provided by the neutron (n, γ) process in light elements such as nitrogen, where unusually high-energy γ rays are produced. This forms the basis for key explosive detection techniques. The detection of nuclear materials, both fissionable (e.g., 238U) and fissile (e.g., 235U), is generally based on the fissions induced by the probing neutrons and the detection of one or more of the unique signatures of the fission process. These include prompt and delayed neutrons and prompt and delayed γ rays. These signatures are not discrete in energy (typically they are continua) but are temporally and energetically significantly different from the background, thus making them readily distinguishable. The penetrability of fast neutrons as probes, and of the γ rays and fission neutrons as signatures, makes neutron interrogation applicable to the inspection of large conveyances such as cars, trucks, and marine containers. Neutron based inspection techniques have broad applications. They can be used as stand-alone for complete scans of objects

  10. RBF-based technique for statistical demodulation of pathological tremor.

    Science.gov (United States)

    Gianfelici, Francesco

    2013-10-01

    This paper presents an innovative technique based on the joint approximation capabilities of radial basis function (RBF) networks and the estimation capability of the multivariate iterated Hilbert transform (IHT) for the statistical demodulation of pathological tremor from electromyography (EMG) signals in patients with Parkinson's disease. We define a stochastic model of the multichannel high-density surface EMG by means of RBF networks applied to the reconstruction of the stochastic process (characterizing the disease) modeled by the multivariate relationships generated by the Karhunen-Loève transform in Hilbert spaces. Next, we perform a demodulation of the entire random field by means of the estimation capability of the multivariate IHT in a statistical setting. The proposed method is applied to both simulated signals and data recorded from three Parkinsonian patients, and the results show that the amplitude modulation components of the tremor oscillation can be estimated with signal-to-noise ratio close to 30 dB with root-mean-square error for the estimates of the tremor instantaneous frequency. Additionally, comparisons with a large number of techniques based on all combinations of the RBF, extreme learning machine, backpropagation and support vector machine used in the first step of the algorithm, and the IHT, empirical mode decomposition, multiband energy separation algorithm, periodic algebraic separation and energy demodulation used in the second step, clearly show the effectiveness of our technique. These results show that the proposed approach is a potentially useful tool for advanced neurorehabilitation technologies that aim at tremor characterization and suppression. PMID:24808594

  11. Multiplex single-base extension typing to identify nuclear genes required for RNA editing in plant organelles

    OpenAIRE

    Takenaka, Mizuki; Brennicke, Axel

    2008-01-01

    We developed a multiplex single-base extension single-nucleotide polymorphism-typing procedure for screening large numbers of plants for mutations in mitochondrial RNA editing. The high sensitivity of the approach detects changes in the RNA editing status generated in total cellular cDNA from pooled RNA preparations of up to 50 green plants. The method has been employed to tag several nuclear encoded genes required for RNA editing at specific sites in mitochondria of Arabidopsis thaliana. Thi...

  12. On The Multi-Hop Extension of Energy-Efficient WSN Time Synchronization Based on Time-Translating Gateways

    OpenAIRE

    Liao, Qimeng; Kim, Kyeong Soo

    2016-01-01

    We report preliminary results of a simulation study on the multi-hop extension of the recently-proposed energy-efficient wireless sensor network time synchronization scheme based on time-translating gateways. Unlike the single-hop case, in multi-hop time synchronization a sensor node sends measurement data to a head node through gateways which translate the timestamp values of the received measurement data. Through simulations for two-hop time synchronization, we analyze the impact of the add...

  13. Semantic-based technique for thai documents plagiarism detection

    Directory of Open Access Journals (Sweden)

    Sorawat Prapanitisatian

    2014-03-01

    Full Text Available Plagiarism is the act of taking another person's writing or ideas without referring to the source of information. It is one of the major problems in educational institutions. A number of plagiarism detection tools are available on the Internet; however, few of them work well. Typically, they use a simple method for plagiarism detection, e.g. string matching. The main weakness of this method is that it cannot detect plagiarism when the author replaces words with synonyms. This paper therefore presents a new technique for semantic-based plagiarism detection using Semantic Role Labeling (SRL) and term weighting. SRL is deployed in order to calculate semantic-based similarity. The main difference from existing frameworks is that terms in a sentence are weighted dynamically depending on their roles in the sentence, e.g. subject, verb or object. This technique makes the plagiarism detection mechanism more efficient than existing systems even when the positions of terms in a sentence are reordered. The experimental results show that the proposed method can detect plagiarized documents more effectively than the existing methods, Anti-kobpae, Turnit-in and traditional Semantic Role Labeling.
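The role-dependent term weighting can be sketched as follows. The role labels are assumed to come from an external SRL tool, and the weight values are purely illustrative, not the paper's:

```python
import math
from collections import Counter

# Hypothetical role weights -- the paper's actual values are not given here.
ROLE_WEIGHTS = {"subject": 1.5, "verb": 2.0, "object": 1.5}

def weighted_vector(tokens):
    """Build a term vector from (word, role) pairs, weighting by role."""
    vec = Counter()
    for word, role in tokens:
        vec[word] += ROLE_WEIGHTS.get(role, 1.0)
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Because the vectors are bags of role-weighted terms, reordering terms within a sentence does not change the similarity score, matching the robustness the abstract claims.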

  14. Generation of Quasi-Gaussian Pulses Based on Correlation Techniques

    Directory of Open Access Journals (Sweden)

    POHOATA, S.

    2012-02-01

    Full Text Available Gaussian pulses have mostly been used within communications, where several applications can be emphasized: mobile telephony (GSM, where GMSK signals are used) as well as UWB communications, where short-period pulses based on the Gaussian waveform are generated. Since the Gaussian function is a theoretical concept that cannot be realized exactly in a physical implementation, it must be expressed using various functions that admit physical implementations. New techniques for generating Gaussian pulse responses with good precision are proposed and investigated in this paper. The second- and third-order derivatives of the Gaussian pulse response are accurately generated. The third-order derivative is composed of four individual rectangular pulses of fixed amplitudes, which are easily generated by standard techniques. In order to generate pulses able to satisfy spectral mask requirements, an adequate filter must be applied. The paper presents a comparative analysis based on the relative error and the energy spectra of the proposed pulses.
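The four-rectangle construction can be illustrated numerically: the analytic third-order Gaussian derivative has four lobes, separated by its zero crossings at 0 and ±√3·σ, and each lobe is replaced by a rectangle of fixed amplitude. Taking the lobe's mean value as that amplitude is an illustrative choice; the paper's amplitudes and pulse widths may differ:

```python
import numpy as np

def gauss_d3(t, sigma=1.0):
    """Analytic third derivative of exp(-t^2 / (2 sigma^2))."""
    u = t / sigma
    return (3 * u - u ** 3) * np.exp(-u ** 2 / 2) / sigma ** 3

def rect_approximation(sigma=1.0, t_max=4.0, n=4000):
    """Approximate gauss_d3 by four rectangular pulses, one per lobe."""
    t = np.linspace(-t_max, t_max, n)
    y = gauss_d3(t, sigma)
    # lobe boundaries: the zero crossings of the third derivative
    edges = np.array([-t_max, -np.sqrt(3) * sigma, 0.0,
                      np.sqrt(3) * sigma, t_max])
    lobe = np.clip(np.searchsorted(edges, t, side="right") - 1, 0, 3)
    approx = np.zeros_like(y)
    amps = []
    for k in range(4):
        amp = y[lobe == k].mean()      # fixed amplitude per rectangle
        approx[lobe == k] = amp
        amps.append(amp)
    return t, y, approx, amps
```

The four amplitudes alternate in sign, reflecting the alternating lobes of the derivative that the rectangular pulses emulate.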

  15. Hash Based Least Significant Bit Technique For Video Steganography

    Directory of Open Access Journals (Sweden)

    Prof. Dr. P. R. Deshmukh ,

    2014-01-01

    Full Text Available The hash-based least significant bit technique for video steganography deals with hiding a secret message or information within a video. Steganography is covered writing: it includes processes that conceal information within other data and also conceal the fact that a secret message is being sent. Steganography is the art of secret communication or the science of invisible communication. In this paper a hash-based least significant bit (LSB) technique for video steganography is proposed, whose main goal is to embed secret information in a particular video file and then extract it using a stego key or password. LSB insertion is used for steganography so as to embed data in the cover video by changing only the lowest-order bit, a change that is not visible. Data hiding is the process of embedding information in a video without degrading its perceptual quality. The proposed method involves two metrics, the Peak Signal to Noise Ratio (PSNR) and the Mean Square Error (MSE), measured between the original and steganographic video files across all video frames, where distortion is measured using PSNR. A hash function is used to select the positions for insertion of the bits of the secret message into the LSB bits.
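A minimal single-frame sketch of the idea: a hash of the stego key picks the starting position for LSB insertion, and PSNR/MSE quantify the distortion. The hash-to-offset scheme below is an illustrative assumption, not the paper's exact construction:

```python
import hashlib
import numpy as np

def _start(key, capacity):
    # hash of the stego key -> deterministic embedding offset (assumed scheme)
    h = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(h[:4], "big") % capacity

def embed(frame, message, key):
    """Hide `message` bytes in the LSBs of a uint8 frame, offset by hash(key)."""
    flat = frame.flatten()              # copy; the original frame is untouched
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    start = _start(key, flat.size - len(bits))
    for i, b in enumerate(bits):
        flat[start + i] = (flat[start + i] & 0xFE) | b
    return flat.reshape(frame.shape)

def extract(stego, n_bytes, key):
    """Recover `n_bytes` of hidden data using the same key-derived offset."""
    flat = stego.flatten()
    start = _start(key, flat.size - n_bytes * 8)
    bits = [int(flat[start + i]) & 1 for i in range(n_bytes * 8)]
    return bytes(sum(bits[j * 8 + i] << i for i in range(8))
                 for j in range(n_bytes))

def psnr(original, stego):
    """Peak Signal to Noise Ratio in dB between two 8-bit frames."""
    mse = np.mean((original.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```

Since only LSBs change, the per-frame MSE is tiny and the PSNR stays high, which is exactly the imperceptibility property the abstract measures.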

  16. Galaxy Cluster Mass Reconstruction Project: I. Methods and first results on galaxy-based techniques

    CERN Document Server

    Old, L; Pearce, F R; Croton, D; Muldrew, S I; Muñoz-Cuartas, J C; Gifford, D; Gray, M E; von der Linden, A; Mamon, G A; Merrifield, M R; Müller, V; Pearson, R J; Ponman, T J; Saro, A; Sepp, T; Sifón, C; Tempel, E; Tundo, E; Wang, Y O; Wojtak, R

    2014-01-01

    This paper is the first in a series in which we perform an extensive comparison of various galaxy-based cluster mass estimation techniques that utilise the positions, velocities and colours of galaxies. Our primary aim is to test the performance of these cluster mass estimation techniques on a diverse set of models that will increase in complexity. We begin by providing participating methods with data from a simple model that delivers idealised clusters, enabling us to quantify the underlying scatter intrinsic to these mass estimation techniques. The mock catalogue is based on a Halo Occupation Distribution (HOD) model that assumes spherical Navarro, Frenk and White (NFW) haloes truncated at R_200, with no substructure nor colour segregation, and with isotropic, isothermal Maxwellian velocities. We find that, above 10^14 M_solar, recovered cluster masses are correlated with the true underlying cluster mass with an intrinsic scatter of typically a factor of two. Below 10^14 M_solar, the scatter rises as the nu...
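As a flavor of the simplest family of galaxy-based estimators being compared, a virial-style dynamical mass can be computed from line-of-sight velocities and projected radii. This is a textbook illustration, not one of the paper's participating methods, and the isotropy prefactor is an assumption:

```python
import numpy as np

G = 4.301e-9  # gravitational constant in Mpc * (km/s)^2 / Msun

def virial_mass(v_los, r_proj):
    """Crude dynamical cluster mass from line-of-sight velocities (km/s)
    and projected galaxy radii (Mpc): M ~ 3 * sigma^2 * R / G."""
    sigma2 = np.var(v_los, ddof=1)   # line-of-sight velocity dispersion^2
    R = np.mean(r_proj)              # characteristic projected radius
    return 3.0 * sigma2 * R / G      # factor 3 assumes isotropic orbits
```

For a dispersion of ~1000 km/s at ~1 Mpc this lands near 10^15 Msun, the massive-cluster regime discussed in the abstract.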

  17. Testing of Large Diameter Fresnel Optics for Space Based Observations of Extensive Air Showers

    Science.gov (United States)

    Adams, James H.; Christl, Mark J.; Young, Roy M.

    2011-01-01

    The JEM-EUSO mission will detect extensive air showers produced by extreme energy cosmic rays. It operates from the ISS, looking down on Earth's nighttime atmosphere to detect the nitrogen fluorescence and Cherenkov light produced by the charged particles in the EAS. The JEM-EUSO science objectives require a large field of view and sensitivity to energies below 50 EeV, and the instrument must fit within available ISS resources. The JEM-EUSO optic module uses three large-diameter, thin plastic lenses with Fresnel surfaces to meet the instrument requirements. A bread-board model of the optic has been manufactured and has undergone preliminary tests. We report the results of optical performance tests and evaluate the present capability to manufacture these optical elements.

  18. An extensive survey of dayside diffuse aurora based on optical observations at Yellow River Station

    CERN Document Server

    Han, De-Sheng; Liu, Jian-Jun; Qiu, Qi; Keika, K; Hu, Ze-Jun; Liu, Jun-Ming; Hu, Hong-Qiao; Yang, Hui-Gen

    2016-01-01

    By using 7 years of optical auroral observations obtained at Yellow River Station (magnetic latitude 76.24°N) at Ny-Alesund, Svalbard, we performed the first extensive survey of dayside diffuse auroras (DDAs) and acquired the following observational results. (1) The DDAs can be classified into two broad categories, i.e., unstructured and structured DDAs. The unstructured DDAs are mainly distributed in the morning and afternoon, but the structured DDAs predominantly occur around magnetic local noon (MLN). (2) The unstructured DDAs observed in the morning and afternoon present clearly different properties; the afternoon ones are much more stable and seldom show a pulsating property. (3) The DDAs are more easily observed during geomagnetically quiet times. (4) The structured DDAs mainly show patchy, stripy, and irregular forms and are often pulsating and drifting. The drifting directions are mostly westward (with speed ~5 km/s), but there are cases showing eastward or poleward drifting. (5) The ...

  19. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    Energy Technology Data Exchange (ETDEWEB)

    Taverner, Thomas; Karpievitch, Yuliya; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-09-15

    Motivation: The size and complex nature of LC-MS proteomics data sets motivate the development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing functionality by including their own algorithms under the Add-On tab. Availability: DanteR and its associated user guide are available for download at http://omics.pnl.gov/software/. For Windows, a single click automatically installs DanteR along with the R programming environment. For Linux and Mac OS X, users must first install R and then follow instructions on the DanteR web site for package installation.

  20. AvoPlot: An extensible scientific plotting tool based on matplotlib

    Directory of Open Access Journals (Sweden)

    Nial Peters

    2014-02-01

    Full Text Available AvoPlot is a simple-to-use graphical plotting program written in Python, making extensive use of the matplotlib plotting library. It can be found at http://code.google.com/p/avoplot/. In addition to providing a user-friendly interface to the powerful capabilities of the matplotlib library, it also offers users the possibility of extending its functionality by creating plug-ins. These can import specific types of data into the interface and also provide new tools for manipulating them. In this respect, AvoPlot is a convenient platform for researchers to build their own data analysis tools on top of, as well as being a useful standalone program.

  1. A restrained-torque-based motion instructor: forearm flexion/extension-driving exoskeleton

    Science.gov (United States)

    Nishimura, Takuya; Nomura, Yoshihiko; Sakamoto, Ryota

    2013-01-01

    When learning complicated movements by ourselves, we encounter problems such as self-rightness: judging our own motion lacks detail and objectivity, and may cause us to miss essential features or even distort them. Thus, we sometimes fall into the habit of performing inappropriate motions. To solve these problems, or at least to alleviate them as far as possible, we have developed mechanical man-machine interfaces to support the learning of motions such as cultural gestures and sports forms. One of the promising interfaces is a wearable exoskeleton mechanical system. As a first attempt, we have made a prototype of a 2-link, 1-DOF rotational elbow-joint interface applied to teaching forearm extension-flexion motions, and have found its potential for teaching the initiation and continuation of elbow flexion.

  2. Enhancing the effectiveness of IST through risk-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower cost. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies for low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  3. Stopped depletion region extension in an AlGaN/GaN-HEMT: A new technique for improving high-frequency performance

    Science.gov (United States)

    Asad, Mohsen; Rahimian, Morteza

    2015-08-01

    We present a novel structure for AlGaN/GaN high electron mobility transistors. The structure consists of a multi-recess AlGaN barrier layer and a recessed metal ring (RBRM-HEMT). Narrowing the barrier thickness between the gate and the source/drain regions minimizes the depletion-region extension, which leads to smaller gate-drain (C_GD) and gate-source (C_GS) capacitances. This technique shows great promise for high-frequency and high-power applications. In high-frequency operation, the cut-off frequency (f_T) and the maximum oscillation frequency (f_max) of the RBRM-HEMT are found to be 133 GHz and 216 GHz respectively, significantly higher than the 94 GHz and 175 GHz obtained for the conventional GaN HEMT (C-HEMT). In addition, a more uniform, less crowded electric field is obtained under the gate close to the drain side due to the recessed metal-ring structure. A 128% improvement in breakdown voltage (V_BR) is achieved compared to the C-HEMT. Consequently, the maximum output power density (P_max) is increased from 11.3 W/mm in the C-HEMT to 24.4 W/mm for the RBRM-HEMT.
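The link between the reduced capacitances and the reported cut-off frequency follows the standard small-signal relation f_T = g_m / (2π(C_GS + C_GD)). The numerical values below are illustrative assumptions, not taken from the paper, chosen only to show the order of magnitude:

```python
import math

def cutoff_frequency(gm, cgs, cgd):
    """Standard small-signal cut-off frequency: f_T = gm / (2*pi*(C_GS + C_GD))."""
    return gm / (2.0 * math.pi * (cgs + cgd))

# Illustrative (assumed) values: gm = 0.3 S, C_GS = 300 fF, C_GD = 60 fF
f_t = cutoff_frequency(0.3, 300e-15, 60e-15)
print(f"f_T = {f_t / 1e9:.0f} GHz")
```

Shrinking C_GS and C_GD at fixed transconductance raises f_T directly, which is why suppressing the depletion-region extension improves high-frequency performance.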

  4. A New Rerouting Technique for the Extensor Pollicis Longus in Palliative Treatment for Wrist and Finger Extension Paralysis Resulting From Radial Nerve and C5C6C7 Root Injury.

    Science.gov (United States)

    Laravine, Jennifer; Cambon-Binder, Adeline; Belkheyar, Zoubir

    2016-03-01

    Wrist and finger extension paralysis is a consequence of an injury to the radial nerve or the C5C6C7 roots. Despite these 2 different levels of lesions, palliative treatment for this type of paralysis depends on the same tendon transfers. A large majority of the patients are able to compensate for a deficiency of the extension of the wrist and fingers. However, a deficiency in the opening of the first web space, which could be responsible for transfers to the abductor pollicis longus, the extensor pollicis brevis, and the extensor pollicis longus (EPL), frequently exists. The aim of this work was to evaluate the feasibility of a new EPL rerouting technique outside of Lister's tubercle. Another aim was to verify whether this technique allows a better opening of the thumb-index pinch in this type of paralysis. In the first part, we performed an anatomic study comparing the EPL rerouting technique and the frequently used technique for wrist and finger extension paralyses. In the second part, we present 2 clinical cases in which this new technique will be practiced. Preliminary results during this study favor the EPL rerouting technique. This is a simple and reproducible technique that allows for good opening of the first web space in the treatment of wrist and finger extension paralysis. PMID:26709570

  5. Evaluations of mosquito age grading techniques based on morphological changes.

    Science.gov (United States)

    Hugo, L E; Quick-Miles, S; Kay, B H; Ryan, P A

    2008-05-01

    Evaluations were made of the accuracy and practicality of mosquito age grading methods based on changes to mosquito morphology; including the Detinova ovarian tracheation, midgut meconium, Polovodova ovariole dilatation, ovarian injection, and daily growth line methods. Laboratory maintained Aedes vigilax (Skuse) and Culex annulirostris (Skuse) females of known chronological and physiological ages were used for these assessments. Application of the Detinova technique to laboratory reared Ae. vigilax females in a blinded trial enabled the successful identification of nulliparous and parous females in 83.7-89.8% of specimens. The success rate for identifying nulliparous females increased to 87.8-98.0% when observations of ovarian tracheation were combined with observations of the presence of midgut meconium. However, application of the Polovodova method only enabled 57.5% of nulliparous, 1-parous, 2-parous, and 3-parous Ae. vigilax females to be correctly classified, and ovarian injections were found to be unfeasible. Poor correlation was observed between the number of growth lines per phragma and the calendar age of laboratory reared Ae. vigilax females. In summary, morphological age grading methods that offer simple two-category predictions (ovarian tracheation and midgut meconium methods) were found to provide high-accuracy classifications, whereas methods that offer the separation of multiple age categories (ovariolar dilatation and growth line methods) were found to be extremely difficult and of low accuracy. The usefulness of the morphology-based methods is discussed in view of the availability of new mosquito age grading techniques based on cuticular hydrocarbon and gene transcription changes. PMID:18533427

  6. An Empirical Comparative Study of Checklist based and Ad Hoc Code Reading Techniques in a Distributed Groupware Environment

    CERN Document Server

    Akinola, Olalekan S

    2009-01-01

    Software inspection is a necessary and important tool for software quality assurance. Since it was introduced by Fagan at IBM in 1976, arguments exist as to which method should be adopted to carry out the exercise, whether it should be paper-based or tool-based, and what reading technique should be used on the inspection document. Extensive work has been done to determine the effectiveness of reviewers in a paper-based environment when using ad hoc and checklist reading techniques. In this work, we take software inspection research further by examining whether there is any significant difference in the defect detection effectiveness of reviewers when they use either ad hoc or checklist reading techniques in a distributed groupware environment. Twenty final-year undergraduate students of computer science, divided into ad hoc and checklist reviewer groups of ten members each, were employed to inspect a medium-sized Java code synchronously on groupware deployed on the Internet. The data obtained were...

  7. Cryptanalysis of a technique to transform discrete logarithm based cryptosystems into identity-based cryptosystems

    OpenAIRE

    TANG, QIANG; MITCHELL, CHRIS J.

    2005-01-01

    In this paper we analyse a technique designed to transform any discrete logarithm based cryptosystem into an identity-based cryptosystem. The transformation method is claimed to be efficient and secure and to eliminate the need to invent new identity-based cryptosystems. However, we show that the identity-based cryptosystem created by the proposed transformation method suffers from a number of security and efficiency problems.

  8. Knee extension isometric torque production differences based on verbal motivation given to introverted and extroverted female children.

    Science.gov (United States)

    McWhorter, J Wesley; Landers, Merrill; Young, Daniel; Puentedura, E Louie; Hickman, Robbin A; Brooksby, Candi; Liveratti, Marc; Taylor, Lisa

    2011-08-01

    To date, little research has been conducted to test the efficacy of different forms of motivation based on a female child's personality type. The purpose of this study was to evaluate the ability of female children to perform a maximal knee extension isometric torque test with varying forms of motivation, based on the child's personality type (introvert vs. extrovert). The subjects were asked to perform a maximal isometric knee extension test under three different conditions: 1) with no verbal motivation, 2) with verbal motivation from the evaluator only, and 3) with verbal motivation from a group of their peers and the evaluator combined. A 2×3 mixed ANOVA was significant for an interaction (F2,62 = 17.530). Post hoc analysis of the extroverted group revealed that scores with verbal motivation from the evaluator, or from the evaluator plus peers, were significantly higher than without verbal motivation. Results suggest that verbal motivation has a varying effect on isometric knee extension torque production in female children with different personality types. Extroverted girls perform better with motivation, whereas introverted girls perform better without motivation from others. PMID:20812856

  9. Protein-Protein Interactions Prediction Based on Iterative Clique Extension with Gene Ontology Filtering

    OpenAIRE

    Lei Yang; Xianglong Tang

    2014-01-01

    Cliques (maximal complete subnets) in protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPI complement the data defection from biological experiments. However, clique-based predicting methods only depend on the topology of network. The false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based m...
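    The core of a clique-based predictor can be sketched compactly: enumerate maximal cliques, then propose an interaction wherever a protein is adjacent to all but one member of a clique. The sketch below (plain Bron-Kerbosch, no Gene Ontology filtering, graph given as an adjacency dict) is an illustrative simplification of this class of methods, not the authors' exact algorithm.

```python
def maximal_cliques(adj):
    # Bron-Kerbosch enumeration of maximal cliques (no pivoting)
    cliques = []
    def bk(R, P, X):
        if not P and not X:
            cliques.append(R)
            return
        for v in list(P):
            bk(R | {v}, P & adj[v], X & adj[v])
            P.remove(v)
            X.add(v)
    bk(set(), set(adj), set())
    return cliques

def predict_edges(adj, min_clique=3):
    # propose edge (v, u) when v interacts with every clique member except u
    predicted = set()
    for C in maximal_cliques(adj):
        if len(C) < min_clique:
            continue
        for v in set(adj) - C:
            missing = [u for u in C if u not in adj[v]]
            if len(missing) == 1:
                predicted.add(frozenset({v, missing[0]}))
    return predicted
```

    On a toy network where D interacts with A and B but the edge D-C is missing from the clique {A, B, C}, the sketch proposes D-C as a candidate interaction.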

  10. A New Particle Swarm Optimization Based Stock Market Prediction Technique

    Directory of Open Access Journals (Sweden)

    Essam El. Seidy

    2016-04-01

    Full Text Available Over the last years, the average person's interest in the stock market has grown dramatically. This interest has been amplified by advances in technology that have opened up the international stock market, so that nowadays anybody can own stocks and use many types of software to pursue the desired profit with minimum risk. Consequently, the analysis and prediction of future values and trends of the financial markets have received more attention, and due to large applications in different business transactions, stock market prediction has become a critical topic of research. In this paper, our earlier presented particle swarm optimization with center of mass technique (PSOCoM is applied to the task of training an adaptive linear combiner to form a new stock market prediction model. This prediction model is used with some common indicators to maximize the return and minimize the risk for the stock market. The experimental results show that the proposed technique is superior to the other PSO-based models according to prediction accuracy.
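    For readers unfamiliar with the baseline, a minimal textbook PSO can be sketched as below. This is plain global-best PSO, not the authors' center-of-mass variant (PSOCoM), and all parameter values are illustrative.

```python
import random

def pso(f, dim, n_particles=20, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    # minimize f over [lo, hi]^dim with global-best particle swarm optimization
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal best positions
    pbest = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]               # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx < gbest:
                    gbest, G = fx, X[i][:]
    return G, gbest
```

    On a smooth test function such as the sphere function, this converges to near-zero cost within a couple hundred iterations.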

  11. Magnetic Flux Leakage Sensing-Based Steel Cable NDE Technique

    Directory of Open Access Journals (Sweden)

    Seunghee Park

    2014-01-01

    Full Text Available Nondestructive evaluation (NDE of steel cables in long span bridges is necessary to prevent structural failure. Thus, an automated cable monitoring system is proposed that uses a suitable NDE technique and a cable-climbing robot. A magnetic flux leakage- (MFL- based inspection system was applied to monitor the condition of cables. This inspection system measures magnetic flux to detect the local faults (LF of steel cable. To verify the feasibility of the proposed damage detection technique, an 8-channel MFL sensor head prototype was designed and fabricated. A steel cable bunch specimen with several types of damage was fabricated and scanned by the MFL sensor head to measure the magnetic flux density of the specimen. To interpret the condition of the steel cable, magnetic flux signals were used to determine the locations of the flaws and the levels of damage. Measured signals from the damaged specimen were compared with thresholds that were set for objective decision-making. In addition, the measured magnetic flux signals were visualized as a 3D MFL map for intuitive cable monitoring. Finally, the results were compared with information on actual inflicted damages, to confirm the accuracy and effectiveness of the proposed cable monitoring method.
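    The decision step described above, comparing measured flux signals against thresholds, reduces to a few lines. The sketch below flags samples deviating from the signal mean by more than k standard deviations; this simple rule is an assumption for illustration, not the authors' calibrated thresholds.

```python
def detect_faults(signal, k=3.0):
    # flag sample indices whose flux deviates from the mean by more than k*std
    n = len(signal)
    mean = sum(signal) / n
    std = (sum((x - mean) ** 2 for x in signal) / n) ** 0.5
    return [i for i, x in enumerate(signal) if abs(x - mean) > k * std]
```

    A local fault shows up as an isolated spike in the flux trace, so the detector returns its index.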

  12. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided. PMID:11890304
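    A minimal version of the idea, cumulative sums of residuals over a covariate with the observed sup-statistic compared against Gaussian-multiplier realizations, can be sketched as follows. The multiplier scheme here is a simplified stand-in for the paper's simulation of the limiting Gaussian process.

```python
import random

def cumsum_process(residuals, covariate):
    # cumulative sum of residuals, ordered by the covariate
    order = sorted(range(len(residuals)), key=lambda i: covariate[i])
    w, out = 0.0, []
    for i in order:
        w += residuals[i]
        out.append(w)
    return out

def sup_statistic(path):
    return max(abs(v) for v in path)

def multiplier_pvalue(residuals, covariate, n_sim=500, seed=1):
    # compare observed sup|W| against Gaussian-multiplier realizations
    rng = random.Random(seed)
    obs = sup_statistic(cumsum_process(residuals, covariate))
    count = 0
    for _ in range(n_sim):
        perturbed = [r * rng.gauss(0, 1) for r in residuals]
        if sup_statistic(cumsum_process(perturbed, covariate)) >= obs:
            count += 1
    return count / n_sim
```

    Residuals that trend with the covariate (model misspecification) produce a far larger observed sup-statistic than the multiplier realizations, giving a small p-value.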

  13. Detecting Molecular Properties by Various Laser-Based Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hsin, Tse-Ming [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of the dimer geometry. Direct observation of vibrational peaks and evolution of coumarin 153 in the electronic excited state was demonstrated by using the fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibration peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with the biomimetic containers-liposomes, allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interferences from glass surfaces. The result showed a wide distribution of the enzyme reactivity. Protein structural variation is one of the major reasons that are responsible for this highly heterogeneous behavior.

  14. NVC Based Model for Selecting Effective Requirement Elicitation Technique

    Directory of Open Access Journals (Sweden)

    Md. Rizwan Beg

    2012-10-01

    Full Text Available The Requirement Engineering process starts from the gathering of requirements, i.e., requirements elicitation. Requirements elicitation (RE) is the base building block for a software project and has very high impact on subsequent design and build phases as well. Accurately capturing system requirements is the major factor in the failure of most software projects. Due to the criticality and impact of this phase, it is very important to perform the requirements elicitation in no less than a perfect manner. One of the most difficult jobs for the elicitor is to select an appropriate technique for eliciting the requirement. Interviewing and interacting with stakeholders during the elicitation process is a communication-intensive activity involving verbal and non-verbal communication (NVC). The elicitor should give emphasis to non-verbal communication along with verbal communication so that requirements are recorded more efficiently and effectively. In this paper we propose a model in which stakeholders are classified by observing non-verbal communication, and use it as a base for elicitation technique selection. We also propose an efficient plan for requirements elicitation which intends to overcome the constraints faced by the elicitor.

  15. Chemistry research and chemical techniques based on research reactors

    International Nuclear Information System (INIS)

    Chemistry has occupied an important position historically in the sciences associated with nuclear reactors and it continues to play a prominent role in reactor-based research investigations. This Panel of prominent scientists in the field was convened by the International Atomic Energy Agency (IAEA) to assess the present state of such chemistry research for the information of its Member States and others interested in the subject. There are two ways in which chemistry is associated with nuclear reactors: (a) general applications to many scientific fields in which chemical techniques are involved as essential service functions; and (b) specific applications of reactor facilities to the solution of chemical problems themselves. Twenty years of basic research with nuclear reactors have demonstrated a very widespread, and still increasing, demand for radioisotopes and isotopically-labelled molecules in all fields of the physical and biological sciences. Similarly, the determination of the elemental composition of a material through the analytical technique of activation analysis can be applied throughout experimental science. Refs, figs and tabs

  16. Investigations on landmine detection by neutron-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Csikai, J. E-mail: csikai@delfin.klte.hu; Doczi, R.; Kiraly, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify the antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m{sup 2}/1.5 min; the reflection cross section of thermal neutrons rendered the determination of equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and through it the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires the investigations on the angular distribution of the 6.13 MeV gamma-ray emitted in the {sup 16}O(n,n'{gamma}) reaction. These results, in addition to the identification of landmines, render the improvement of the non-intrusive neutron methods possible.

  17. Astronomical Image Compression Techniques Based on ACC and KLT Coder

    Science.gov (United States)

    Schindler, J.; Páta, P.; Klíma, M.; Fliegel, K.

    This paper deals with a compression of image data in applications in astronomy. Astronomical images have typical specific properties -- high grayscale bit depth, size, noise occurrence and special processing algorithms. They belong to the class of scientific images. Their processing and compression is quite different from the classical approach of multimedia image processing. The database of images from BOOTES (Burst Observer and Optical Transient Exploring System) has been chosen as a source of the testing signal. BOOTES is a Czech-Spanish robotic telescope for observing AGN (active galactic nuclei) and the optical transient of GRB (gamma ray bursts) searching. This paper discusses an approach based on an analysis of statistical properties of image data. A comparison of two irrelevancy reduction methods is presented from a scientific (astrometric and photometric) point of view. The first method is based on a statistical approach, using the Karhunen-Loève transform (KLT) with uniform quantization in the spectral domain. The second technique is derived from wavelet decomposition with adaptive selection of used prediction coefficients. Finally, the comparison of three redundancy reduction methods is discussed. Multimedia format JPEG2000 and HCOMPRESS, designed especially for astronomical images, are compared with the new Astronomical Context Coder (ACC) coder based on adaptive median regression.
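    The KLT/PCA stage of such a coder can be illustrated in a few lines: treat image blocks as vectors, diagonalize their covariance, and keep only the top eigenvector coefficients. This sketch omits quantization and entropy coding and is not the ACC coder itself; block size and the number of retained components are illustrative.

```python
import numpy as np

def klt_compress(image, block=4, keep=4):
    # split the image into block x block tiles, flatten each tile to a vector
    h, w = image.shape
    blocks = (image.reshape(h // block, block, w // block, block)
                   .swapaxes(1, 2).reshape(-1, block * block).astype(float))
    mean = blocks.mean(axis=0)
    centered = blocks - mean
    # Karhunen-Loeve transform: eigenbasis of the block covariance matrix
    cov = centered.T @ centered / len(blocks)
    eigvals, eigvecs = np.linalg.eigh(cov)
    basis = eigvecs[:, ::-1][:, :keep]        # top-`keep` eigenvectors
    coeffs = centered @ basis                 # compressed representation
    # reconstruct from the retained coefficients and reassemble the tiles
    recon = (coeffs @ basis.T + mean).reshape(h // block, w // block, block, block)
    recon = recon.swapaxes(1, 2).reshape(h, w)
    return coeffs, recon
```

    With `keep` equal to the full block dimension the transform is lossless; smaller values trade reconstruction error for fewer coefficients.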

  18. Astronomical Image Compression Techniques Based on ACC and KLT Coder

    Directory of Open Access Journals (Sweden)

    J. Schindler

    2011-01-01

    Full Text Available This paper deals with a compression of image data in applications in astronomy. Astronomical images have typical specific properties — high grayscale bit depth, size, noise occurrence and special processing algorithms. They belong to the class of scientific images. Their processing and compression is quite different from the classical approach of multimedia image processing. The database of images from BOOTES (Burst Observer and Optical Transient Exploring System) has been chosen as a source of the testing signal. BOOTES is a Czech-Spanish robotic telescope for observing AGN (active galactic nuclei) and the optical transient of GRB (gamma ray bursts) searching. This paper discusses an approach based on an analysis of statistical properties of image data. A comparison of two irrelevancy reduction methods is presented from a scientific (astrometric and photometric) point of view. The first method is based on a statistical approach, using the Karhunen-Loeve transform (KLT) with uniform quantization in the spectral domain. The second technique is derived from wavelet decomposition with adaptive selection of used prediction coefficients. Finally, the comparison of three redundancy reduction methods is discussed. Multimedia format JPEG2000 and HCOMPRESS, designed especially for astronomical images, are compared with the new Astronomical Context Coder (ACC) coder based on adaptive median regression.

  19. Crop Yield Forecasted Model Based on Time Series Techniques

    Institute of Scientific and Technical Information of China (English)

    Li Hong-ying; Hou Yan-lin; Zhou Yong-juan; Zhao Hui-ming

    2012-01-01

    Traditional studies on potential yield mainly referred to attainable yield: the maximum yield which could be reached by a crop in a given environment. A new concept of crop yield under average climate conditions, affected by the advancement of science and technology, was defined in this paper. Based on the new concept of crop yield, time series techniques relying on past yield data were employed to set up a forecasting model. The model was tested using average grain yields of Liaoning Province in China from 1949 to 2005. The testing combined dynamic n-choosing and micro tendency rectification, and the average forecasting error was 1.24%. Where the trend line of yield change indicated a turning point, an inflexion model was used to handle the turning point.
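    A minimal forecaster in the spirit described, an autoregressive model fitted to past yields by least squares, can be sketched as follows. Plain AR least squares is a generic stand-in here; the paper's dynamic n-choosing and micro tendency rectification are not reproduced.

```python
import numpy as np

def ar_forecast(series, p=2, steps=1):
    # fit y[t] = a1*y[t-1] + ... + ap*y[t-p] + c by least squares, then
    # roll the model forward `steps` periods
    y = np.asarray(series, dtype=float)
    X = np.column_stack([y[p - 1 - j:len(y) - 1 - j] for j in range(p)]
                        + [np.ones(len(y) - p)])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    hist, out = list(y), []
    for _ in range(steps):
        x = np.array(hist[-1:-p - 1:-1] + [1.0])  # most recent p values + intercept
        nxt = float(coef @ x)
        out.append(nxt)
        hist.append(nxt)
    return out
```

    On a series with a perfect linear trend the fitted AR(2) model extrapolates the trend exactly, which makes the sketch easy to sanity-check.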

  20. Whitelists Based Multiple Filtering Techniques in SCADA Sensor Networks

    Directory of Open Access Journals (Sweden)

    DongHo Kang

    2014-01-01

    Full Text Available The Internet of Things (IoT) consists of several tiny devices connected together to form a collaborative computing environment. Recently, IoT technologies have begun to merge with supervisory control and data acquisition (SCADA) sensor networks to more efficiently gather and analyze real-time data from sensors in industrial environments. But SCADA sensor networks are becoming more and more vulnerable to cyber-attacks due to increased connectivity. To safely adopt IoT technologies in SCADA environments, it is important to improve the security of SCADA sensor networks. In this paper we propose a multiple filtering technique based on whitelists to detect illegitimate packets. Our proposed system detects the traffic of network and application protocol attacks with a set of whitelists collected from normal traffic.
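    The multiple-filtering idea can be sketched directly: a packet must pass a network-level whitelist and then an application-level command whitelist. The field names and the Modbus-style command names below are illustrative assumptions, not the paper's rule format.

```python
def make_filter(address_whitelist, command_whitelist):
    # build a two-stage whitelist filter over packets represented as dicts
    def allowed(packet):
        # stage 1: network-level whitelist on (src, dst, port) tuples
        if (packet["src"], packet["dst"], packet["port"]) not in address_whitelist:
            return False
        # stage 2: application-level whitelist on the protocol command
        return packet["cmd"] in command_whitelist
    return allowed
```

    Both whitelists would be learned from normal traffic in the proposed system; here they are supplied directly.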

  1. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    Directory of Open Access Journals (Sweden)

    Om Parkash

    2015-10-01

    Full Text Available Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed.

  2. An RSS based location estimation technique for cognitive relay networks

    KAUST Repository

    Qaraqe, Khalid A.

    2010-11-01

    In this paper, a received signal strength (RSS) based location estimation method is proposed for a cooperative wireless relay network where the relay is a cognitive radio. We propose a method for the considered cognitive relay network to determine the location of the source using the direct and the relayed signal at the destination. We derive the Cramer-Rao lower bound (CRLB) expressions separately for x and y coordinates of the location estimate. We analyze the effects of cognitive behaviour of the relay on the performance of the proposed method. We also discuss and quantify the reliability of the location estimate using the proposed technique if the source is not stationary. The overall performance of the proposed method is presented through simulations. ©2010 IEEE.
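    Although the paper analyzes a relay setting with CRLB expressions, the underlying RSS-ranging step can be illustrated simply: invert a log-distance path-loss model to get distances, then solve a linearized least-squares trilateration. The path-loss parameters below are illustrative assumptions.

```python
import math
import numpy as np

def rss_to_distance(rss, p0=-40.0, n=2.0):
    # log-distance path-loss model: rss = p0 - 10*n*log10(d)
    return 10 ** ((p0 - rss) / (10 * n))

def locate(anchors, distances):
    # linearized least squares: subtract the first anchor's circle equation
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return tuple(sol)
```

    With noiseless ranges from three non-collinear anchors the linearized solve recovers the source position exactly; RSS noise would spread the estimate, which is what the CRLB bounds.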

  3. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques.

    Science.gov (United States)

    Parkash, Om; Shueb, Rafidah Hanim

    2015-10-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  4. Demand Management Based on Model Predictive Control Techniques

    Directory of Open Access Journals (Sweden)

    Yasser A. Davizón

    2014-01-01

    Full Text Available Demand management (DM) is the process that helps companies to sell the right product to the right customer, at the right time, and for the right price. Therefore the challenge for any company is to determine how much to sell, at what price, and to which market segment while maximizing its profits. DM also helps managers efficiently allocate undifferentiated units of capacity to the available demand with the goal of maximizing revenue. This paper introduces a control system approach to demand management with dynamic pricing (DP) using the model predictive control (MPC) technique. In addition, we present a proper dynamical system analogy based on active suspension, and a stability analysis is provided via the Lyapunov direct method.
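    The receding-horizon principle behind MPC can be shown with a toy scalar system: at each step, search input sequences over a short horizon, apply only the first input, and repeat. The dynamics, cost weights, and candidate inputs below are illustrative assumptions, not the paper's pricing model.

```python
import itertools

def mpc_step(x, x_ref, horizon=3, candidates=(-1.0, 0.0, 1.0), a=0.9, b=0.5):
    # brute-force receding-horizon control of x[k+1] = a*x[k] + b*u[k]:
    # evaluate every input sequence over the horizon, return the first input
    # of the cheapest one (quadratic tracking cost plus small input penalty)
    best_cost, best_u0 = float("inf"), 0.0
    for seq in itertools.product(candidates, repeat=horizon):
        xi, cost = x, 0.0
        for u in seq:
            xi = a * xi + b * u
            cost += (xi - x_ref) ** 2 + 0.01 * u * u
        if cost < best_cost:
            best_cost, best_u0 = cost, seq[0]
    return best_u0
```

    Running the controller in closed loop drives the state to the reference; real MPC replaces the brute-force search with a structured optimization.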

  5. SAR IMAGE ENHANCEMENT BASED ON BEAM SHARPENING TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    LI Yong; ZHANG Kun-hui; ZHU Dai-yin; ZHU Zhao-da

    2004-01-01

    A major problem encountered in enhancing SAR images is the total loss of phase information and the unknown parameters of the imaging system. The beam sharpening technique, combined with synthetic aperture radiation pattern estimation, provides an approach to process this kind of data to achieve higher apparent resolution. Based on the criterion of minimizing the expected quadratic estimation error, an optimum FIR filter with a symmetrical structure is designed whose coefficients depend on the azimuth response of local isolated prominent points, because this response can be approximately regarded as the synthetic aperture radiation pattern of the imaging system. The point target simulation shows that the angular resolution is improved by a ratio of almost two to one. The processing results of a live SAR image demonstrate the validity of the method.
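    A symmetric FIR filter of the kind described, applied along the azimuth (row) direction, can be illustrated with fixed sharpening taps. The paper instead derives optimal coefficients from the estimated radiation pattern, which is not reproduced here; the taps below are illustrative.

```python
import numpy as np

def sharpen_rows(image, taps=(-0.25, 1.5, -0.25)):
    # symmetric FIR filter applied along each row; taps sum to 1 so flat
    # regions are preserved while edges are sharpened
    taps = np.asarray(taps)
    pad = len(taps) // 2
    out = np.empty_like(image, dtype=float)
    for r, row in enumerate(image):
        padded = np.pad(row.astype(float), pad, mode="edge")
        out[r] = np.convolve(padded, taps, mode="valid")
    return out
```

    On a step edge the filter produces the characteristic overshoot that raises apparent resolution.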

  6. A Novel Technique Based on Node Registration in MANETs

    Directory of Open Access Journals (Sweden)

    Rashid Jalal Qureshi

    2012-09-01

    Full Text Available In an ad hoc network, communication links between nodes are wireless, and each node acts as a router for other nodes, forwarding packets from one node to another. This type of network helps in solving challenges and problems that may arise in everyday communication. Mobile Ad Hoc Networks are a new field of research and are particularly useful in situations where network infrastructure is costly. Protecting MANETs from security threats is a challenging task because of the MANET's dynamic topology. Every node in a MANET is independent and is free to move in any direction, and therefore changes its connections to other nodes frequently. Due to this decentralized nature, different types of attacks can occur. The aim of this research paper is to investigate different MANET security attacks and propose a node-registration-based technique using cryptographic functions.
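    A node-registration check built from standard cryptographic functions can be sketched with an HMAC: a registration authority issues a token binding a node's identity to a shared network key, and peers verify the token before forwarding. This is a generic illustration, not the paper's exact protocol.

```python
import hashlib
import hmac

def register_node(network_key, node_id):
    # registration authority issues a token binding node_id to the network key
    return hmac.new(network_key, node_id.encode(), hashlib.sha256).hexdigest()

def verify_node(network_key, node_id, token):
    # peers recompute the HMAC and compare in constant time
    expected = hmac.new(network_key, node_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)
```

    A node presenting a token issued for a different identity fails verification, so unregistered nodes can be excluded from routing.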

  7. Applications of imaged capillary isoelectric focussing technique in development of biopharmaceutical glycoprotein-based products.

    Science.gov (United States)

    Anderson, Carrie L; Wang, Yang; Rustandi, Richard R

    2012-06-01

    CE-based methods have increasingly been applied to the analysis of a variety of different types of proteins. One of those techniques is imaged capillary isoelectric focusing (icIEF), a method that has been used extensively in the field of protein-based drug development as a tool for product identification, stability monitoring, and characterization. It offers many advantages over the traditional labor-intensive IEF slab gel method, and even over standard cIEF with on-line detection technologies, with regard to method development, reproducibility, robustness, and speed. Here, specific examples are provided for biopharmaceutical glycoprotein products such as mAbs, erythropoietin (EPO), and recombinant Fc-fusion proteins, though the technique can be adapted for many other therapeutic proteins. Applications of icIEF using a Convergent Bioscience instrument (Toronto, Canada) with whole-field imaging technology are presented and discussed. These include a quick method to establish an identity test for many protein-based products, product release, and stability evaluation of glycoproteins with respect to charge heterogeneity under accelerated temperature stress, different pH conditions, and in different formulations. Finally, characterization of glycoproteins using this icIEF technology is discussed with respect to biosimilar development, clone selection, and antigen binding. The data presented provide a "taste" of what the icIEF method can do to support the development of biopharmaceutical glycoprotein products, from early clone screening for better product candidates to characterization of the final commercial products. PMID:22736354

  8. An investigation of a video-based patient repositioning technique

    International Nuclear Information System (INIS)

    Purpose: We have investigated a video-based patient repositioning technique designed to use skin features for radiotherapy repositioning. We investigated the feasibility of the clinical application of this system by quantitative evaluation of performance characteristics of the methodology. Methods and Materials: Multiple regions of interest (ROI) were specified in the field of view of video cameras. We used a normalized correlation pattern-matching algorithm to compute the translations of each ROI pattern in a target image. These translations were compared against trial translations using a quadratic cost function for an optimization process in which the patient rotation and translational parameters were calculated. Results: A hierarchical search technique achieved high-speed (compute correlation for 128x128 ROI in 512x512 target image within 0.005 s) and subpixel spatial accuracy (as high as 0.2 pixel). By treating the observed translations as movements of points on the surfaces of a hypothetical cube, we were able to estimate accurately the actual translations and rotations of the test phantoms used in our experiments to less than 1 mm and 0.2 deg. with a standard deviation of 0.3 mm and 0.5 deg. respectively. For human volunteer cases, we estimated the translations and rotations to have an accuracy of 2 mm and 1.2 deg. Conclusion: A personal computer-based video system is suitable for routine patient setup of fractionated conformal radiotherapy. It is expected to achieve high-precision repositioning of the skin surface with high efficiency
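    The normalized correlation pattern matching at the heart of the method can be written directly as a brute-force search; the hierarchical speedup and subpixel refinement reported in the paper are omitted from this sketch.

```python
import numpy as np

def normalized_correlation(target, template):
    # slide the template over the target and score each position with the
    # zero-mean normalized correlation coefficient in [-1, 1]
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    best, best_pos = -2.0, None
    H, W = target.shape
    for i in range(H - th + 1):
        for j in range(W - tw + 1):
            patch = target[i:i + th, j:j + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum()) * tnorm
            if denom == 0:
                continue
            score = (p * t).sum() / denom
            if score > best:
                best, best_pos = score, (i, j)
    return best_pos, best
```

    The returned translation of each region of interest is what feeds the pose-optimization step described in the abstract.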

  9. Shellac and Aloe vera gel based surface coating for shelf life extension of tomatoes.

    Science.gov (United States)

    Chauhan, O P; Nanjappa, C; Ashok, N; Ravi, N; Roopa, N; Raju, P S

    2015-02-01

    Shellac (S) and Aloe vera gel (AG) were used to develop edible surface coatings for shelf-life extension of tomato fruits. The coating was prepared by dissolving de-waxed and bleached shellac in an alkaline aqueous medium, both alone and in combination with AG. Incorporation of AG in the shellac coating improved the permeability characteristics of the coating film towards oxygen, carbon dioxide and water vapours. The coatings, when applied to tomatoes, delayed senescence, which was characterized by restricted changes in respiration and ethylene synthesis rates during storage. Texture of the fruits, measured in terms of firmness, showed restricted changes as compared to untreated control. Similar observations were also recorded in the case of instrumental colour (L*, a* and b* values). The developed coatings extended the shelf-life of tomatoes by 10, 8 and 12 days in the case of shellac (S), AG and composite coating (S + AG) coated fruits, respectively, when kept at ambient storage conditions (28 ± 2 °C). PMID:25694740

  10. Quantum extension of European option pricing based on the Ornstein-Uhlenbeck process

    CERN Document Server

    Piotrowski, E W; Zambrzycka, A; Piotrowski, Edward W.; Schroeder, Malgorzata; Zambrzycka, Anna

    2005-01-01

    In this work we propose an option pricing model based on the Ornstein-Uhlenbeck process. It is a new look at the Black-Scholes formula, based on quantum game theory. We show the differences between the classical view, in which price changes follow a Wiener process, and pricing supported by a quantum model.
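    The Ornstein-Uhlenbeck process underlying the model can be simulated with its exact discretization; this sketch shows only the process itself and makes no attempt at the quantum-game pricing formula. All parameter values are illustrative.

```python
import math
import random

def simulate_ou(x0, theta, mu, sigma, dt, steps, seed=0):
    # exact discretization of dX = theta*(mu - X) dt + sigma dW:
    # X[t+dt] = X[t]*e^(-theta*dt) + mu*(1 - e^(-theta*dt)) + Gaussian noise
    rng = random.Random(seed)
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1 - a * a) / (2 * theta))
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] * a + mu * (1 - a) + sd * rng.gauss(0, 1))
    return xs
```

    Unlike the Wiener-driven price of Black-Scholes, the OU path is mean-reverting: its long-run mean is mu and its stationary standard deviation is sigma/sqrt(2*theta).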

  11. CANDU in-reactor quantitative visual-based inspection techniques

    Science.gov (United States)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is...

  12. Energy-Efficient Network Transmission between Satellite Swarms and Earth Stations Based on Lyapunov Optimization Techniques

    Directory of Open Access Journals (Sweden)

    Weiwei Fang

    2014-01-01

    Full Text Available The recent advent of satellite swarm technologies has enabled space exploration with a massive number of picoclass, low-power, and low-weight spacecraft. However, developing swarm-based satellite systems, from conceptualization to validation, is a complex multidisciplinary activity. One of the primary challenges is how to achieve energy-efficient data transmission between the satellite swarm and terrestrial terminal stations. Employing Lyapunov optimization techniques, we present an online control algorithm to optimally dispatch traffic load among different satellite-ground links for minimizing overall energy consumption over time. Our algorithm is able to independently and simultaneously make control decisions on traffic dispatching over intersatellite-links and up-down-links so as to offer provable energy and delay guarantees, without requiring any statistical information of traffic arrivals and link condition. Rigorous analysis and extensive simulations have demonstrated the performance and robustness of the proposed new algorithm.
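The drift-plus-penalty idea behind such Lyapunov-based controllers can be sketched in a few lines: in each slot, pick the link minimizing V·energy − queue·rate, trading energy against backlog without any statistics of arrivals or link conditions. The two-term rule, the three-link model, and all constants below are illustrative assumptions, not the paper's actual algorithm.

```python
import random

# Hypothetical drift-plus-penalty sketch: per-slot greedy choice that
# balances energy cost against queue backlog. All numbers are invented.

def dispatch(queue, links, V):
    """Pick the link minimizing V*energy - queue*rate for this slot."""
    return min(links, key=lambda l: V * l["energy"] - queue * l["rate"])

random.seed(1)
queue, total_energy = 0.0, 0.0
V = 10.0                      # trade-off knob: larger V favors energy savings
for slot in range(1000):
    arrivals = random.uniform(0, 2)            # traffic arriving this slot
    links = [{"rate": random.uniform(0, 3),    # time-varying link capacity
              "energy": random.uniform(0.5, 2.0)}
             for _ in range(3)]
    choice = dispatch(queue, links, V)
    served = min(queue + arrivals, choice["rate"])
    if served > 0:
        total_energy += choice["energy"]
    queue = queue + arrivals - served          # backlog carried to next slot
```

Raising V lowers energy spent at the price of a larger average backlog, which mirrors the provable energy/delay trade-off the abstract mentions.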

  13. Streaming Media over a Color Overlay Based on Forward Error Correction Technique

    Institute of Scientific and Technical Information of China (English)

    张晓瑜; 沈国斌; 李世鹏; 钟玉琢

    2004-01-01

The number of clients that receive high-quality streaming video from a source is greatly limited by application requirements, such as high bandwidth and reliability. In this work, a method was developed to construct a color overlay, which enables clients to receive data across multiple paths, based on the forward error correction technique. The color overlay enlarges system capacity by reducing bottlenecks and extending the bandwidth, improves reliability against node failure, and is more resilient to fluctuations of network metrics. A light-weight protocol for building the overlay is also presented. Extensive simulations were conducted and the results clearly support the claimed advantages.
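A minimal XOR-parity sketch illustrates the erasure-coding idea behind sending data over multiple overlay paths: one parity packet lets the receiver rebuild any single lost packet. The paper's system likely uses stronger codes (e.g. Reed-Solomon); the packet layout here is invented.

```python
# Minimal XOR-parity erasure-coding sketch (illustrative only).

def xor_bytes(blocks):
    """XOR equal-length byte strings together."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

def encode(packets):
    """Append one parity packet so any single loss is recoverable."""
    return packets + [xor_bytes(packets)]

def recover(survivors):
    """XOR of the k surviving packets rebuilds the single missing one."""
    return xor_bytes(survivors)

data = [b"abcd", b"efgh", b"ijkl"]
coded = encode(data)                      # sent over different overlay paths
survivors = coded[:1] + coded[2:]         # packet 1 lost in transit
assert recover(survivors) == data[1]      # receiver restores it from parity
```

Spreading the k+1 coded packets over differently "colored" paths means a single path failure costs at most one packet, which the parity then repairs.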

  14. An extension of the 'Malkus hypothesis' to the turbulent base flow of blunt sections

    Science.gov (United States)

    Vorus, William S.; Chen, Liyong

    1987-11-01

    An approximate theory for the mean turbulent near-wake of cylindrical bodies with blunt after edges is developed and implemented in terms of a linearized closed free-streamline theory of thin blunt-based symmetric sections. In the present application, the Malkus hypothesis leads to maximization of the rate of change of mean kinetic energy along the separation-cavity streamline. The results compare well with experimental measurements of mean base pressures and section drag, although the linearizing assumptions of section-cavity slenderness and base-pressure magnitude are not so well preserved in the calculated results.

  15. Physical, Chemical and Biochemical Modifications of Protein-Based Films and Coatings: An Extensive Review.

    Science.gov (United States)

    Zink, Joël; Wyrobnik, Tom; Prinz, Tobias; Schmid, Markus

    2016-01-01

    Protein-based films and coatings are an interesting alternative to traditional petroleum-based materials. However, their mechanical and barrier properties need to be enhanced in order to match those of the latter. Physical, chemical, and biochemical methods can be used for this purpose. The aim of this article is to provide an overview of the effects of various treatments on whey, soy, and wheat gluten protein-based films and coatings. These three protein sources have been chosen since they are among the most abundantly used and are well described in the literature. Similar behavior might be expected for other protein sources. Most of the modifications are still not fully understood at a fundamental level, but all the methods discussed change the properties of the proteins and resulting products. Mastering these modifications is an important step towards the industrial implementation of protein-based films. PMID:27563881

  16. Automata-Based Programming Technology Extension for Generation of JML Annotated Java Card Code

    OpenAIRE

    Andrey, A.

    2008-01-01

    This paper gives an overview of the ongoing research project which concerns generation of dependable Java Card code. According to the automata-based programming technology, code is generated from a high-level application behavior description which is based on finite state machines. An extra benefit from the use of such description is the possibility of generation of formal application specification in Java Modeling Language. Conformance of the code against its specification could be checked b...

  17. Parameter tuning of PVD process based on artificial intelligence technique

    Science.gov (United States)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Due to its previous adaptation in similar optimization problems, a genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from the GA is expected to produce the desirable zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations, and the results of the computational experiment were compared with the actual results from the laboratory experiment. Based on this comparison, GA proved reliable for optimizing the parameter combination before tuning the RF magnetron sputtering machine. To verify the result, GA was also compared with other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that GA was reliable in solving this RF magnetron sputtering parameter tuning problem and achieved better accuracy in the optimization based on the fitness evaluation.
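A GA for a three-parameter tuning problem of this shape can be sketched as follows. The parameter bounds and the synthetic fitness surface are invented for illustration; the paper evaluates fitness against experimental film-quality data instead.

```python
import random

# Hedged GA sketch for tuning three sputtering parameters (RF power,
# deposition time, substrate temperature). Bounds and optimum are assumed.

random.seed(0)
BOUNDS = [(100, 400), (30, 120), (25, 400)]   # W, min, deg C (assumed ranges)
TARGET = (300, 60, 200)                       # made-up optimum

def fitness(ind):
    # Negative normalized squared distance to the synthetic optimum.
    return -sum(((x - t) / (hi - lo)) ** 2
                for x, t, (lo, hi) in zip(ind, TARGET, BOUNDS))

def random_ind():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]  # uniform crossover

def mutate(ind, rate=0.2):
    return [min(hi, max(lo, x + random.gauss(0, 0.05 * (hi - lo))))
            if random.random() < rate else x
            for x, (lo, hi) in zip(ind, BOUNDS)]

pop = [random_ind() for _ in range(30)]
for gen in range(60):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                          # elitist selection
    pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                   for _ in range(20)]
best = max(pop, key=fitness)
```

With elitist selection, uniform crossover and bounded Gaussian mutation, the population drifts toward the synthetic optimum; swapping in a measured film-quality score as `fitness` gives the experimental workflow the abstract describes.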

  18. Electron tomography based on a total variation minimization reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Goris, B., E-mail: bart.goris@ua.ac.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van den Broek, W. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Batenburg, K.J. [Centrum Wiskunde and Informatica, Science Park 123, NL-1098XG Amsterdam (Netherlands); Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Heidari Mezerji, H.; Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2012-02-15

The 3D reconstruction of a tilt series for electron tomography is mostly carried out using the weighted backprojection (WBP) algorithm or using one of the iterative algorithms such as the simultaneous iterative reconstruction technique (SIRT). However, it is known that these reconstruction algorithms cannot compensate for the missing wedge. Here, we apply a new reconstruction algorithm for electron tomography, which is based on compressive sensing. This is a field in image processing specialized in finding a sparse solution or a solution with a sparse gradient to a set of ill-posed linear equations. Therefore, it can be applied to electron tomography where the reconstructed objects often have a sparse gradient at the nanoscale. Using a combination of different simulated and experimental datasets, it is shown that missing wedge artefacts are reduced in the final reconstruction. Moreover, it seems that the reconstructed datasets have a higher fidelity and are easier to segment in comparison to reconstructions obtained by more conventional iterative algorithms. -- Highlights: ► A reconstruction algorithm for electron tomography is investigated based on total variation minimization. ► Missing wedge artefacts are reduced by this algorithm. ► The reconstruction is easier to segment. ► More reliable quantitative information can be obtained.
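The total-variation objective at the heart of such methods can be shown on a toy 1-D signal: minimize a data-fidelity term plus λ times the sum of absolute jumps, which flattens noise while preserving edges. This is only an illustration of the TV idea; the paper solves the full 3-D tomographic problem with proper compressive-sensing solvers.

```python
# Toy 1-D total-variation denoising sketch via gradient descent on a
# smoothed |.| term. Signal, lambda and step sizes are illustrative.

def tv_denoise(y, lam=0.1, steps=500, lr=0.01, eps=1e-8):
    """Minimize 0.5*||x - y||^2 + lam * sum |x[i+1] - x[i]|."""
    x = list(y)
    for _ in range(steps):
        g = [xi - yi for xi, yi in zip(x, y)]    # data-fidelity gradient
        for i in range(len(x) - 1):
            d = x[i + 1] - x[i]
            s = d / (abs(d) + eps)               # smoothed sign of the jump
            g[i] -= lam * s
            g[i + 1] += lam * s
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

noisy = [0.1, -0.05, 0.02, 1.1, 0.95, 1.05]      # step edge plus noise
clean = tv_denoise(noisy)
```

The small fluctuations on each plateau are ironed out while the large step between samples 2 and 3 (the "edge") survives, which is exactly why TV regularization suits objects with sparse gradients.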

  19. Crack identification based on synthetic artificial intelligent technique

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Mun Bo; Suh, Myung Won [Sungkyunkwan Univ., Suwon (Korea, Republic of)

    2001-07-01

It has been established that a crack has an important effect on the dynamic behavior of a structure. This effect depends mainly on the location and depth of the crack. To identify the location and depth of a crack in a structure, a method is presented in this paper which uses a synthetic artificial intelligence technique: an Adaptive-Network-based Fuzzy Inference System (ANFIS), solved via a hybrid learning algorithm (the back-propagation gradient descent and the least-squares method), is used to learn the input (the location and depth of a crack)-output (the structural eigenfrequencies) relation of the structural system. With this ANFIS and a Continuous Evolutionary Algorithm (CEA), it is possible to formulate the inverse problem. CEAs based on genetic algorithms work efficiently for continuous search space optimization problems like a parameter identification problem. With the ANFIS as a forward model, CEAs are used to identify the crack location and depth by minimizing the difference from the measured frequencies. We have tried this new idea on a simple beam structure and the results are promising.
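The inverse step can be sketched as an evolutionary search that recovers crack location and depth by matching model eigenfrequencies to "measured" ones. The closed-form forward model below is an invented stand-in for the trained ANFIS surrogate; none of the numbers come from the paper.

```python
import math
import random

# Toy crack-identification sketch: evolutionary search over (location,
# depth) in [0,1]^2, minimizing the squared frequency mismatch.

random.seed(3)

def forward(loc, depth):
    # Hypothetical frequency drop of the first three bending modes.
    return [f0 * (1 - 0.2 * depth * math.sin(math.pi * n * loc) ** 2)
            for n, f0 in ((1, 50.0), (2, 140.0), (3, 270.0))]

def error(ind, measured):
    return sum((a - b) ** 2 for a, b in zip(forward(*ind), measured))

measured = forward(0.3, 0.4)      # pretend these came from the test beam

def clip(v):
    return min(1.0, max(0.0, v))

pop = [(random.random(), random.random()) for _ in range(40)]
for _ in range(80):
    pop.sort(key=lambda ind: error(ind, measured))
    parents = pop[:10]            # keep the best, mutate them
    pop = parents + [(clip(p[0] + random.gauss(0, 0.05)),
                      clip(p[1] + random.gauss(0, 0.05)))
                     for p in parents for _ in range(3)]
loc, depth = min(pop, key=lambda ind: error(ind, measured))
# A symmetric beam cannot distinguish loc from 1 - loc; both are valid.
```

Note the built-in ambiguity: for a symmetric beam, mirrored crack locations produce identical frequency shifts, so the search may land on either solution.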

  20. Structural design systems using knowledge-based techniques

    International Nuclear Information System (INIS)

    Engineering information management and the corresponding information systems are of a strategic importance for industrial enterprises. This thesis treats the interdisciplinary field of designing computing systems for structural design and analysis using knowledge-based techniques. Specific conceptual models have been designed for representing the structure and the process of objects and activities in a structural design and analysis domain. In this thesis, it is shown how domain knowledge can be structured along several classification principles in order to reduce complexity and increase flexibility. By increasing the conceptual level of the problem description and representation of the domain knowledge in a declarative form, it is possible to enhance the development, maintenance and use of software for mechanical engineering. This will result in a corresponding increase of the efficiency of the mechanical engineering design process. These ideas together with the rule-based control point out the leverage of declarative knowledge representation within this domain. Used appropriately, a declarative knowledge representation preserves information better, is more problem-oriented and change-tolerant than procedural representations. 74 refs

  1. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    Science.gov (United States)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on the utilization of an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a properly energized surface sample. The study investigated the possibility of applying HSI techniques for the classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near infrared field (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics of wheat.
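The effective-wavelength selection step can be illustrated by ranking spectral bands with a Fisher-style ratio of between-class to within-class variance. The paper selected wavelengths via PCA loadings; this simpler criterion, and the synthetic three-class kernel spectra below, are purely illustrative.

```python
import random

# Hedged band-selection sketch on synthetic NIR spectra of three kernel
# classes; the fusarium class is made to differ strongly above 1400 nm.

random.seed(7)
bands = list(range(1000, 1700, 50))     # coarse stand-in for the 121 bands

def spectrum(kind):
    base = {"vitreous": 0.50, "yellow_berry": 0.48, "fusarium": 0.46}[kind]
    return [base + (0.30 if kind == "fusarium" and wl > 1400 else 0.0)
            + random.gauss(0, 0.02) for wl in bands]

samples = [(k, spectrum(k))
           for k in ("vitreous", "yellow_berry", "fusarium")
           for _ in range(20)]

def fisher_score(i):
    """Between-class over within-class variance of reflectance at band i."""
    by_class = {}
    for k, s in samples:
        by_class.setdefault(k, []).append(s[i])
    means = {k: sum(v) / len(v) for k, v in by_class.items()}
    grand = sum(means.values()) / len(means)
    between = sum((m - grand) ** 2 for m in means.values())
    within = sum(sum((x - means[k]) ** 2 for x in v) / len(v)
                 for k, v in by_class.items())
    return between / (within + 1e-12)

ranked = sorted(range(len(bands)), key=fisher_score, reverse=True)
# Top-ranked bands sit where the classes differ most (here, above 1400 nm).
```

A downstream classifier (PLS-DA in the paper) then uses only the few top-ranked wavelengths instead of the full hypercube.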

  2. Positron emission tomography, physical bases and comparison with other techniques

    International Nuclear Information System (INIS)

    Positron emission tomography (PET) is a medical imaging technique that measures the three-dimensional distribution of molecules marked by a positron-emitting particle. PET has grown significantly in clinical fields, particularly in oncology for diagnosis and therapeutic follow-up purposes. The technical evolution of this technique is fast; among the technical improvements is the coupling of the PET scan with computed tomography (CT). PET is obtained by intravenous injection of a radioactive tracer. The marker is usually fluorine (18F) embedded in a glucose molecule, forming 18-fluorodeoxyglucose (FDG-18). This tracer, similar to glucose, binds to tissues that consume large quantities of the sugar, such as cancerous tissue, cardiac muscle or brain. Detection using scintillation crystals (BGO, LSO, LYSO) suitable for high energy (511 keV) registers the gamma photons originating from the annihilation of a positron with an electron. The electronics of detection, or coincidence circuit, is based on two criteria: a time window, of about 6 to 15 ns, and an energy window. This system measures the true coincidences that correspond to the detection of two 511 keV photons from the same annihilation. Most PET devices are constituted by a series of elementary detectors distributed annularly around the patient. Each detector comprises a scintillation crystal matrix coupled to a finite number (4 or 6) of photomultipliers. The coincidence circuit determines the projection line of the annihilation by means of two elementary detectors. The processing of such information must be extremely fast, considering the count rates encountered in practice. The information measured by the coincidence circuit is then positioned in a matrix, or sinogram, which contains a set of elements of a projection section of the object. Images are obtained by tomographic reconstruction on powerful computer stations equipped with software tools allowing the analysis and
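The two-criterion coincidence logic described above (a timing window of a few nanoseconds plus an energy window around 511 keV) can be sketched as a simple filter over detector hits; the event tuples and thresholds below are illustrative values, not those of any specific scanner.

```python
# Hedged sketch of PET coincidence sorting: keep hit pairs from different
# detectors that fall inside both the timing and the energy windows.

TIME_WINDOW_NS = 10.0
E_MIN, E_MAX = 425.0, 650.0     # keV acceptance around the 511 keV photopeak

def coincidences(hits):
    """hits: list of (time_ns, energy_keV, detector_id), sorted by time."""
    pairs = []
    for i, (t1, e1, d1) in enumerate(hits):
        for t2, e2, d2 in hits[i + 1:]:
            if t2 - t1 > TIME_WINDOW_NS:
                break             # later hits are even further away in time
            if d1 != d2 and E_MIN <= e1 <= E_MAX and E_MIN <= e2 <= E_MAX:
                pairs.append(((t1, d1), (t2, d2)))
    return pairs

events = [(0.0, 511.0, 3), (4.0, 500.0, 17),   # true coincidence
          (40.0, 170.0, 5),                    # scattered single, rejected
          (90.0, 520.0, 8)]                    # lone single, no partner
print(len(coincidences(events)))               # → 1
```

Each accepted pair defines the line of response joining the two detectors, which is then accumulated into the sinogram for reconstruction.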

  3. Image-based Virtual Exhibit and Its Extension to 3D

    Institute of Scientific and Technical Information of China (English)

    Ming-Min Zhang; Zhi-Geng Pan; Li-Feng Ren; Peng Wang

    2007-01-01

In this paper we introduce an image-based virtual exhibition system, especially for clothing products. It provides a powerful material substitution function, which is very useful for customized clothing. A novel color substitution algorithm and two texture morphing methods are designed to ensure realistic substitution results. To extend it to 3D, we need to do model reconstruction based on photos. Thus we present an improved method for modeling the human body. It deforms a generic model with shape details extracted from pictures to generate a new model. Our method begins with model image generation, followed by silhouette extraction and segmentation. Then it builds a mapping between pixels inside every pair of silhouette segments in the model image and in the picture. Our mapping algorithm is based on a slice space representation that conforms to the natural features of the human body.
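One plausible reading of such a material-substitution step, sketched here with Python's stdlib `colorsys`: re-map each pixel's hue to the target fabric color while keeping its saturation and value, so the original shading survives. The paper's actual color substitution algorithm is novel and is not reproduced here.

```python
import colorsys

# Illustrative hue-substitution sketch (not the paper's algorithm): swap a
# garment's hue while preserving per-pixel shading via the S and V channels.

def substitute(pixels, target_hue):
    out = []
    for r, g, b in pixels:                       # floats in [0, 1]
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        out.append(colorsys.hsv_to_rgb(target_hue, s, v))  # keep s, v
    return out

blue_cloth = [(0.1, 0.2, 0.8), (0.05, 0.1, 0.5)]   # two sample pixels
red_cloth = substitute(blue_cloth, target_hue=0.0) # hue 0.0 = red
```

Because shading lives in the saturation/value channels, folds and highlights of the photographed garment remain visible after the color swap.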

  4. Whole Genome Sequencing Based Characterization of Extensively Drug-Resistant Mycobacterium tuberculosis Isolates from Pakistan

    KAUST Repository

    Ali, Asho

    2015-02-26

Improved molecular diagnostic methods for detecting drug resistance in Mycobacterium tuberculosis (MTB) strains are required. Resistance to first- and second-line anti-tuberculous drugs has been associated with single nucleotide polymorphisms (SNPs) in particular genes. However, these SNPs can vary between MTB lineages, therefore local data are required to describe different strain populations. We used whole genome sequencing (WGS) to characterize 37 extensively drug-resistant (XDR) MTB isolates from Pakistan and investigated 40 genes associated with drug resistance. Rifampicin resistance was attributable to SNPs in the rpoB hot-spot region. Isoniazid resistance was most commonly associated with the katG codon 315 (92%) mutation, followed by inhA S94A (8%); however, one strain did not have SNPs in katG, inhA or oxyR-ahpC. All strains were pyrazinamide resistant but only 43% had pncA SNPs. Ethambutol resistant strains predominantly had embB codon 306 (62%) mutations, but additional SNPs at embB codons 406, 378 and 328 were also present. Fluoroquinolone resistance was associated with gyrA codons 91-94 in 81% of strains; four strains had only gyrB mutations, while others did not have SNPs in either gyrA or gyrB. Streptomycin resistant strains had mutations in ribosomal RNA genes: rpsL codon 43 (42%), the rrs 500 region (16%), and gidB (34%), while six strains did not have mutations in any of these genes. Amikacin/kanamycin/capreomycin resistance was associated with SNPs in rrs at nt1401 (78%) and nt1484 (3%), except in seven (19%) strains. We estimate that if only the common hot-spot region targets of current commercial assays were used, the concordance between phenotypic and genotypic testing for these XDR strains would vary between rifampicin (100%), isoniazid (92%), fluoroquinolones (81%), aminoglycosides (78%) and ethambutol (62%); while pncA sequencing would provide genotypic resistance in less than half the isolates. This work highlights the importance of expanded

  5. Security extension for the Canetti-Krawczyk model in identity-based systems

    Institute of Scientific and Technical Information of China (English)

    LI Xinghua; MA Jianfeng; SangJae Moon

    2005-01-01

The Canetti-Krawczyk (CK) model is a formalism for the analysis of key-exchange protocols, which can guarantee many security properties for the protocols proved secure by this model. But we find this model lacks the ability to guarantee key generation center (KGC) forward secrecy, which is an important security property for identity-based key-agreement protocols. The essential reason leading to this weakness is that it does not fully consider the attacker's capabilities. In this paper, the CK model is accordingly extended with a new additional attacker capability, KGC corruption in identity-based systems, which enables it to support KGC forward secrecy.

  6. Extension of information entropy-based measures in incomplete information systems

    Institute of Scientific and Technical Information of China (English)

    LI Ren-pu; HUANG Dao; GAO Mao-ting

    2005-01-01

It is helpful for people to understand the essence of rough set theory to study its concepts and operations from the information view. In this paper we address knowledge expression and knowledge reduction in incomplete information systems from the information view of rough set theory. First, by extending information entropy-based measures in complete information systems, two new measures of incomplete entropy and incomplete conditional entropy are presented for incomplete information systems. Then, based on these measures, the problem of knowledge reduction in incomplete information systems is analyzed and reduct definitions for the incomplete information system and the incomplete decision table are proposed respectively. Finally, the reduct definitions based on incomplete entropy and those based on the similarity relation are compared: two equivalent relationships between them are proved by theorems and a non-equivalent relationship between them is illustrated by an example. This work extends the research of rough set theory from the information view to incomplete information systems and establishes the theoretical basis for seeking efficient algorithms of knowledge acquisition in incomplete information systems.
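An entropy measure on an incomplete information system can be sketched via tolerance classes: two objects are tolerant when they agree on every attribute where neither value is missing (`None` plays the role of the unknown value `*`). The entropy form below is one common formulation in the rough-set literature and is only illustrative of the extension discussed above, not necessarily the paper's exact definition.

```python
from math import log2

# Sketch: entropy over tolerance classes in an incomplete decision table.

def tolerant(x, y):
    """Objects tolerate each other if known attribute values all agree."""
    return all(a is None or b is None or a == b for a, b in zip(x, y))

def incomplete_entropy(table):
    n = len(table)
    total = 0.0
    for x in table:
        cls = sum(tolerant(x, y) for y in table)  # size of tolerance class
        total -= (1 / n) * log2(cls / n)
    return total

U = [("high", "yes"), ("high", None), ("low", "no"), (None, "no")]
H = incomplete_entropy(U)
```

Smaller tolerance classes (more distinguishable objects) give higher entropy, so dropping an attribute that enlarges the classes lowers the entropy, which is the lever the reduct definitions use.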

  7. Presentation of Medical Knowledge by Arden Syntax with Extensions Based on Fuzzy Set Theory

    OpenAIRE

    Tiffe, Sven; Adlassnig, Klaus-Peter

    2001-01-01

    The Arden Syntax for Medical Logic Systems (Arden) is an approach to present medical knowledge by defining rules that include crisp decision logic. Clinical rules based on natural language usually contain uncertainty by way of vagueness which cannot be handled by Arden. Uncertain knowledge can be presented by extending Arden to include concepts of fuzzy set theory.

  8. Depth-based coding of MVD data for 3D video extension of H.264/AVC

    Science.gov (United States)

    Rusanovskyy, Dmytro; Hannuksela, Miska M.; Su, Wenyi

    2013-06-01

This paper describes a novel approach of using depth information for advanced coding of the associated video data in Multiview Video plus Depth (MVD)-based 3D video systems. As a possible implementation of this conception, we describe two coding tools that were developed for the H.264/AVC based 3D video codec in response to the Moving Picture Experts Group (MPEG) Call for Proposals (CfP). These tools are Depth-based Motion Vector Prediction (DMVP) and Backward View Synthesis Prediction (BVSP). Simulation results conducted under JCT-3V/MPEG 3DV Common Test Conditions show that the tools proposed in this paper reduce the bit rate of the coded video data by 15% average delta bit rate reduction, which results in 13% total bit rate savings for the MVD data over the state-of-the-art MVC+D coding. Moreover, the conception of depth-based video coding presented in this paper has been further developed by MPEG 3DV and JCT-3V, and this work resulted in even higher compression efficiency, bringing about 20% total delta bit rate reduction for coded MVD data over the reference MVC+D coding. Considering the significant gains, the coding approach proposed in this paper can be beneficial for the development of new 3D video coding standards.

  9. Extensive simulation studies on the reconstructed image resolution of a position sensitive detector based on pixelated CdTe crystals

    CERN Document Server

    Zachariadou, K; Kaissas, I; Seferlis, S; Lambropoulos, C; Loukas, D; Potiriadis, C

    2011-01-01

We present results on the reconstructed image resolution of a position sensitive radiation instrument (COCAE) based on extensive simulation studies. The reconstructed image resolution has been investigated over a wide range of incident photon energies emitted by point-like sources located at different source-to-detector distances, on and off the detector's symmetry axis. The ability of the detector to distinguish multiple radioactive sources observed simultaneously is investigated by simulating point-like sources of different energies located on and off the detector's symmetry axis and at different positions.

  10. Ground-based intercomparison of two isoprene measurement techniques

    Directory of Open Access Journals (Sweden)

    E. Leibrock

    2003-01-01

    Full Text Available An informal intercomparison of two isoprene (C5H8 measurement techniques was carried out during Fall of 1998 at a field site located approximately 3 km west of Boulder, Colorado, USA. A new chemical ionization mass spectrometric technique (CIMS was compared to a well-established gas chromatographic technique (GC. The CIMS technique utilized benzene cation chemistry to ionize isoprene. The isoprene levels measured by the CIMS were often larger than those obtained with the GC. The results indicate that the CIMS technique suffered from an anthropogenic interference associated with air masses from the Denver, CO metropolitan area as well as an additional interference occurring in clean conditions. However, the CIMS technique is also demonstrated to be sensitive and fast. Especially after introduction of a tandem mass spectrometric technique, it is therefore a candidate for isoprene measurements in remote environments near isoprene sources.

  11. Application of rule-based data mining techniques to real time ATLAS Grid job monitoring data

    Science.gov (United States)

    Ahrens, R.; Harenberg, T.; Kalinin, S.; Mättig, P.; Sandhoff, M.; dos Santos, T.; Volkmer, F.

    2012-12-01

The Job Execution Monitor (JEM) is a job-centric grid job monitoring software developed at the University of Wuppertal and integrated into the pilot-based PanDA job brokerage system leveraging physics analysis and Monte Carlo event production for the ATLAS experiment on the Worldwide LHC Computing Grid (WLCG). With JEM, job progress and grid worker node health can be supervised in real time by users, site admins and shift personnel. Imminent error conditions can be detected early and countermeasures can be initiated by the job's owner immediately. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job and grid worker node misbehavior. Shifters can use the same aggregated data to quickly react to site error conditions and broken production tasks. In this work, the application of novel data-centric rule-based methods and data-mining techniques to the real-time monitoring data is discussed. The usage of such automatic inference techniques on monitoring data to provide job and site health summary information to users and admins is presented. Finally, the provision of a secure real-time control and steering channel to the job, as an extension of the presented monitoring software, is considered and a possible model of such a control method is presented.
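The rule-based inference on aggregated monitoring data can be sketched as an ordered list of predicates over job metrics, the first matching rule giving the diagnosis. The metric names and thresholds below are invented for illustration and are not taken from JEM.

```python
# Hypothetical first-match rule engine over per-node job metrics.

RULES = [
    ("disk_full",     lambda m: m["scratch_free_mb"] < 100),
    ("stalled_job",   lambda m: m["cpu_load"] < 0.05 and m["wall_min"] > 60),
    ("memory_thrash", lambda m: m["swap_rate"] > 1000),
    ("healthy",       lambda m: True),               # default fallback
]

def diagnose(metrics):
    """Return the name of the first rule whose predicate fires."""
    return next(name for name, pred in RULES if pred(metrics))

node = {"scratch_free_mb": 42, "cpu_load": 0.9, "wall_min": 15, "swap_rate": 3}
print(diagnose(node))   # → disk_full
```

Keeping the rules as data rather than code makes it easy for shifters or site admins to add new error signatures without touching the monitoring core.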

  12. Application of rule-based data mining techniques to real time ATLAS Grid job monitoring data

    International Nuclear Information System (INIS)

    The Job Execution Monitor (JEM) is a job-centric grid job monitoring software developed at the University of Wuppertal and integrated into the pilot-based PanDA job brokerage system leveraging physics analysis and Monte Carlo event production for the ATLAS experiment on the Worldwide LHC Computing Grid (WLCG). With JEM, job progress and grid worker node health can be supervised in real time by users, site admins and shift personnel. Imminent error conditions can be detected early and countermeasures can be initiated by the job's owner immediately. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job and grid worker node misbehavior. Shifters can use the same aggregated data to quickly react to site error conditions and broken production tasks. In this work, the application of novel data-centric rule-based methods and data-mining techniques to the real-time monitoring data is discussed. The usage of such automatic inference techniques on monitoring data to provide job and site health summary information to users and admins is presented. Finally, the provision of a secure real-time control and steering channel to the job, as an extension of the presented monitoring software, is considered and a possible model of such a control method is presented.

  13. Extension VIKOR for Priority Orders Based on Three Parameters Interval Fuzzy Number

    Directory of Open Access Journals (Sweden)

    Qian Zhang

    2013-05-01

    Full Text Available In this study, an improved VIKOR method is presented to deal with multi-attribute decision-making based on three parameters interval fuzzy numbers, where the attribute weights are unknown but an alternative priority of object preference is given. A new non-linear reward-and-punishment method on the positive interval is proposed to normalize the attributes; two methods, information coverage reliability and relative superiority degree, are used to compare and sort the three parameters interval fuzzy numbers (TPIFN); a quadratic programming model based on contribution is constructed to obtain the attribute weights; and, after defining the information entropy distance between TPIFN, the optimum object order is obtained by VIKOR. A numerical example is provided to demonstrate the feasibility and validity.
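The VIKOR ranking core can be shown on plain numbers: compute the group utility S, the individual regret R, and the compromise index Q for each alternative, smaller Q being better. This crisp version only illustrates the ranking step; the paper works on three-parameter interval fuzzy numbers with its own comparison and weighting machinery.

```python
# Crisp VIKOR sketch: S (group utility), R (individual regret), Q index.
# All criteria are assumed to be benefit-type; the data are illustrative.

def vikor(matrix, weights, v=0.5):
    best = [max(col) for col in zip(*matrix)]
    worst = [min(col) for col in zip(*matrix)]
    S, R = [], []
    for row in matrix:
        terms = [w * (b - x) / (b - wo)
                 for x, w, b, wo in zip(row, weights, best, worst)]
        S.append(sum(terms))        # weighted total distance to the ideal
        R.append(max(terms))        # worst single-criterion shortfall
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    return [v * (s - s_min) / (s_max - s_min)
            + (1 - v) * (r - r_min) / (r_max - r_min)
            for s, r in zip(S, R)]

alts = [[7, 5, 8], [6, 9, 6], [8, 4, 7]]       # 3 alternatives, 3 criteria
Q = vikor(alts, weights=[0.4, 0.3, 0.3])
ranking = sorted(range(len(alts)), key=lambda i: Q[i])  # smaller Q is better
```

The weight `v` balances group utility against individual regret; replacing the crisp entries with TPIFN comparisons and the entropy distance gives the extension the abstract describes.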

  14. Extension of direct displacement-based design methodology for bridges to account for higher mode effects

    OpenAIRE

    Kappos, A. J.; Gkatzogias, K.I.; Gidaris, I.G.

    2013-01-01

    An improvement is suggested to the direct displacement-based design (DDBD) procedure for bridges to account for higher mode effects, the key idea being not only the proper prediction of a target-displacement profile through the effective mode shape (EMS) method (wherein all significant modes are considered), but also the proper definition of the corresponding peak structural response. The proposed methodology is then applied to an actual concrete bridge wherein the different pier heights and ...

  15. A GIS extension model to calculate urban heat island intensity based on urban geometry

    OpenAIRE

    Nakata, C. M.; Souza, Léa Cristina Lucas; Rodrigues, Daniel Souto

    2015-01-01

This paper presents a simulation model, which was incorporated into a Geographic Information System (GIS), in order to calculate the maximum intensity of urban heat islands based on urban geometry data. The methodology of this study stands on a theoretical-numerical basis (Oke's model), followed by the study and selection of existing GIS tools, the design of the calculation model, the incorporation of the resulting algorithm into the GIS platform and the application of the tool, developed ...
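The kind of per-street-segment computation such a GIS extension performs can be sketched with the widely cited form of Oke's (1981) canyon-geometry relation, which ties the maximum heat-island intensity to the building-height to street-width ratio H/W; whether the paper uses exactly this form of the model is not stated in the abstract.

```python
import math

# Oke's (1981) empirical relation for the maximum urban-rural temperature
# difference as a function of canyon geometry (commonly cited form).

def uhi_max(height_m, width_m):
    """Maximum heat-island intensity (deg C) for an H/W street canyon."""
    return 7.54 + 3.97 * math.log(height_m / width_m)

print(round(uhi_max(20, 20), 2))   # H/W = 1 -> ln term vanishes -> 7.54
```

Applied per segment over a city model, deeper canyons (larger H/W) yield higher predicted intensities, which is what the GIS tool maps.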

  16. SIP Extension and Implementation of Multimedia Communication System Based on SIP

    OpenAIRE

    Lin Hai-Yun; Wang Yu-Jiao

    2013-01-01

This system adopts SIP as the control signaling protocol of multimedia communication, combining the demands of multimedia communication and realizing a SIP-based multimedia communication system. The system is constructed on a PC platform; the server is developed on an open-source project core; the Windows terminal signaling modules use the GNU oSIP stack; media stream transmission over RTP uses the oRTP stack. Aiming at the support limitation of present multimedia ...

  17. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings

    OpenAIRE

    Hemant Ghayvat; Subhas Mukhopadhyay; Xiang Gui; Nagender Suryadevara

    2015-01-01

    Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home...

  18. Signaling pathway-based identification of extensive prognostic gene signatures for lung adenocarcinoma

    OpenAIRE

    Wan, Ying-Wooi; Beer, David G.; Guo, Nancy Lan

    2011-01-01

    Tumor recurrence is the major cause of death in lung cancer treatment. To date, there is no clinically applied gene expression-based model to predict the risk for tumor recurrence in non-small cell lung cancer (NSCLC). We sought to embed crosstalk with major signaling pathways into biomarker identification. Three approaches were used to identify prognostic gene signatures from 442 lung adenocarcinoma samples. Candidate genes co-expressed with 6 or 7 major NSCLC signaling hallmarks were identi...

  19. An Extensible Agent Architecture for a Competitive Market-Based Allocation of Consumer Attention Space

    OpenAIRE

    Hoen, 't, Pieter Jan; Bohte, Sander; Gerding, Enrico; La Poutré, Han

    2002-01-01

A competitive distributed recommendation mechanism is introduced based on adaptive software agents for efficiently allocating the "customer attention space", or banners. In the example of an electronic shopping mall, the task of correctly profiling and analyzing the customers is delegated to the individual shops that operate in a distributed, remote fashion. The evaluation and classification of customers for the bidding on banners is not handled by a central agency as is customary, but is a...

  20. Carbon Storage in an Extensive Karst-distributed Region of Southwestern China based on Multiple Methods

    Science.gov (United States)

    Guo, C.; Wu, Y.; Yang, H.; Ni, J.

    2015-12-01

Accurate estimation of carbon storage is crucial to better understand the processes of global and regional carbon cycles and to more precisely project ecological and economic scenarios for the future. Southwestern China has a broad and continuous distribution of karst landscapes with harsh and fragile habitats, which can lead to rocky desertification, an ecological disaster that has significantly hindered vegetation succession and economic development in the karst regions of southwestern China. In this study we evaluated the carbon storage in eight political divisions of southwestern China based on four methods: forest inventory, carbon density based on field investigations, the CASA model driven by remote sensing data, and the BIOME4/LPJ global vegetation models driven by climate data. The results show that: (1) The total vegetation carbon storage (including the agricultural ecosystem) is 6763.97 Tg C based on carbon density, and the soil organic carbon (SOC) storage (above 20 cm depth) is 12475.72 Tg C. Sichuan Province (including Chongqing) possesses the highest carbon storage in both vegetation and soil (1736.47 Tg C and 4056.56 Tg C, respectively) among the eight political divisions because of its higher carbon density and larger distribution area. The vegetation carbon storage in Hunan Province is the smallest (565.30 Tg C), and the smallest SOC storage (1127.40 Tg C) is in Guangdong Province; (2) Based on forest inventory data, the total aboveground carbon storage in woody vegetation is 2103.29 Tg C. The carbon storage in Yunnan Province (819.01 Tg C) is significantly higher than in other areas, while the tropical rainforests and seasonal forests in Yunnan contribute the most to the woody vegetation carbon storage (accounting for 62.40% of the total). (3) The net primary production (NPP) simulated by the CASA model is 68.57 Tg C/yr, while the forest NPP in the non-karst region (accounting for 72.50% of the total) is higher than that in the karst region. (4) BIOME4 and LPJ

  1. Development of a Flexible and Extensible Computer-based Simulation Platform for Healthcare Students.

    Science.gov (United States)

    Bindoff, Ivan; Cummings, Elizabeth; Ling, Tristan; Chalmers, Leanne; Bereznicki, Luke

    2015-01-01

Accessing appropriate clinical placement positions for all health profession students can be expensive and challenging. Increasingly, simulation in a range of modes is being used to enhance student learning and prepare students for clinical placement. Commonly these simulations focus on simulated patient mannequins, which are typically presented as single-event scenarios, are difficult to organise, and usually include only a single healthcare profession. Computer-based simulation is relatively under-researched and under-utilised but is beginning to demonstrate potential benefits. This paper describes the development and trialling of an entirely virtual 3D simulated environment for inter-professional student education. PMID:25676952

  2. Extensive validation of the code FUROM based on the IFPE database

    International Nuclear Information System (INIS)

The fuel modelling code FUROM (FUel ROd Model), suitable for calculating the normal operation condition behaviour of PWR and WWER fuels, has been developed at AEKI for several years. The validation of the code has so far been based on the individual calculation of many relevant experiments. This, however, was a time-consuming process that could give rise to errors both at the input and at the comparison stage. A new methodology is implemented to build up a uniform database from the IFPE data and run automated validation tasks depending on the model or phenomenon of interest. The general problems encountered and some results are presented here. (authors)

  3. Image content authentication technique based on Laplacian Pyramid

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

This paper proposes a technique of image content authentication based on the Laplacian Pyramid to verify the authenticity of image content. First, the image is decomposed into a Laplacian Pyramid before the transformation. Next, the smooth and detail properties of the original image are analyzed according to the Laplacian Pyramid, and the properties are classified and encoded to get the corresponding characteristic values. Then, the signature derived from the encrypted characteristic values is embedded in the original image as a watermark. After reception, the characteristic values of the received image are compared with the watermark drawn out from the image. The algorithm automatically identifies, by means of morphologic filtration, whether the content has been tampered with, and presents the information of the tampered location at the same time. Experimental results show that the proposed authentication algorithm can effectively detect the event and location when the original image content is tampered with. Moreover, it can tolerate some distortions produced by compression, filtration and noise degradation.
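The pyramid decomposition this record relies on can be sketched in a few lines of Python. This is a minimal illustration only: it uses 2x2 block averaging as a crude stand-in for the Gaussian filtering a real implementation would use, and all function names are our own, not the paper's.

```python
import numpy as np

def downsample(img):
    # Average 2x2 blocks -- a crude stand-in for Gaussian blur + decimation
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(img):
    # Nearest-neighbour expansion back to double size
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels):
    pyr, cur = [], img
    for _ in range(levels):
        small = downsample(cur)
        pyr.append(cur - upsample(small))  # detail (Laplacian) layer
        cur = small
    pyr.append(cur)                        # coarsest residual
    return pyr

def reconstruct(pyr):
    cur = pyr[-1]
    for lap in reversed(pyr[:-1]):
        cur = upsample(cur) + lap
    return cur

# The characteristic values for a signature would be derived from the
# smooth (residual) and detail layers of such a pyramid.
img = np.arange(64, dtype=float).reshape(8, 8)
pyr = laplacian_pyramid(img, 2)
```

The decomposition is lossless, so the original image is exactly recoverable from the layers, which is what makes per-layer characteristic values a faithful summary of the content.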

  4. WORMHOLE ATTACK MITIGATION IN MANET: A CLUSTER BASED AVOIDANCE TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Subhashis Banerjee

    2014-01-01

Full Text Available A Mobile Ad-Hoc Network (MANET) is a self-configuring, infrastructure-less network of mobile devices connected by wireless links. Loopholes like the wireless medium, lack of a fixed infrastructure, dynamic topology, rapid deployment practices, and the hostile environments in which they may be deployed make MANETs vulnerable to a wide range of security attacks, and the wormhole attack is one of them. During this attack a malicious node captures packets from one location in the network and tunnels them to another colluding malicious node at a distant point, which replays them locally. This paper presents a cluster-based wormhole attack avoidance technique. The concept of hierarchical clustering with a novel hierarchical 32-bit node addressing scheme is used for avoiding the attacking path during the route discovery phase of the DSR protocol, which is considered as the underlying routing protocol. A method for pinpointing the location of the wormhole nodes in the case of an exposed attack is also given.
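As a rough illustration of how a hierarchical 32-bit address can expose topologically implausible links, consider the sketch below. The field layout (four 8-bit levels, one per cluster tier) is our own assumption for illustration, not the paper's exact scheme.

```python
# Hypothetical layout: a 32-bit address made of four 8-bit cluster levels.
def make_addr(levels):
    # e.g. [top_cluster, sub_cluster, node] -> 32-bit address, zero-padded
    assert 0 < len(levels) <= 4 and all(0 <= lv < 256 for lv in levels)
    addr = 0
    for lv in levels:
        addr = (addr << 8) | lv
    return addr << (8 * (4 - len(levels)))

def common_prefix_depth(a, b):
    # Number of leading 8-bit levels two addresses share. Nodes that claim to
    # be one-hop neighbours while sharing only a shallow prefix are far apart
    # in the cluster hierarchy -- a hint that the link may be a wormhole tunnel.
    depth = 0
    for shift in (24, 16, 8, 0):
        if (a >> shift) & 0xFF != (b >> shift) & 0xFF:
            break
        depth += 1
    return depth
```

During route discovery, a check of this kind lets a node discard candidate paths whose consecutive hops jump between distant clusters.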

  5. A response surface methodology based damage identification technique

    International Nuclear Information System (INIS)

Response surface methodology (RSM) is a combination of statistical and mathematical techniques to represent the relationship between the inputs and outputs of a physical system by explicit functions. This methodology has been widely employed in many applications such as design optimization, response prediction and model validation. So far, however, the literature on its application to structural damage identification (SDI) is scarce. This study therefore presents a systematic SDI procedure comprising four sequential steps: feature selection, parameter screening, primary response surface (RS) modeling and updating, and reference-state RS modeling with SDI realization using the factorial design (FD) and the central composite design (CCD). The last two steps imply the implementation of inverse problems by model updating, in which the RS models substitute the FE models. The proposed method was verified against a numerical beam, a tested reinforced concrete (RC) frame and an experimental full-scale bridge, with modal frequencies as the output responses. It was found that the proposed RSM-based method performs well in predicting the damage of both numerical and experimental structures having single and multiple damage scenarios. The screening capacity of the FD can provide quantitative estimation of the significance levels of updating parameters. Meanwhile, the second-order polynomial model established by the CCD provides adequate accuracy in expressing the dynamic behavior of a physical system
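The core idea (replace the expensive FE model with a cheap second-order polynomial in the inverse step) can be shown in a toy one-parameter version. The quadratic `fe_model` standing in for an FE modal analysis and the single damage parameter are our own assumptions for the sketch.

```python
import numpy as np

def fe_model(x):
    # Stand-in for an expensive FE modal analysis: frequency vs. damage level
    return 10.0 - 3.0 * x - 1.5 * x ** 2

# Design points (a 1-factor analogue of a central composite design)
X = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
f = fe_model(X)

# Fit a second-order polynomial response surface: f ~ b0 + b1*x + b2*x^2
A = np.column_stack([np.ones_like(X), X, X ** 2])
b, *_ = np.linalg.lstsq(A, f, rcond=None)

# Inverse SDI step: search the cheap RS model for the damage level that
# reproduces a measured frequency, instead of re-running the FE model
f_measured = fe_model(0.6)
grid = np.linspace(0.0, 1.0, 1001)
pred = np.column_stack([np.ones_like(grid), grid, grid ** 2]) @ b
x_hat = grid[np.argmin(np.abs(pred - f_measured))]
```

In a real SDI problem the grid search would be a multi-parameter optimization, but the substitution of the RS model for the FE model is the same.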

  6. Research on technique of wavefront retrieval based on Foucault test

    Science.gov (United States)

    Yuan, Lvjun; Wu, Zhonghua

    2010-05-01

During fine grinding of the best-fit sphere and the initial stage of polishing, the surface error of large-aperture aspheric mirrors is too large to test with a common interferometer. The Foucault test is widely used in fabricating large-aperture mirrors. However, the optical path is disturbed seriously by air turbulence, and changes of light and dark zones cannot be identified, which often lowers people's judging ability and leads to mistakes in diagnosing the surface error of the whole mirror. To solve this problem, this research presents wavefront retrieval based on the Foucault test through digital image processing and quantitative calculation. First, a real Foucault image is obtained by collecting a variety of images with a CCD and then averaging these images to eliminate air turbulence. Second, gray values are converted into surface error values through principle derivation, mathematical modeling and software programming. Third, the linear deviation brought by defocus is removed by the least-squares method to obtain the real surface error. Finally, according to the real surface error, the wavefront map, gray contour map and corresponding pseudo-color contour map are plotted. The experimental results indicate that the three-dimensional wavefront map and two-dimensional contour map are able to accurately and intuitively show the surface error over the whole mirror under test, and they are helpful for grasping the surface error as a whole. The technique can be used to guide the fabrication of large-aperture, long-focal-length mirrors during grinding and the initial stage of polishing the aspheric surface, which greatly improves fabricating efficiency and precision.
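The frame-averaging and linear-term-removal steps can be sketched numerically. The cubic error profile, the defocus tilt, and the noise level below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
true_error = 0.05 * x ** 3      # hypothetical surface-error profile
defocus = 0.2 * x + 0.1         # linear deviation introduced by defocus

# Step 1: average many CCD frames to suppress air-turbulence noise
frames = [true_error + defocus + 0.01 * rng.standard_normal(x.size)
          for _ in range(200)]
profile = np.mean(frames, axis=0)

# Step 2: remove the linear (defocus) term by least squares,
# leaving the real surface error
A = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(A, profile, rcond=None)
surface_error = profile - A @ coef
```

Averaging N frames reduces the turbulence noise by a factor of sqrt(N), after which the least-squares fit isolates the non-linear part of the profile that represents genuine figure error.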

  7. A new membrane-based crystallization technique: tests on lysozyme

    Science.gov (United States)

    Curcio, Efrem; Profio, Gianluca Di; Drioli, Enrico

    2003-01-01

The great importance of protein science both in industrial and scientific fields, in conjunction with the intrinsic difficulty to grow macromolecular crystals, stimulates the development of new observations and ideas that can be useful in initiating more systematic studies using novel approaches. In this regard, an innovative technique, based on the employment of microporous hydrophobic membranes in order to promote the formation of lysozyme crystals from supersaturated solutions, is introduced in this work. Operational principles and possible advantages, both in terms of controlled extraction of solvent by acting on the concentration of the stripping solution and reduced induction times, are outlined. Theoretical developments and experimental results concerning the mass transfer, in vapour phase, through the membrane are presented, as well as the results from X-ray diffraction to 1.7 Å resolution of the obtained lysozyme crystals using NaCl as the crystallizing agent and sodium acetate as the buffer. Crystals were found to be tetragonal with unit cell dimensions of a = b = 79.1 Å and c = 37.9 Å; the overall Rmerge on intensities in the resolution range from 25 to 1.7 Å was, in the best case, 4.4%.

  8. A formal model for integrity protection based on DTE technique

    Institute of Scientific and Technical Information of China (English)

    JI Qingguang; QING Sihan; HE Yeping

    2006-01-01

In order to provide integrity protection for a secure operating system that satisfies the structured protection class requirements, a DTE-technique-based formal model for integrity protection is proposed after the implications and structures of the integrity policy have been analyzed in detail. This model consists of some basic rules for configuring DTE and a state transition model, which are used to instruct how the domains and types are set, and how security invariants obtained from the initial configuration are maintained in the process of system transition, respectively. In this model, ten invariants are introduced; in particular, some new invariants dealing with information flow are proposed, and their relations with corresponding invariants described in the literature are also discussed. The thirteen transition rules with well-formed atomicity are presented in a well-operational manner. The basic security theorems corresponding to these invariants and transition rules are proved. The rationale for proposing the invariants is further annotated via analyzing the differences between this model and those described in the literature. Last but not least, future work is prospected; in particular, it is pointed out that it is possible to use this model to analyze SE-Linux security.

  9. Orientation of student entrepreneurial practices based on administrative techniques

    Directory of Open Access Journals (Sweden)

    Héctor Horacio Murcia Cabra

    2005-07-01

Full Text Available As part of the second phase of the research project «Application of a creativity model to update the teaching of administration in Colombian agricultural entrepreneurial systems», it was decided to reinforce the planning and execution skills of students of the Agricultural Business Administration Faculty of La Salle University. Those finishing their studies were given special attention. The plan of action was initiated in the second semester of 2003. It was initially defined as a model of entrepreneurial strengthening based on a coherent methodology that included the most recent administration and management techniques. Later, the applicability of this model was tested in some organizations of the agricultural sector that had asked for support in their planning processes. Through an action-research process the methodology was redefined in order to arrive at a final model that could be used by faculty students and graduates. The results obtained were applied to the teaching of the Entrepreneurial Laboratory for ninth-semester students with the hope of improving administrative support to agricultural enterprises. More than 100 students and 200 agricultural producers applied this procedure between June 2003 and July 2005. The methodology used and the results obtained are presented in this article.

  10. Study of coherent synchrotron radiation effects by means of a new simulation code based on the non-linear extension of the operator splitting method

    Science.gov (United States)

    Dattoli, G.; Migliorati, M.; Schiavi, A.

    2007-05-01

    The coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of instabilities due to CSR demands for accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of these types of problems should be fast and reliable, conditions that are usually hardly achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient to treat transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treat CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of non-linear contribution due to wake field effects. The proposed solution method exploits an algebraic technique that uses the exponential operators. We show that the integration procedure is capable of reproducing the onset of instability and the effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, considerations on the threshold of the instability are also developed.
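The exponential-operator splitting this record builds on can be illustrated on a toy linear system. The harmonic oscillator below stands in for the full Vlasov problem; the second-order (Strang) composition e^{A dt/2} e^{B dt} e^{A dt/2} is the same algebraic device, applied here to two nilpotent generators whose exponentials are exact.

```python
import numpy as np

# Generators of a harmonic-oscillator flow: d/dt (x, p) = (p, -x)
A = np.array([[0.0, 1.0], [0.0, 0.0]])   # drift: x += p*dt
B = np.array([[0.0, 0.0], [-1.0, 0.0]])  # kick:  p -= x*dt

def expA(dt):
    # A is nilpotent, so its exponential truncates exactly
    return np.eye(2) + dt * A

def expB(dt):
    return np.eye(2) + dt * B

def strang_step(v, dt):
    # Second-order operator splitting: e^{A dt/2} e^{B dt} e^{A dt/2}
    return expA(dt / 2) @ (expB(dt) @ (expA(dt / 2) @ v))

v = np.array([1.0, 0.0])
n = 10000
dt = 2 * np.pi / n
for _ in range(n):
    v = strang_step(v, dt)
# After one full oscillator period the state should return to its start
```

The splitting error per step is O(dt^3), so over a full period the state returns to its initial value to O(dt^2), which is what makes compositions of exponential operators attractive for long-time tracking.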

  11. Study of coherent synchrotron radiation effects by means of a new simulation code based on the non-linear extension of the operator splitting method

    International Nuclear Information System (INIS)

    The coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of instabilities due to CSR demands for accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of these types of problems should be fast and reliable, conditions that are usually hardly achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient to treat transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treat CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of non-linear contribution due to wake field effects. The proposed solution method exploits an algebraic technique that uses the exponential operators. We show that the integration procedure is capable of reproducing the onset of instability and the effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, considerations on the threshold of the instability are also developed

  12. Study of coherent Synchrotron Radiation effects by means of a new simulation code based on the non-linear extension of the operator splitting method

    International Nuclear Information System (INIS)

The coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of instabilities due to CSR demands accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of this type of problem should be fast and reliable, conditions that are usually hardly achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient to treat transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treat CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of the non-linear contribution due to wake field effects. The proposed solution method exploits an algebraic technique that uses exponential operators. We show that the integration procedure is capable of reproducing the onset of an instability and the effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, considerations on the threshold of the instability are also developed

  13. The Numerical Simulation of the Crack Elastoplastic Extension Based on the Extended Finite Element Method

    Directory of Open Access Journals (Sweden)

    Xia Xiaozhou

    2013-01-01

Full Text Available In the frame of the extended finite element method, the exponent disconnected function is introduced to reflect the discontinuous characteristic of the crack, and a crack tip enrichment function composed of a triangular basis function and a linear polar radius function is adopted to describe the displacement field distribution at the elastoplastic crack tip. Here, the linear polar radius form is chosen to decrease the singularity induced by the plastic yield zone at the crack tip, and the triangular basis form is adopted to describe how the displacement distribution varies with the polar angle around the crack tip. Based on the displacement model containing the above enrichment functions, the incremental iterative form of the elastoplastic extended finite element method is deduced from the virtual work principle. For non-uniformly hardening materials such as concrete, in order to avoid the asymmetry of the stiffness matrix induced by the non-associated flow of plastic strain, a plastic flow rule containing a cross item based on the least energy dissipation principle is adopted. Finally, some numerical examples show the validity of the elastoplastic X-FEM constructed in this paper.

  14. A study on development of a rule based expert system for steam generator life extension

    International Nuclear Information System (INIS)

The need to predict the integrity of steam generator (SG) tubes and the environmental conditions that affect their integrity is growing in order to secure nuclear power plant (NPP) safety and enhance plant availability. To achieve these objectives it is important to diagnose the integrity of the SG tubes. An expert system called FEMODES (failure mode diagnosis expert system) has been developed for the diagnosis of such tube degradation phenomena as denting, intergranular attack (IGA) and stress corrosion cracking (SCC) in the secondary side of the SG. With FEMODES it is possible to estimate the possibility of SG tube degradation and to diagnose the environmental conditions that influence such degradation. The method of certainty factor theory (CFT) and a rule-based backward-reasoning inference strategy are used to develop FEMODES. The information required for diagnosis is acquired from SG tube degradation experiences of two local reference plants, some limited overseas plants, and technical reports and research papers about such tube degradation. Overall results estimated with FEMODES are in reasonable agreement with actual SG tube degradation. Some discrepancy observed in several estimated values of SG tube degradation appears to be due to insufficient heuristic knowledge in the knowledge base of FEMODES

  15. An Architecture for Intrusion Detection Based on an Extension of the Method of Remaining Elements

    Directory of Open Access Journals (Sweden)

    P. Velarde-Alvarado

    2010-08-01

Full Text Available This paper introduces an anomaly-based intrusion detection architecture based on behavioral traffic profiles created by using our enhanced version of the Method of Remaining Elements (MRE). This enhanced version includes: a redefinition of the exposure threshold through the entropy and cardinality of residual sequences, a dual characterization for two types of traffic slots, the introduction of the Anomaly Level Exposure (ALE) that gives a better quantification of anomalies for a given traffic slot and r-feature, an alternative support that extends its detection capabilities, and a new procedure to obtain the exposure threshold through an analysis of outliers on the training dataset. Regarding the original MRE, we incorporate the refinements outlined, resulting in a reliable method which gives improved sensitivity to the detection of a broader range of attacks. The experiments were conducted on the MIT-DARPA dataset and also on an academic LAN by implementing real attacks. The results show that the proposed architecture is effective in the early detection of intrusions, as well as some kinds of attacks designed to bypass detection measures.
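The residual-entropy idea behind MRE can be sketched briefly. This is our own simplified reading, not the paper's exact formulation: drop the most frequent elements of one r-feature (destination ports, in this invented example) and measure the entropy of what remains; a scan inflates that residual entropy far beyond a normal slot's.

```python
import math
from collections import Counter

def entropy(counts):
    # Shannon entropy (bits) of a frequency table
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def remaining_entropy(values, top_k=1):
    # MRE-style sketch: remove the top_k most frequent elements, then
    # measure the entropy of the residual sequence for one r-feature
    counts = Counter(values)
    for elem, _ in counts.most_common(top_k):
        del counts[elem]
    return entropy(counts) if counts else 0.0

# Destination ports in a "normal" slot vs. a slot containing a port scan
normal = [80] * 90 + [443] * 5 + [22] * 5
scan = [80] * 50 + list(range(1000, 1050))
```

An exposure threshold learned from training slots would then flag slots whose residual entropy (or an ALE-style score derived from it) falls outside the normal range.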

  16. Structuring Task-based Interaction through Collaborative Learning Techniques (2)

    Institute of Scientific and Technical Information of China (English)

    William Littlewood

    2004-01-01

Techniques for collaborative learning. In this section the focus will move from broad strategies to specific techniques (often also called "structures") through which the strategies can be realized. It gives a selection of techniques which have proved (in my own experience as well as that of others) particularly useful in providing contexts for practice, exploration and/or interaction in the second language classroom.

  17. Park-based and zero sequence-based relaying techniques with application to transformers protection

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, G.; Arboleya, P.; Gomez-Aleixandre, J. [University of Oviedo (Spain). Dept. of Electrical Engineering

    2004-09-01

    Two relaying techniques for protecting power transformers are presented and discussed. Very often, differential relaying is used for this purpose. A comparison between the two proposed techniques and conventional differential relaying is thus presented. The first technique, based on the measurements of zero sequence current within a delta winding, performs best in multiwinding transformers, since only measurement of the coil currents is needed. Thus, great simplicity is achieved. The second one is based on the differential procedure, but its analysis of asymmetries in the plot in Park's plane avoids problems related to spectral analysis in conventional differential relaying. The technique is justified from the analysis of symmetrical components. Misoperation in conventional differential relaying has been observed in some cases as a function of switching instant and fault location. This issue is discussed in the paper, and a statistical analysis of a large number of laboratory tests, in which both factors were controlled, is presented. As a conclusion, both relaying techniques proposed succeed in protecting the transformer. Additionally, the Park-based relay exhibits three characteristics of most importance: fastest performance, robustness and simplicity in its formulation. (author)
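The symmetry check in Park's plane that the second technique relies on can be sketched as follows. The amplitude-invariant dq transform is standard; the fault model (a simple magnitude unbalance on one phase) is our own choice for illustration.

```python
import numpy as np

def park(ia, ib, ic, theta):
    # Amplitude-invariant Park (dq) transform of three phase currents
    k = 2.0 / 3.0
    d = k * (ia * np.cos(theta)
             + ib * np.cos(theta - 2 * np.pi / 3)
             + ic * np.cos(theta + 2 * np.pi / 3))
    q = -k * (ia * np.sin(theta)
              + ib * np.sin(theta - 2 * np.pi / 3)
              + ic * np.sin(theta + 2 * np.pi / 3))
    return d, q

theta = 2 * np.pi * 50 * np.linspace(0, 0.04, 2000)  # two 50 Hz cycles
ia = np.cos(theta)
ib = np.cos(theta - 2 * np.pi / 3)
ic = np.cos(theta + 2 * np.pi / 3)

# Balanced currents map to a single fixed point in the dq plane
d, q = park(ia, ib, ic, theta)
# An unbalanced (faulted) phase distorts the locus, producing the
# asymmetry a Park-based relay detects without spectral analysis
d2, q2 = park(1.5 * ia, ib, ic, theta)
```

Because the healthy condition collapses to a point, detecting a fault reduces to measuring how far the dq locus departs from that point, which is what makes the approach fast and simple compared with spectral methods.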

  18. Spatial extension of excitons in triphenylene based polymers given by range-separated functionals

    CERN Document Server

    Kociper, B

    2013-01-01

    Motivated by an experiment in which the singlet-triplet gap in triphenylene based copolymers was effectively tuned, we used time dependent density functional theory (TDDFT) to reproduce the main results. By means of conventional and long-range corrected exchange correlation functionals, the luminescence energies and the exciton localization were calculated for a triphenylene homopolymer and several different copolymers. The phosphorescence energy of the pure triphenylene chain is predicted accurately by means of the optimally tuned long-range corrected LC-PBE functional and slightly less accurate by the global hybrid B3LYP. However, the experimentally observed fixed phosphorescence energy could not be reproduced because the localization pattern is different to the expectations: Instead of localizing on the triphenylene moiety - which is present in all types of polymers - the triplet state localizes on the different bridging units in the TDDFT calculations. This leads to different triplet emission energies for...

  19. Size measurement of gold and silver nanostructures based on their extinction spectrum: limitations and extensions

    Directory of Open Access Journals (Sweden)

    A A Ashkarran

    2013-09-01

Full Text Available This paper reports on the physical principles and the relations between the extinction cross section and the geometrical properties of silver and gold nanostructures. We introduce some simple relations for determining the geometrical properties of silver and gold nanospheres based on the position of their plasmonic peak. We also applied, investigated and compared the accuracy of these relations using other published works in order to clarify the effects of shape, size distribution and the refractive index of the particles' embedding medium. Finally, we extended the equations to non-spherical particles and investigated their accuracy. We found that modified forms of the equations may lead to more exact results for non-spherical metal particles, but for better results, the modified equations should depend on the shape and size distribution of the particles. It seems that these equations are not applicable to particles with corners sharper than a cube's corners, i.e. nanostructures with spatial angles less than π/2 sr.

  20. Research Extension and Education Programs on Bio-based Energy Technologies and Products

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Sam [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station; Harper, David [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station; Womac, Al [University of Tennessee, Knoxville, TN (United States). Tennessee Agricultural Experiment Station

    2010-03-02

The overall objectives of this project were to provide enhanced educational resources for the general public, educational and development opportunities for university faculty in the Southeast region, and enhanced research knowledge concerning biomass preprocessing and deconstruction. All of these efforts combine to create a research and education program that enhances the biomass-based industries of the United States. This work was broken into five primary objective areas: • Task A - Technical research in the area of biomass preprocessing, analysis, and evaluation. • Tasks B&C - Technical research in the areas of fluidized beds for the chemical modification of lignocellulosic biomass, and biomass deconstruction and evaluation. • Task D - Analyses for the non-scientific community that provide a comprehensive analysis of the current state of biomass supply, demand, technologies, markets and policies; identify a set of feasible alternative paths for biomass industry development; and quantify the impacts associated with each alternative path. • Task E - Efforts to build research capacity and develop partnerships through faculty fellowships with DOE national labs. The research and education programs conducted through this grant have led to three primary results: • A better knowledge base related to and understanding of biomass deconstruction, through both mechanical size reduction and chemical processing. • A better source of information related to biomass, bioenergy, and bioproducts for researchers and general public users through the BioWeb system. • Stronger research ties between land-grant universities and DOE National Labs through the faculty fellowship program. In addition to the scientific knowledge and resources developed, funding through this program produced a minimum of eleven (11) scientific publications and contributed to the research behind at least one patent.

  1. Evaluation of a school-based diabetes education intervention, an extension of Program ENERGY

    Science.gov (United States)

    Conner, Matthew David

Background: The prevalence of both obesity and type 2 diabetes in the United States has increased over the past two decades and rates remain high. The latest data from the National Center for Health Statistics estimates that 36% of adults and 17% of children and adolescents in the US are obese (CDC Adult Obesity, CDC Childhood Obesity). Being overweight or obese greatly increases one's risk of developing several chronic diseases, such as type 2 diabetes. Approximately 8% of adults in the US have diabetes; type 2 diabetes accounts for 90-95% of these cases. Type 2 diabetes in children and adolescents is still rare; however, clinical reports suggest an increase in the frequency of diagnosis (CDC Diabetes Fact Sheet, 2011). Results from the Diabetes Prevention Program show that the incidence of type 2 diabetes can be reduced through the adoption of a healthier lifestyle among high-risk individuals (DPP, 2002). Objectives: This classroom-based intervention included scientific coverage of energy balance, diabetes, diabetes prevention strategies, and diabetes management. Coverage of diabetes management topics was included in lesson content to further the students' understanding of the disease. Measurable short-term goals of the intervention included increases in general diabetes knowledge, diabetes management knowledge, and awareness of type 2 diabetes prevention strategies. Methods: A total of 66 sixth grade students at Tavelli Elementary School in Fort Collins, CO completed the intervention. The program consisted of nine classroom-based lessons; students participated in one lesson every two weeks. The lessons were delivered from November of 2005 to May of 2006. Each bi-weekly lesson included a presentation and interactive group activities. Participants completed two diabetes knowledge questionnaires at baseline and post intervention. 
A diabetes survey developed by Program ENERGY measured general diabetes knowledge and awareness of type 2 diabetes prevention strategies

  2. Evidence-based surgical techniques for caesarean section

    DEFF Research Database (Denmark)

    Aabakke, Anna J M; Secher, Niels Jørgen; Krebs, Lone

    2014-01-01

Caesarean section (CS) is a common surgical procedure, and in Denmark 21% of deliveries are by CS. There is an increasing amount of scientific evidence to support the different surgical techniques used at CS. This article reviews the literature regarding CS techniques. There is still a lack of evi...

  3. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings.

    Science.gov (United States)

    Ghayvat, Hemant; Mukhopadhyay, Subhas; Gui, Xiang; Suryadevara, Nagender

    2015-01-01

    Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home environment and performs his or her daily routine in order to live a long and healthy life. We initiate the research with the development of the smart home approach and implement it in different home conditions (different houses) to monitor the activity of an inhabitant for wellness detection. Additionally, our research extends the smart home system to smart buildings and models the design issues related to the smart building environment; these design issues are linked with system performance and reliability. This research paper also discusses and illustrates the possible mitigation to handle the ISM band interference and attenuation losses without compromising optimum system performance. PMID:25946630

  4. Viable yet Protected for Future Generations? An Examination of the Extensive Forest-Based Tourism Market

    Directory of Open Access Journals (Sweden)

    Hana Sakata

    2012-12-01

Full Text Available Abstract This article focuses on forest tourism, and rainforests in particular, and explores their potential to contribute to the global tourism industry. The specific objectives of the study were to develop a profile, including motivations, of tourists visiting the Wet Tropics rainforest of Australia and to identify previous patterns of forest visitation in both Australia and other global destinations. A survey of 1,408 visitors conducted at a number of Wet Tropics rainforest sites in the tropical north region of Australia found that over 37% of the sample had previously visited forests while on holidays, indicating that forest-based tourism is a major component of the nature-based market. Countries and forested sites in South-East Asia were the most popular as holiday attractions, with over 13% of respondents having visited these sites. This was followed by countries of the South Pacific, North America, South America, Central America, Africa, South Asia and China, the Caribbean and Europe. While overall this is a promising result, forest-based tourism faces a number of pressures including urban settlement, extractive industries and, in the near future, climate change. Keywords: forests; rainforests; nature-based tourism; Tropical North Queensland; Wet Tropics rainforest.

  5. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings

    Directory of Open Access Journals (Sweden)

    Hemant Ghayvat

    2015-05-01

Full Text Available Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home environment and performs his or her daily routine in order to live a long and healthy life. We initiate the research with the development of the smart home approach and implement it in different home conditions (different houses) to monitor the activity of an inhabitant for wellness detection. Additionally, our research extends the smart home system to smart buildings and models the design issues related to the smart building environment; these design issues are linked with system performance and reliability. This research paper also discusses and illustrates the possible mitigation to handle the ISM band interference and attenuation losses without compromising optimum system performance.

  6. Creative Conceptual Design Based on Evolutionary DNA Computing Technique

    Science.gov (United States)

    Liu, Xiyu; Liu, Hong; Zheng, Yangyang

Creative conceptual design is an important area in computer-aided innovation. Typical design methodology includes exploration and optimization by evolutionary techniques such as evolutionary computation (EC) and swarm intelligence. Although many algorithms and applications have been proposed for creative design with these techniques, the computing models are mostly implemented on traditional von Neumann architectures. On the other hand, the possibility of using DNA as a computing medium has aroused wide interest in recent years, owing to its huge built-in parallelism and its ability to solve NP-complete problems. This computing technique is performed by biological operations on DNA molecules rather than on chips. The purpose of this paper is to propose a simulated evolutionary DNA computing model and to integrate DNA computing with creative conceptual design. The proposed technique can potentially be applied to large-scale, highly parallel design problems.

  7. A Technique for Volumetric CSG Based on Morphology

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Christensen, Niels Jørgen

    2001-01-01

In this paper, a new technique for volumetric CSG is presented. The technique requires the input volumes to correspond to solids which fulfill a voxelization suitability criterion. Assume the CSG operation is union. The volumetric union of two such volumes is defined in terms of the voxelization of the union of the two original solids. The theory behind the new technique is discussed, and the algorithm and implementation are presented. Finally, we present images and timings.
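The boolean set operations underlying volumetric CSG can be sketched on voxel occupancy grids. The sketch below is a minimal NumPy illustration, not the paper's actual algorithm (which operates on volumes satisfying a voxelization suitability criterion); all function names are hypothetical:

```python
import numpy as np

def voxelize_sphere(center, radius, shape):
    """Voxelize a sphere as a boolean occupancy grid (True = inside the solid)."""
    ii, jj, kk = np.indices(shape)
    dist2 = (ii - center[0])**2 + (jj - center[1])**2 + (kk - center[2])**2
    return dist2 <= radius**2

def csg_union(vol_a, vol_b):
    """A voxel belongs to the union if it is solid in either input volume."""
    return vol_a | vol_b

def csg_intersection(vol_a, vol_b):
    return vol_a & vol_b

def csg_difference(vol_a, vol_b):
    return vol_a & ~vol_b

shape = (32, 32, 32)
a = voxelize_sphere((12, 16, 16), 8, shape)   # two overlapping spheres
b = voxelize_sphere((20, 16, 16), 8, shape)
union = csg_union(a, b)
```

For distance-field representations, union and intersection would instead correspond to voxel-wise minimum and maximum of the two fields.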

  8. A Lossless Data Hiding Technique based on AES-DWT

    Directory of Open Access Journals (Sweden)

Gustavo Fernández Torres

    2012-09-01

Full Text Available In this paper we propose a new data hiding technique. The new technique uses steganography and cryptography on images with a size of 256x256 pixels and an 8-bit grayscale format. There are design restrictions such as a fixed-size cover image and reconstruction of the hidden image without error. The steganography technique uses a Haar DWT (Discrete Wavelet Transform) with hard thresholding and the LSB (Least Significant Bit) technique on the cover image. The algorithms used for compressing and ciphering the secret image are lossless JPG and AES, respectively. The proposed technique is used to generate a stego image which provides two layers of security and is robust against attacks. Results are reported for different threshold levels in terms of PSNR.
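As an illustrative sketch of the LSB embedding step mentioned above (the Haar-DWT thresholding, compression, and AES stages are omitted; function names and the synthetic cover image are assumptions, not the paper's implementation):

```python
import numpy as np

def embed_lsb(cover, payload_bits):
    """Write each payload bit into the least significant bit of one cover pixel."""
    flat = cover.flatten()                     # flatten() already returns a copy
    if payload_bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    n = payload_bits.size
    flat[:n] = (flat[:n] & 0xFE) | payload_bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    """Read the first n_bits least significant bits back out."""
    return stego.flatten()[:n_bits] & 1

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # synthetic cover
bits = rng.integers(0, 2, size=1000, dtype=np.uint8)           # payload bits
stego = embed_lsb(cover, bits)
recovered = extract_lsb(stego, bits.size)
```

Because only the lowest bit of each used pixel changes, no pixel value moves by more than one gray level, which is why LSB embedding preserves a high PSNR.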

  9. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Hamsapriya

    2011-12-01

Full Text Available E-mail is one of the most popular and frequently used means of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today's Internet, bringing financial damage to companies and annoying individual users. Spam emails invade users without their consent and fill their mailboxes. They consume network capacity as well as time spent checking and deleting spam mail. The vast majority of Internet users are outspoken in their disdain for spam, although enough of them respond to commercial offers that spam remains a viable source of income to spammers. While most users want to do the right thing to avoid and get rid of spam, they need clear and simple guidelines on how to behave. In spite of all the measures taken to eliminate spam, it has not yet been eradicated, and when the countermeasures are over-sensitive, even legitimate emails are eliminated. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has centered on more sophisticated classifier-related issues. In recent years, machine learning for spam classification has become an important research issue. This work explores and identifies the use of different learning algorithms for classifying spam messages from e-mail. A comparative analysis among the algorithms is also presented.
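A minimal sketch of one family of learning algorithms commonly used for this task, a multinomial naive Bayes filter over word counts with Laplace smoothing; the class name and the toy training data are assumptions for illustration, not the paper's implementation:

```python
import math
from collections import Counter

class NaiveBayesSpamFilter:
    """Multinomial naive Bayes over word counts, with Laplace smoothing."""

    def fit(self, messages, labels):
        self.class_counts = Counter(labels)
        self.word_counts = {c: Counter() for c in self.class_counts}
        for text, label in zip(messages, labels):
            self.word_counts[label].update(text.lower().split())
        self.vocab = {w for c in self.word_counts for w in self.word_counts[c]}

    def predict(self, text):
        total_docs = sum(self.class_counts.values())
        scores = {}
        for c, wc in self.word_counts.items():
            total_words = sum(wc.values())
            score = math.log(self.class_counts[c] / total_docs)  # log prior
            for w in text.lower().split():
                # Laplace-smoothed log likelihood of each word under class c
                score += math.log((wc[w] + 1) / (total_words + len(self.vocab)))
            scores[c] = score
        return max(scores, key=scores.get)

train = [
    ("win free money now", "spam"),
    ("free prize claim now", "spam"),
    ("meeting agenda attached", "ham"),
    ("lunch at noon tomorrow", "ham"),
]
clf = NaiveBayesSpamFilter()
clf.fit([t for t, _ in train], [l for _, l in train])
```

Working in log space avoids floating-point underflow when messages contain many words.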

  10. Development of acoustic model-based iterative reconstruction technique for thick-concrete imaging

    Science.gov (United States)

    Almansouri, Hani; Clayton, Dwight; Kisner, Roger; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2016-02-01

Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.
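The core MBIR idea, posing reconstruction as minimization of a data-fit term plus a prior, can be sketched on a toy linear forward model. This is a generic gradient-descent illustration with an assumed random matrix and a simple smoothness prior, not the authors' acoustic model:

```python
import numpy as np

def mbir_reconstruct(A, y, beta=0.01, iters=2000):
    """Toy MBIR: minimize ||y - A x||^2 + beta * ||D x||^2 (data fit plus a
    smoothness prior) by gradient descent with a provably stable step size."""
    n = A.shape[1]
    D = np.eye(n) - np.roll(np.eye(n), 1, axis=1)     # finite-difference prior
    H = A.T @ A + beta * (D.T @ D)
    step = 1.0 / (2.0 * np.linalg.eigvalsh(H).max())  # 1 / Lipschitz constant
    x = np.zeros(n)
    for _ in range(iters):
        grad = 2.0 * (H @ x - A.T @ y)                # gradient of the cost
        x -= step * grad
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(40, 20))                 # hypothetical forward model
x_true = np.sin(np.linspace(0.0, np.pi, 20))  # smooth ground-truth profile
y = A @ x_true                                 # noiseless synthetic data
x_rec = mbir_reconstruct(A, y)
```

Real MBIR systems replace the random matrix with a physics-based acoustic forward model and typically use edge-preserving (non-quadratic) priors.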

  11. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2016-01-01

Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.

  12. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2015-01-01

Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. This paper documents the first implementation of MBIR for ultrasonic signals and shows reconstruction results for synthetically generated data.

  13. Applying contact to individual silicon nanowires using a dielectrophoresis (DEP)-based technique

    International Nuclear Information System (INIS)

One major challenge for the technological use of nanostructures is the control of their electrical and optoelectronic properties. For that purpose, extensive research into electrical characterization, and therefore a fast and reliable way of contacting these structures, is needed. Here, we report on a new dielectrophoresis (DEP)-based technique which enables sufficient and reliable contacts to be applied to individual nanostructures, such as semiconducting nanowires (NWs), easily and without the need for lithography. The DEP contacting technique presented in this article can be carried out without high-tech equipment and monitored in situ with an optical microscope. In the presented experiments, individual SiNWs are trapped and subsequently welded between two photolithographically pre-patterned electrodes by applying varying AC voltages to the electrodes. To prove the quality of these contacts, I–V curves, photoresponse and photoconductivity of a single SiNW were measured. Furthermore, the photoconductivity was measured as a function of the wavelength of the illuminating light and compared with calculations predicting the absorption spectrum of an individual SiNW.

  14. Applying contact to individual silicon nanowires using a dielectrophoresis (DEP)-based technique

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Christian, E-mail: christian.leiterer@gmail.com [Institute of Photonic Technology (Germany); Broenstrup, Gerald [Max-Planck-Institute for the Science of Light (Germany); Jahr, Norbert; Urban, Matthias; Arnold, Cornelia; Christiansen, Silke; Fritzsche, Wolfgang [Institute of Photonic Technology (Germany)

    2013-05-15

One major challenge for the technological use of nanostructures is the control of their electrical and optoelectronic properties. For that purpose, extensive research into electrical characterization, and therefore a fast and reliable way of contacting these structures, is needed. Here, we report on a new dielectrophoresis (DEP)-based technique which enables sufficient and reliable contacts to be applied to individual nanostructures, such as semiconducting nanowires (NWs), easily and without the need for lithography. The DEP contacting technique presented in this article can be carried out without high-tech equipment and monitored in situ with an optical microscope. In the presented experiments, individual SiNWs are trapped and subsequently welded between two photolithographically pre-patterned electrodes by applying varying AC voltages to the electrodes. To prove the quality of these contacts, I-V curves, photoresponse and photoconductivity of a single SiNW were measured. Furthermore, the photoconductivity was measured as a function of the wavelength of the illuminating light and compared with calculations predicting the absorption spectrum of an individual SiNW.

  15. [Eco-value level classification model of forest ecosystem based on modified projection pursuit technique].

    Science.gov (United States)

    Wu, Chengzhen; Hong, Wei; Hong, Tao

    2006-03-01

To optimize the projection function and direction of the projection pursuit technique, simplify its realization process, and overcome the shortcomings of long calculation times and the difficulty of optimizing the projection direction and computer programming, this paper presents a modified simplex method (MSM) and, based on it, brings forward the eco-value level classification model (EVLCM) of forest ecosystems, which can integrate a multidimensional classification index into a one-dimensional projection value, with high projection values denoting high ecosystem service value. Examples of forest ecosystems could be reasonably classified by the new model according to their projection values, suggesting that EVLCM, driven directly by sample data of forest ecosystems, is simple, feasible, applicable, and maneuverable. The calculating time and the value of the projection function were 34% and 143%, respectively, of those obtained with the traditional projection pursuit technique. This model could be applied extensively to classify and estimate all kinds of non-linear and multidimensional data in ecology, biology, and regional sustainable development. PMID:16724723
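The core projection pursuit step, collapsing multidimensional index data onto a one-dimensional projection value along an optimized unit direction, can be sketched as follows. Random search stands in for the paper's modified simplex method, the projection index is simplified to the spread of the projections, and the data are synthetic:

```python
import numpy as np

def best_projection(X, n_trials=2000, seed=0):
    """Random search over unit directions for the one maximizing the spread
    (standard deviation) of the 1-D projections; a simplified stand-in for
    the paper's modified-simplex optimization of a projection index."""
    rng = np.random.default_rng(seed)
    best_dir, best_score = None, -np.inf
    for _ in range(n_trials):
        a = rng.normal(size=X.shape[1])
        a /= np.linalg.norm(a)                 # restrict to the unit sphere
        score = np.std(X @ a)
        if score > best_score:
            best_dir, best_score = a, score
    return best_dir, best_score

# synthetic indicator matrix: the first index carries most of the variation
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3)) * np.array([5.0, 1.0, 0.5])
direction, score = best_projection(X)
projections = X @ direction    # one-dimensional classification values
```

Samples can then be ranked or binned by `projections` to assign eco-value levels, with higher projection values denoting higher ecosystem service value.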

  16. Mount Davis Road Extension

    OpenAIRE

    Kumaraswamy, Mohan

    2002-01-01

One element of the CIVCAL project: Web-based resources containing images, tables, texts and associated data on the construction of the Mt Davis Road Extension. The Mount Davis Road Extension was part of the Smithfield Extension and Associated Roadworks, which was completed by the Highways Department (Hong Kong S.A.R.) in December 1997. The considerable increase in traffic volume necessitated the construction of a road link between Smithfield and Pokfulam Road. The construction of an elevate...

  17. C. elegans lifespan extension by osmotic stress requires FUdR, base excision repair, FOXO, and sirtuins.

    Science.gov (United States)

    Anderson, Edward N; Corkins, Mark E; Li, Jia-Cheng; Singh, Komudi; Parsons, Sadé; Tucey, Tim M; Sorkaç, Altar; Huang, Huiyan; Dimitriadi, Maria; Sinclair, David A; Hart, Anne C

    2016-03-01

    Moderate stress can increase lifespan by hormesis, a beneficial low-level induction of stress response pathways. 5'-fluorodeoxyuridine (FUdR) is commonly used to sterilize Caenorhabditis elegans in aging experiments. However, FUdR alters lifespan in some genotypes and induces resistance to thermal and proteotoxic stress. We report that hypertonic stress in combination with FUdR treatment or inhibition of the FUdR target thymidylate synthase, TYMS-1, extends C. elegans lifespan by up to 30%. By contrast, in the absence of FUdR, hypertonic stress decreases lifespan. Adaptation to hypertonic stress requires diminished Notch signaling and loss of Notch co-ligands leads to lifespan extension only in combination with FUdR. Either FUdR treatment or TYMS-1 loss induced resistance to acute hypertonic stress, anoxia, and thermal stress. FUdR treatment increased expression of DAF-16 FOXO and the osmolyte biosynthesis enzyme GPDH-1. FUdR-induced hypertonic stress resistance was partially dependent on sirtuins and base excision repair (BER) pathways, while FUdR-induced lifespan extension under hypertonic stress conditions requires DAF-16, BER, and sirtuin function. Combined, these results demonstrate that FUdR, through inhibition of TYMS-1, activates stress response pathways in somatic tissues to confer hormetic resistance to acute and chronic stress. C. elegans lifespan studies using FUdR may need re-interpretation in light of this work. PMID:26854551

  18. FPGA IMPLEMENTATION OF RSEPD TECHNIQUE BASED IMPULSE NOISE REMOVAL

    Directory of Open Access Journals (Sweden)

    M. Rajadurai

    2013-05-01

Full Text Available In the process of signal transmission and acquisition, image signals may be corrupted by impulse noise. These are short-duration noises, randomly distributed over the image, which degrade it. An efficient FPGA implementation for removing impulse noise from an image is presented in this paper. Existing techniques use a standard median filter. These approaches change the pixel values of both noise-free and noisy pixels, so the image may become blurred. To avoid changes to noise-free pixels, efficient FPGA implementations of the Simple Edge Preserved De-noising technique (SEPD) and the Reduced Simple Edge Preserved De-noising technique (RSEPD) are presented in this paper. In this technique, noise detection and noise removal operations are performed. This VLSI design gives better image quality. For an image with 10% added noise, the PSNR of the de-noised image is 31.68.
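The detect-then-replace idea behind edge-preserved de-noising (correct only pixels flagged as impulses, leaving noise-free pixels untouched) can be sketched in software. This simplified median-based detector is only illustrative of the principle, not the SEPD/RSEPD hardware design, and the threshold and test image are assumptions:

```python
import numpy as np

def remove_impulse_noise(img, threshold=40):
    """Detect impulse pixels (far from their 3x3 neighborhood median) and
    replace only those; noise-free pixels are left untouched."""
    padded = np.pad(img, 1, mode='edge')
    out = img.copy()
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            med = np.median(padded[i:i + 3, j:j + 3])
            if abs(int(img[i, j]) - int(med)) > threshold:   # impulse detector
                out[i, j] = med
    return out

clean = np.full((16, 16), 100, dtype=np.uint8)   # flat synthetic image
noisy = clean.copy()
noisy[5, 5] = 255                                 # salt impulse
noisy[10, 3] = 0                                  # pepper impulse
denoised = remove_impulse_noise(noisy)
```

Because unflagged pixels are copied through unchanged, edges and fine detail survive far better than under a plain median filter applied everywhere.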

  19. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically. The effici...
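The Random Decrement idea, averaging signal segments that all start at a common trigger condition so that the random content cancels and a free-decay-like signature remains, can be sketched as follows (synthetic response signal and hypothetical parameter choices):

```python
import numpy as np

def random_decrement(signal, trigger, seg_len):
    """Average all segments that begin where the signal up-crosses the
    trigger level; the random part averages out, leaving a free-decay-like
    signature (the Random Decrement function)."""
    starts = [i for i in range(1, len(signal) - seg_len)
              if signal[i - 1] < trigger <= signal[i]]
    if not starts:
        raise ValueError("no trigger crossings found")
    segments = np.array([signal[i:i + seg_len] for i in starts])
    return segments.mean(axis=0), len(starts)

rng = np.random.default_rng(0)
t = np.arange(5000)
response = np.sin(0.2 * t) + 0.1 * rng.standard_normal(5000)  # noisy response
rd, n_segments = random_decrement(response, trigger=0.5, seg_len=100)
```

Modal parameters (frequencies and damping) are then estimated from the Random Decrement function as if it were a measured free decay.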

  20. The Statistical methods of Pixel-Based Image Fusion Techniques

    CERN Document Server

    Al-Wassai, Firouz Abdullah; Al-Zaky, Ali A

    2011-01-01

There are many image fusion methods that can be used to produce high-resolution multispectral images from a high-resolution panchromatic (PAN) image and low-resolution multispectral (MS) remote-sensed images. This paper undertakes a study of image fusion with different statistical techniques, namely Local Mean Matching (LMM), Local Mean and Variance Matching (LMVM), Regression Variable Substitution (RVS), and Local Correlation Modeling (LCM), which are compared with one another so as to choose the best technique that can be applied to multi-resolution satellite images. This paper also concentrates on analytical techniques for evaluating the quality of image fusion (F), using various measures including Standard Deviation (SD), Entropy (En), Correlation Coefficient (CC), Signal-to-Noise Ratio (SNR), Normalized Root Mean Square Error (NRMSE) and Deviation Index (DI) to quantitatively estimate the quality and degree of information improvement of a fused image...
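One of the listed methods, Local Mean Matching (LMM), rescales the PAN image so that its local means match those of the (already upsampled) MS band. A simplified sketch under assumed synthetic inputs; the box-filter window size is an arbitrary choice and the function names are hypothetical:

```python
import numpy as np

def local_mean(img, k=5):
    """Local mean via a k x k box filter with edge padding."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty(img.shape, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def fuse_lmm(pan, ms, k=5, eps=1e-6):
    """Local Mean Matching: rescale the PAN image so that its local means
    match those of the (already upsampled) MS band."""
    return pan * local_mean(ms, k) / (local_mean(pan, k) + eps)

rng = np.random.default_rng(0)
pan = rng.uniform(50.0, 200.0, size=(20, 20))  # synthetic high-res PAN
ms = 0.5 * pan                                  # MS band, different radiometry
fused = fuse_lmm(pan, ms)
```

The ratio of local means transfers the MS radiometry to the PAN image while keeping the PAN image's high-frequency spatial detail.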

  1. A New Image Steganography Based On First Component Alteration Technique

    Directory of Open Access Journals (Sweden)

    Amanpreet Kaur

    2009-12-01

Full Text Available In this paper, a new image steganography scheme is proposed which is a kind of spatial-domain technique. In order to hide secret data in a cover image, the first-component alteration technique is used. Techniques used so far focus only on two or four bits of a pixel in an image (at most five bits at the edges of an image), which results in a lower peak signal-to-noise ratio and high root mean square error. In this technique, 8 bits of the blue components of pixels are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To prove this scheme, several experiments were performed, and the experimental results were compared with related previous works. Keywords: image; mean square error; peak signal-to-noise ratio; steganography.
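Replacing all 8 bits of the blue component amounts to writing secret bytes directly into the blue channel of successive pixels. A minimal sketch with hypothetical function names and a synthetic cover image, illustrating the idea rather than the paper's exact scheme:

```python
import numpy as np

def embed_blue(cover_rgb, secret):
    """Write the secret bytes directly into the blue component of the
    first len(secret) pixels (row-major order); R and G are untouched."""
    stego = cover_rgb.copy()
    blue = stego[:, :, 2].reshape(-1)            # copy of the blue plane
    if len(secret) > blue.size:
        raise ValueError("secret too large for cover image")
    blue[:len(secret)] = np.frombuffer(secret, dtype=np.uint8)
    stego[:, :, 2] = blue.reshape(stego.shape[:2])
    return stego

def extract_blue(stego_rgb, n):
    """Read the first n bytes back out of the blue plane."""
    return bytes(stego_rgb[:, :, 2].reshape(-1)[:n])

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
message = b"hidden message"
stego = embed_blue(cover, message)
```

The blue channel is typically chosen because the human visual system is least sensitive to changes in blue, so fully replacing it is less perceptible than equivalent changes to red or green.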

  2. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    Science.gov (United States)

    Batukaev, Abdulmalik

    2016-04-01

The world water strategy has to change, because the current imitational gravitational frontal isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water (a global deficit of up to 4-15 times) and adverse effects on soils and landscapes. Current methods of irrigation do not control the spread of water throughout the soil continuum. Preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost to the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for sustainable natural resources management. The new paradigm of irrigation is based on the intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. An individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder is positioned in the soil at a depth of 10 to 30 cm, and its diameter is 1-2 cm. Within 5-10 min after injection the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The soil watering cylinder is positioned in the soil profile at a depth of 5-50 cm, with a cylinder diameter of 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection the soil structure in the cylinder recovers quickly because there is no compression from the stable adjoining volume of soil, and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential

  3. Hiding of Speech based on Chaotic Steganography and Cryptography Techniques

    OpenAIRE

    Abbas Salman Hameed

    2015-01-01

The technique of embedding secret information into cover media such as image, video, audio and text, so that only the sender and the authorized recipient who hold a key can detect the presence of the secret information, is called steganography. In this paper, steganography and cryptography techniques for speech are presented together with chaos. Fractional-order Lorenz and Chua systems, which provide an expanded key space, are used to encrypt the speech message. The large key space, in addition to all ...

  4. Defect Characterization Based on Eddy Current Technique: Technical Review

    OpenAIRE

    Ruzlaini Ghoni; Mahmood Dollah; Aizat Sulaiman; Fadhil Mamat Ibrahim

    2014-01-01

Eddy current testing is widely used for nondestructive evaluation of metallic structures and for characterizing numerous types of defects occurring in various locations. It offers remarkable advantages over other nondestructive techniques because of its ease of implementation. This paper presents a technical review of the Eddy current technique across various scopes of defect detection. The first part presents Eddy current evaluation of defects at various locations and orientations, such as in steam generator tubes...

  5. SNMP Based Network Optimization Technique Using Genetic Algorithms

    OpenAIRE

    M. Mohamed Surputheen; G Ravi; Srinivasan, R.

    2012-01-01

Genetic Algorithms (GAs) have innumerable applications through optimization techniques, and network optimization is one of them. SNMP (Simple Network Management Protocol) is used as the basic network protocol for monitoring network activity and the health of systems. This paper deals with adding intelligence to various aspects of SNMP by adding optimization techniques derived from genetic algorithms, which enhances the performance of SNMP processes such as routing.

  6. SNMP Based Network Optimization Technique Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    M. Mohamed Surputheen

    2012-03-01

Full Text Available Genetic Algorithms (GAs) have innumerable applications through optimization techniques, and network optimization is one of them. SNMP (Simple Network Management Protocol) is used as the basic network protocol for monitoring network activity and the health of systems. This paper deals with adding intelligence to various aspects of SNMP by adding optimization techniques derived from genetic algorithms, which enhances the performance of SNMP processes such as routing.
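As a generic illustration of coupling a GA to an SNMP-style monitoring objective (the paper does not specify its encoding or fitness function; the polling-cost model, constants, and function names below are all hypothetical):

```python
import random

TRAFFIC = [4.0, 9.0, 1.0]     # hypothetical per-poll traffic weights per device
STALENESS = [1.0, 1.0, 4.0]   # hypothetical staleness penalties per device

def cost(intervals):
    """Polling-schedule cost: traffic grows as intervals shrink, data
    staleness grows as they lengthen (a purely illustrative model)."""
    return sum(t / x + s * x for t, s, x in zip(TRAFFIC, STALENESS, intervals))

def genetic_optimize(pop_size=80, generations=150, seed=42):
    rng = random.Random(seed)
    pop = [[rng.uniform(0.1, 60.0) for _ in TRAFFIC] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]                 # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(p1, p2)]   # blend crossover
            i = rng.randrange(len(child))                   # point mutation
            child[i] = max(0.1, child[i] + rng.gauss(0, 2.0))
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = genetic_optimize()
```

For this separable cost the analytic optimum is interval_i = sqrt(TRAFFIC_i / STALENESS_i), which gives a convenient check on how close the GA gets.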

  7. Towards a theoretically based Group Facilitation Technique for Project Teams

    OpenAIRE

    Witte, E.H.; Engelhardt, Gabriele

    2004-01-01

    A theoretical framework for developing the group facilitation technique PROMOD is presented here. The efficiency of this technique in improving group decision quality is supported by the results of three experimental studies involving different kinds of problem solving tasks. The author points towards the importance of integrating theoretical assumptions, theory testing and basic research with empirical application. Such a compelling strategy can lead to new insights in group performance dyna...

  8. Proposing a wiki-based technique for collaborative essay writing

    OpenAIRE

    Mabel Ortiz Navarrete; Anita Ferreira Cabrera

    2014-01-01

    This paper aims at proposing a technique for students learning English as a foreign language when they collaboratively write an argumentative essay in a wiki environment. A wiki environment and collaborative work play an important role within the academic writing task. Nevertheless, an appropriate and systematic work assignment is required in order to make use of both. In this paper, the technique proposed for writing a collaborative essay mainly attempts to provide the most effective way to ...

  9. EDM COLLABORATIVE MANUFACTURING SYSTEM BASED ON MULTI-AGENT TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    Zhao Wansheng; Zhao Jinzhi; Song Yinghui; Yang Xiaodong

    2003-01-01

    A framework for building an EDM collaborative manufacturing system using multi-agent technology is proposed, supporting organizations characterized by physically distributed, enterprise-wide, heterogeneous intelligent manufacturing systems over the Internet. Expert system theory is introduced. Design, manufacturing, and technological knowledge are shared using artificial intelligence and web techniques by the EDM-CADagent, EDM-CAMagent, and EDM-CAPPagent. System structure, design process, network conditions, realization methods, and other key techniques are discussed. Instances are also introduced to verify feasibility.

  10. Measurement based simulation of microscope deviations for evaluation of stitching algorithms for the extension of Fourier-based alignment

    Science.gov (United States)

    Engelke, Florian; Kästner, Markus; Reithmeier, Eduard

    2013-05-01

    Image stitching is a technique used to measure large surface areas with high resolution while maintaining a large field of view. We work on improving data fusion by stitching in the field of microscopic analysis of technical surfaces for structures and roughness. Guidance errors and imaging errors such as noise cause problems for seamless image fusion of technical surfaces. The optical imaging errors of 3D microscopes, such as confocal microscopes and white-light interferometers, as well as the guidance errors of their automated positioning systems, have been measured to create software that simulates automated measurements of known surfaces with specific deviations, in order to test new stitching algorithms. We measured and incorporated radial image distortion, interferometer reference mirror shape deviations, statistical noise, drift of the positional axes, on-axis accuracy and repeatability of the positioning stages used, and misalignment of the CCD chip with respect to the axes of motion. We used the resulting simulation of the measurement process to test a new image registration technique that allows for the use of correlation of images by fast Fourier transform for small overlaps between single measurements.
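
The registration step above rests on correlating overlapping image regions; the fast Fourier transform merely accelerates that correlation. As a hedged sketch of the underlying idea (not the authors' implementation), the snippet below estimates a one-dimensional shift between two overlapping scan lines by brute-force cross-correlation on synthetic data.

```python
def best_shift(a, b, max_shift):
    """Return the shift of b relative to a that maximizes overlap correlation."""
    def score(shift):
        pairs = [(a[i], b[i - shift]) for i in range(len(a))
                 if 0 <= i - shift < len(b)]
        return sum(x * y for x, y in pairs)
    return max(range(-max_shift, max_shift + 1), key=score)

# Synthetic intensity profile and a copy shifted right by 3 samples.
profile = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0]
shifted = [0, 0, 0] + profile[:-3]
print(best_shift(shifted, profile, 5))  # → 3
```

In 2-D stitching the same score is evaluated over all candidate (dx, dy) offsets at once via an FFT-based correlation, which is what makes small-overlap registration fast.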

  11. MR-based field-of-view extension in MR/PET: B0 homogenization using gradient enhancement (HUGE).

    Science.gov (United States)

    Blumhagen, Jan O; Ladebeck, Ralf; Fenchel, Matthias; Scheffler, Klaus

    2013-10-01

    In whole-body MR/PET, the human attenuation correction can be based on the MR data. However, an MR-based field-of-view (FoV) is limited due to physical restrictions such as B0 inhomogeneities and gradient nonlinearities. Therefore, for large patients, the MR image and the attenuation map might be truncated and the attenuation correction might be biased. The aim of this work is to explore extending the MR FoV through B0 homogenization using gradient enhancement in which an optimal readout gradient field is determined to locally compensate B0 inhomogeneities and gradient nonlinearities. A spin-echo-based sequence was developed that computes an optimal gradient for certain regions of interest, for example, the patient's arms. A significant distortion reduction was achieved outside the normal MR-based FoV. This FoV extension was achieved without any hardware modifications. In-plane distortions in a transaxially extended FoV of up to 600 mm were analyzed in phantom studies. In vivo measurements of the patient's arms lying outside the normal specified FoV were compared with and without the use of B0 homogenization using gradient enhancement. In summary, we designed a sequence that provides data for reducing the image distortions due to B0 inhomogeneities and gradient nonlinearities and used the data to extend the MR FoV. PMID:23203976

  12. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground and a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary-layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer; upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary-layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
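
The abstract does not give the derivation, but the geometry behind combining imaging with a sounded wind profile is simple: a cloud advected at wind speed v that is observed to sweep angular rate ω across a zenith-pointing imager sits at height h ≈ v/ω. A sketch with hypothetical numbers (the values are illustrative, not from the study):

```python
import math

def cloud_base_height(wind_speed_ms, angular_speed_rad_s):
    """Height at which a cloud moving at wind_speed appears to sweep
    angular_speed across a zenith-pointing imager: h = v / omega."""
    return wind_speed_ms / angular_speed_rad_s

# Hypothetical case: 10 m/s wind, a cloud feature crossing 0.5 deg of sky per second.
omega = math.radians(0.5)
h = cloud_base_height(10.0, omega)
print(round(h))  # height in metres
```

In practice the wind speed is read from the sounded profile at the level where the derived height and the profile agree, which is what makes the method self-consistent across cloud levels.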

  13. Chain extension and branching of poly(L-lactic acid) produced by reaction with a DGEBA-based epoxy resin

    Directory of Open Access Journals (Sweden)

    2007-11-01

    Full Text Available Dicarboxylated poly(L-lactic acid) (PLLA) was synthesized by reacting succinic anhydride with an L-lactic acid prepolymer prepared by melt polycondensation. Copolymers of PLLA and an epoxy resin based on diglycidyl ether of bisphenol A (DGEBA) were prepared by chain extension of dicarboxylated PLLA with DGEBA. Infrared spectra confirmed the formation of dicarboxylated PLLA and the PLLA/DGEBA copolymer. The influences of reaction temperature, reaction time, and the amount of DGEBA on the molecular weight and gel content of the PLLA/DGEBA copolymer were studied. The viscosity-average molecular weight of the PLLA/DGEBA copolymer reached 87 900 when the reaction temperature, reaction time, and molar ratio of dicarboxylated PLLA to DGEBA were 150°C, 30 min, and 1:1 respectively, while the gel content of the PLLA/DGEBA copolymer was almost zero.

  14. The Assessment of Comprehensive Vulnerability of Chemical Industrial Park Based on Entropy Method and Matter-element Extension Model

    Directory of Open Access Journals (Sweden)

    Yan Jingyi

    2016-01-01

    Full Text Available This paper studies the connotative meaning, evaluation methods, and models of vulnerability for chemical industrial parks, based on in-depth analysis of relevant research results in China and abroad. It summarizes the features of menacing vulnerability and structural vulnerability and sets out detailed influence factors such as personnel vulnerability, infrastructural vulnerability, environmental vulnerability, and the vulnerability of safety management. A vulnerability scoping diagram is used to establish 21 evaluation indexes and an index system for the vulnerability evaluation of a chemical industrial park. Comprehensive weights are calculated with the entropy method and combined with a matter-element extension model to make the quantitative evaluation; the approach was then successfully applied to evaluate a chemical industrial park. This method provides new ideas and approaches for enhancing the overall safety of chemical industrial parks.

  15. Lepton mass and mixing in a simple extension of the Standard Model based on T7 flavor symmetry

    CERN Document Server

    Vien, V V

    2016-01-01

    A simple Standard Model extension based on $T_7$ flavor symmetry which accommodates lepton mass and mixing with non-zero $\\theta_{13}$ and a CP violation phase is proposed. At tree level, the realistic lepton mass and mixing pattern is derived through spontaneous symmetry breaking by just one vacuum expectation value ($v$), the same as in the Standard Model. Neutrinos get small masses from one $SU(2)_L$ doublet and two $SU(2)_L$ singlets, with one transforming as $\\underline{1}$ and the other two as $\\underline{3}$ and $\\underline{3}^*$ under $T_7$, respectively. The model also gives a remarkable prediction of the Dirac CP violation phase $\\delta_{CP}=172.598^\\circ$ in both the normal and inverted hierarchies, which is still missing in the neutrino mixing matrix.

  16. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    Science.gov (United States)

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the retention of denture bases fabricated by conventional, anchorized, and injection-molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate, with a retention apparatus, the force required to dislodge them. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by injection molding (3740 g) and anchorized techniques (2913 g) recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection-molding polymerization technique exhibited maximum retention, followed by the anchorized technique; the least retention was seen with the conventional molding technique. PMID:27382542

  17. Genotyping human ancient mtDNA control and coding region polymorphisms with a multiplexed Single-Base-Extension assay: the singular maternal history of the Tyrolean Iceman

    Directory of Open Access Journals (Sweden)

    Egarter-Vigl Eduard

    2009-06-01

    Full Text Available Abstract Background Progress in the field of human ancient DNA studies has been severely restricted due to the myriad sources of potential contamination, and because of the pronounced difficulty in identifying authentic results. Improving the robustness of human aDNA results is a necessary pre-requisite to vigorously testing hypotheses about human evolution in Europe, including possible admixture with Neanderthals. This study approaches the problem of distinguishing between authentic and contaminating sequences from common European mtDNA haplogroups by applying a multiplexed Single-Base-Extension assay, containing both control and coding region sites, to DNA extracted from the Tyrolean Iceman. Results The multiplex assay developed for this study was able to confirm that the Iceman's mtDNA belongs to a new European mtDNA clade with a very limited distribution amongst modern data sets. Controlled contamination experiments show that the correct results are returned by the multiplex assay even in the presence of substantial amounts of exogenous DNA. The overall level of discrimination achieved by targeting both control and coding region polymorphisms in a single reaction provides a methodology capable of dealing with most cases of homoplasy prevalent in European haplogroups. Conclusion The new genotyping results for the Iceman confirm the extreme fallibility of human aDNA studies in general, even when authenticated by independent replication. The sensitivity and accuracy of the multiplex Single-Base-Extension methodology forms part of an emerging suite of alternative techniques for the accurate retrieval of ancient DNA sequences from both anatomically modern humans and Neanderthals. The contamination of laboratories remains a pressing concern in aDNA studies, both in the pre and post-PCR environments, and the adoption of a forensic style assessment of a priori risks would significantly improve the credibility of results.

  18. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    Science.gov (United States)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory, and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP and has attributes of universality, as it holds for a broad
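
The NESP power laws referred to above are typically expressed through the Tsallis q-exponential; the sketch below shows that function and its heavy tail. The parameter values are illustrative only, not taken from the study.

```python
def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]^(1/(1-q)); reduces to exp(x) as q -> 1."""
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# For q > 1 the survival function P(>X) = q_exp(-X/X0, q) decays as a power law
# in the tail, which is how NESP generalizes Gutenberg-Richter-type scaling.
q, X0 = 1.5, 10.0
tail = [q_exp(-x / X0, q) for x in (0.0, 10.0, 100.0)]
print(tail)
```

Fitting q (the entropic index) to observed interevent-time or interevent-distance distributions is the usual way such analyses quantify long-range interactions.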

  19. A Predicate Based Fault Localization Technique Based On Test Case Reduction

    Directory of Open Access Journals (Sweden)

    Rohit Mishra

    2015-08-01

    Full Text Available ABSTRACT In today's world, software testing with statistical fault localization techniques is among the most tedious, expensive, and time-consuming activities. In a faulty program, program elements are contrasted by their dynamic spectra to estimate the location of the fault. Coincidental correctness can negatively impact these techniques, because the fault can also be triggered in a non-failed run, and if so it disturbs the assessment of the fault location. In this paper, coincidental correctness is treated as an interference that affects the success of fault localization. We find fault predicates from the distribution overlap of dynamic spectra in failed and non-failed runs, and slacken the area by referencing the inter-class distances of spectra to clamp the less suspicious candidates. We then apply a coverage-matrix-based reduction approach to reduce the test cases of the program and locate the fault. Finally, empirical results show that our technique outperforms existing predicate-based fault localization techniques with test-case reduction.

  20. Lagrangian study of surface transport in the Kuroshio Extension area based on simulation of propagation of Fukushima-derived radionuclides

    CERN Document Server

    Prants, S V; Uleysky, M Yu

    2013-01-01

    A Lagrangian approach is applied to study near-surface large-scale transport in the Kuroshio Extension area using a simulation with synthetic particles advected by the AVISO altimetric velocity field. A material-line technique is applied to find the origin of water masses in cold-core cyclonic rings pinched off from the jet in summer 2011. Tracking and Lagrangian maps provide evidence of cross-jet transport. Fukushima-derived caesium isotopes are used as Lagrangian tracers to study transport and mixing in the area a few months after the March 2011 tsunami that caused heavy damage to the Fukushima nuclear power plant (FNPP). Tracking maps are computed to trace the origin of water parcels with measured levels of Cs-134 and Cs-137 concentrations collected in two R/V cruises in June and July 2011 in a large area of the Northwest Pacific. It is shown that Lagrangian simulation is useful for finding the surface areas that are potentially dangerous due to the risk of radioactive contamination. The results of sim...

  1. RP-based Abrading Technique for Graphite EDM Electrode

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Traditional processes for machining mold cavities are lengthy and costly. EDM (electro-discharge machining) is the most commonly used technique to obtain complex mold cavities. However, some electrodes are difficult to fabricate because of their complexity. By applying RP (rapid prototyping) technology to fabricate an abrading tool which is used to abrade graphite EDM electrodes, the cost and cycle time can be greatly reduced. The paper describes the work being conducted in this area by the authors. This technique will find widespread application in rapid steel mold manufacturing.

  2. FUZZY ENTROPY BASED OPTIMAL THRESHOLDING TECHNIQUE FOR IMAGE ENHANCEMENT

    Directory of Open Access Journals (Sweden)

    U.Sesadri

    2015-06-01

    Full Text Available Soft computing is likely to play a progressively important role in many applications, including image enhancement. The paradigm for soft computing is the human mind. The soft computing critique has been particularly strong with fuzzy logic. Fuzzy logic represents facts as rules for the management of uncertainty. In this paper the multi-dimensional optimization problem is addressed by discussing optimal thresholding using fuzzy entropy for image enhancement. This technique is compared with bi-level and multi-level thresholding, and optimal threshold values are obtained for different levels of speckle-noisy and low-contrast images. The fuzzy method produced better results compared to bi-level and multi-level thresholding techniques.
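
The record does not specify the entropy measure used; as a hedged illustration of the general recipe, the sketch below picks the threshold that minimizes a fuzzy (membership-based) entropy of the histogram, in the spirit of the Huang-Wang measure. The membership function here is a simplified assumption, not the paper's.

```python
import math

def fuzzy_entropy_threshold(pixels, levels=256):
    """Pick the threshold minimizing total fuzzy entropy. A pixel's membership
    is its closeness to the mean of its own class (an illustrative choice)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1

    def entropy(t):
        lo = [(g, c) for g, c in enumerate(hist) if c and g <= t]
        hi = [(g, c) for g, c in enumerate(hist) if c and g > t]
        if not lo or not hi:
            return float("inf")
        m0 = sum(g * c for g, c in lo) / sum(c for g, c in lo)
        m1 = sum(g * c for g, c in hi) / sum(c for g, c in hi)
        total = 0.0
        for g, c in lo + hi:
            m = m0 if g <= t else m1
            mu = 1.0 / (1.0 + abs(g - m) / (levels - 1))  # membership in (0.5, 1]
            if 0.0 < mu < 1.0:
                # Shannon-type fuzzy entropy of the membership value.
                total -= c * (mu * math.log(mu) + (1 - mu) * math.log(1 - mu))
        return total

    return min(range(levels - 1), key=entropy)

# Two well-separated gray populations: the threshold lands between them.
img = [20] * 50 + [25] * 30 + [200] * 60 + [210] * 40
t = fuzzy_entropy_threshold(img)
print(t)
```

Multi-level thresholding repeats the same minimization over pairs (or triples) of thresholds, which is where the multi-dimensional optimization in the abstract comes from.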

  3. An Empirical Comparative Study of Checklist-based and Ad Hoc Code Reading Techniques in a Distributed Groupware Environment

    Directory of Open Access Journals (Sweden)

    Adenike O. Osofisan

    2009-09-01

    Full Text Available Software inspection is a necessary and important tool for software quality assurance. Since it was introduced by Fagan at IBM in 1976, arguments have existed as to which method should be adopted to carry out the exercise, whether it should be paper-based or tool-based, and what reading technique should be used on the inspection document. Extensive work has been done to determine the effectiveness of reviewers in a paper-based environment when using ad hoc and checklist reading techniques. In this work, we take software inspection research further by examining whether there is any significant difference in the defect detection effectiveness of reviewers when they use either ad hoc or checklist reading techniques in a distributed groupware environment. Twenty final-year undergraduate students of computer science, divided into ad hoc and checklist reviewer groups of ten members each, were employed to inspect a medium-sized Java code synchronously on groupware deployed on the Internet. The data obtained were subjected to tests of hypotheses using independent t-tests and correlation coefficients. Results from the study indicate that there are no significant differences in the defect detection effectiveness, effort in terms of time taken in minutes, and false positives reported by the reviewers using either ad hoc or checklist-based reading techniques in the distributed groupware environment studied. Key words: Software Inspection, Ad hoc, Checklist, Groupware.

  4. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    to be applied to more complex pre-interpretations and larger programs. There are two main techniques presented: the first is a novel algorithm for determinising finite tree automata, yielding a compact ``product" form of the transitions of the result automaton, that is often orders of magnitude...

  5. A color-based visualization technique for multielectrode spike trains.

    Science.gov (United States)

    Jurjut, Ovidiu F; Nikolić, Danko; Pipa, Gordon; Singer, Wolf; Metzler, Dirk; Mureşan, Raul C

    2009-12-01

    Multielectrode recordings of neuronal activity provide an overwhelming amount of data that is often difficult to analyze and interpret. Although various methods exist for treating multielectrode datasets quantitatively, there is a particularly prominent lack of techniques that enable quick visual exploration of such datasets. Here, by using Kohonen self-organizing maps, we propose a simple technique that allows for the representation of multiple spike trains through a sequence of color-coded population activity vectors. When multiple color sequences are grouped according to a certain criterion, e.g., by stimulation condition or recording time, one can inspect an entire dataset visually and quickly extract information about the identity, stimulus-locking and temporal distribution of multi-neuron activity patterns. Color sequences can be computed on various time scales, revealing different aspects of the temporal dynamics, and can emphasize high-order correlation patterns that are not detectable with pairwise techniques. Furthermore, this technique is useful for determining the stability of neuronal responses during a recording session. Due to its simplicity and reliance on perceptual grouping, the method is useful both for quick on-line visualization of incoming data and for more detailed post hoc analyses. PMID:19846620

  6. A novel image inpainting technique based on median diffusion

    Indian Academy of Sciences (India)

    Rajkumar L Biradar; Vinayadatt V Kohir

    2013-08-01

    Image inpainting is the technique of filling in missing regions and removing unwanted objects from an image by diffusing pixel information from neighbouring pixels. Image inpainting techniques have long been in use for various applications such as removal of scratches, restoring damaged or missing portions, or removal of objects from images. In this study, we present a simple, yet unexplored, (digital) image inpainting technique using the median filter, one of the most popular nonlinear (order statistics) filters. The median is the maximum likelihood estimate of location for the Laplacian distribution. Hence, the proposed algorithm diffuses the median value of pixels from the exterior area into the inner area to be inpainted. The median filter preserves edges, an important property needed to inpaint edges. This technique is stable. Experimental results show remarkable improvements, and the method works for homogeneous as well as heterogeneous backgrounds. PSNR (quantitative assessment) is used to compare inpainting results.
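
A minimal sketch of the diffusion idea described above, simplified from the authors' algorithm: unknown pixels on the boundary of the damaged region are repeatedly assigned the median of their already-known 8-neighbours until the region is filled.

```python
import statistics

def inpaint_median(img, mask):
    """Fill mask==True pixels by diffusing the median of known 8-neighbours
    inward from the boundary of the damaged region."""
    h, w = len(img), len(img[0])
    img = [row[:] for row in img]
    unknown = {(r, c) for r in range(h) for c in range(w) if mask[r][c]}
    while unknown:
        filled = set()
        for r, c in unknown:
            nbrs = [img[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr or dc) and 0 <= r + dr < h and 0 <= c + dc < w
                    and (r + dr, c + dc) not in unknown]
            if nbrs:
                img[r][c] = statistics.median(nbrs)
                filled.add((r, c))
        if not filled:  # a region with no known neighbours cannot be filled
            break
        unknown -= filled
    return img

# 5x5 image of value 7 with a damaged 3x3 hole in the middle.
img = [[7] * 5 for _ in range(5)]
mask = [[1 <= r <= 3 and 1 <= c <= 3 for c in range(5)] for r in range(5)]
for r in range(1, 4):
    for c in range(1, 4):
        img[r][c] = 0
out = inpaint_median(img, mask)
print(out[2][2])
```

Because the median of the known neighbours is used rather than their mean, a sharp edge crossing the hole boundary is propagated rather than blurred, which is the edge-preservation property the abstract highlights.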

  7. Bit-depth extension using spatiotemporal microdither based on models of the equivalent input noise of the visual system

    Science.gov (United States)

    Daly, Scott J.; Feng, Xiaofan

    2003-01-01

    Continuous tone, or "contone", imagery usually has 24 bits/pixel as a minimum, with eight bits each for the three primaries in typical displays. However, lower-cost displays constrain this number because of various system limitations. Conversely, high quality displays seek to achieve 9-10 bits/pixel/color, though there may be system bottlenecks limited at 8. The two main artifacts from reduced bit-depth are contouring and loss of amplitude detail; these can be prevented by dithering the image prior to these bit-depth losses. Early work in this area includes Roberts' noise modulation technique, Mitsa's blue noise mask, Tyler's technique of bit-stealing, and Mulligan's use of the visual system's spatiotemporal properties for spatiotemporal dithering. However, most halftoning/dithering work was primarily directed to displays at the lower end of bits/pixel (e.g., 1 bit as in halftoning) and higher ppi. Like Tyler, we approach the problem from the higher end of bits/pixel/color, say 6-8, and use available high frequency color content to generate even higher luminance amplitude resolution. Bit-depth extension with a high starting bit-depth (and often lower spatial resolution) changes the game substantially from halftoning experience. For example, complex algorithms like error diffusion and annealing are not needed, just the simple addition of noise. Instead of a spatial dither, it is better to use an amplitude dither, termed microdither by Pappas. We have looked at methods of generating the highest invisible opponent color spatiotemporal noise and other patterns, and have used Ahumada's concept of equivalent input noise to guide our work. This paper will report on techniques and observations made in achieving contone quality on ~100 ppi 6 bits/pixel/color LCD displays with no visible dither patterns, noise, contours, or loss of amplitude detail at viewing distances as close as the near focus limit (~120 mm). These include the interaction of display nonlinearities and
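
The core idea behind Roberts' noise modulation, which underlies the dithering work surveyed above, is that adding small noise before quantization makes the average quantized output track the original signal instead of snapping to the nearest code. A hedged sketch with a hypothetical 8-to-6-bit reduction:

```python
import random

def quantize(v, step):
    # Plain rounding to the quantizer grid: introduces contouring.
    return step * round(v / step)

def dithered_quantize(v, step, rng):
    # Add uniform noise of one quantization step before rounding (subtractive
    # dither omitted for brevity); the *mean* output now tracks v.
    return step * round((v + rng.uniform(-0.5, 0.5) * step) / step)

rng = random.Random(0)
step = 4            # 8-bit values quantized to 6 bits (256 -> 64 levels)
v = 130.5           # a gray level lying between two 6-bit codes

plain = quantize(v, step)
avg = sum(dithered_quantize(v, step, rng) for _ in range(10000)) / 10000
print(plain, round(avg, 1))
```

Spatiotemporal microdither exploits the same averaging, but lets the eye's spatial and temporal integration do the summing instead of an explicit loop.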

  8. Activities of colistin- and minocycline-based combinations against extensive drug resistant Acinetobacter baumannii isolates from intensive care unit patients

    Directory of Open Access Journals (Sweden)

    Li Jian

    2011-04-01

    Full Text Available Abstract Background Extensive drug resistance of Acinetobacter baumannii is a serious problem in the clinical setting. It is therefore important to find active antibiotic combinations that could be effective in the treatment of infections caused by this problematic 'superbug'. In this study, we analyzed the in vitro activities of three colistin-based combinations and a minocycline-based combination against clinically isolated extensively drug-resistant Acinetobacter baumannii (XDR-AB) strains. Methods Fourteen XDR-AB clinical isolates were collected. The clonotypes were determined by polymerase chain reaction-based fingerprinting. Susceptibility testing was carried out according to the standards of the Clinical and Laboratory Standards Institute. Activities of drug combinations were investigated against four selected strains and analyzed by mean survival time over 12 hours (MST12 h) in a time-kill study. Results The time-kill studies indicated that the minimum inhibitory concentration (MIC) of colistin (0.5 or 0.25 μg/mL) completely killed all strains at 2 to 4 hours, but 0.5×MIC colistin showed no bactericidal activity. Meropenem (8 μg/mL), minocycline (1 μg/mL) or rifampicin (0.06 μg/mL) did not show bactericidal activity. However, combinations of colistin at 0.5×MIC (0.25 or 0.125 μg/mL) with each of the above were synergistic and showed bactericidal activity against all test isolates. A combination of meropenem (16 μg/mL) with minocycline (0.5×MIC, 4 or 2 μg/mL) was synergistic against all test isolates, but neither showed bactericidal activity alone. The MST12 h values of the drug combinations (either colistin- or minocycline-based) were significantly shorter than those of the single drugs. Conclusions This study indicates that combinations of colistin/meropenem, colistin/rifampicin, colistin/minocycline and minocycline/meropenem are synergistic in vitro against XDR-AB strains.

  9. Fabrication techniques development of test blanket module based on CLAM

    International Nuclear Information System (INIS)

    The Reduced Activation Ferritic/Martensitic steels (RAFMs) are considered the primary candidate structural material for the DEMO fusion reactor and the first fusion power plant. China Low Activation Martensitic (CLAM) steel, a version of RAFMs, is being developed at ASIPP (Institute of Plasma Physics, Chinese Academy of Sciences), in wide collaboration with many institutes and universities in China and overseas. The designs of the FDS (Fusion Design Study) series of liquid LiPb blankets for fusion reactors and the corresponding Dual Functional Lithium Lead (DFLL) Test Blanket Module (TBM) in the International Thermonuclear Experimental Reactor (ITER) are being carried out at ASIPP, and CLAM steel is chosen as the primary candidate structural material in these designs. The fabrication techniques for the DFLL TBM with CLAM therefore urgently need to be studied in detail. The fabrication of the DFLL TBM mainly includes the manufacturing of the First Wall (FW) and the Cooling Plates (CP), and the joining of the FW and CPs. Currently, solid Hot Isostatic Pressing (HIP) bonding and uniaxial diffusion bonding are the most promising candidate fabrication methods for the FW and CP. Experiments on HIP and uniaxial diffusion bonding of CLAM/CLAM were carried out and good joints were obtained. As for the joining of the FW and CPs, fusion welding techniques such as Tungsten Inert Gas welding, laser welding and Electron Beam welding are candidates. Preliminary experiments on these welding techniques were performed. Simulation of the thermal process by Gleeble 2000 was also carried out. Results of these experiments are summarized and a further R and D plan on blanket fabrication techniques is also stated. (authors)

  10. Assessing the Utility of the Nominal Group Technique as a Consensus-Building Tool in Extension-Led Avian Influenza Response Planning

    Science.gov (United States)

    Kline, Terence R.

    2013-01-01

    The intent of the project described was to apply the Nominal Group Technique (NGT) to achieve a consensus on Avian Influenza (AI) planning in Northeastern Ohio. The Nominal Group Technique is a process first developed by Delbecq, Van de Ven, and Gustafson (1975) to allow all participants to have an equal say in an open forum setting. A very diverse…

  11. Adaptive Landmark-Based Navigation System Using Learning Techniques

    DEFF Research Database (Denmark)

    Zeidan, Bassel; Dasgupta, Sakyasingha; Wörgötter, Florentin;

    2014-01-01

    . Inspired by this, we develop an adaptive landmark-based navigation system based on sequential reinforcement learning. In addition, correlation-based learning is also integrated into the system to improve learning performance. The proposed system has been applied to simulated simple wheeled and more complex...

  12. Wavelet-based techniques for the gamma-ray sky

    Science.gov (United States)

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias; Lee, Samuel K.

    2016-07-01

    We demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
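
The scale separation described above can be illustrated with a single Haar wavelet analysis step (a minimal sketch, not the authors' pipeline): the average channel keeps the large-scale component of a signal while the detail channel isolates structure at the finest scale.

```python
import numpy as np

def haar_step(signal):
    """One level of the Haar wavelet transform for an even-length 1-D signal.

    Returns (averages, details): the averages carry the large-scale
    component, the details carry the small-scale component.
    """
    pairs = np.asarray(signal, dtype=float).reshape(-1, 2)
    avg = pairs.mean(axis=1)
    det = (pairs[:, 0] - pairs[:, 1]) / 2.0
    return avg, det

# A smooth, large-scale "signal" has all its content in the averages:
avg, det = haar_step([1.0, 1.0, 5.0, 5.0])
```

Repeating the step on the averages yields a full multi-scale decomposition; 2-D sky maps apply the same idea along both axes (or use dedicated spherical wavelets).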

  13. Wavelet-Based Techniques for the Gamma-Ray Sky

    CERN Document Server

    McDermott, Samuel D; Cholis, Ilias; Lee, Samuel K

    2015-01-01

    We demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.

  14. An Empirical Evaluation of Density-Based Clustering Techniques

    OpenAIRE

    Glory H. Shah; C K Bhensdadia; Amit P. Ganatra

    2012-01-01

    Emergence of modern techniques for scientific data collection has resulted in large scale accumulation of data pertaining to diverse fields. Conventional database querying methods are inadequate to extract useful information from huge data banks. Cluster analysis is one of the major data analysis methods. It is the art of detecting groups of similar objects in large data sets without having specified groups by means of explicit features. The problem of detecting clusters of points is challeng...
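
The density-based family surveyed above can be made concrete with a minimal DBSCAN-style sketch (naive O(n²) neighbor search, an illustration only, not any of the evaluated implementations): points with at least `min_pts` neighbors within radius `eps` seed clusters, which grow through density-connected neighbors; everything else is noise.

```python
import numpy as np

def region_query(points, i, eps):
    """Indices of all points within distance eps of point i (including i)."""
    return [j for j in range(len(points))
            if np.linalg.norm(points[i] - points[j]) <= eps]

def dbscan(points, eps, min_pts):
    """Label each point with a cluster id (0, 1, ...) or -1 for noise."""
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = region_query(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1  # provisionally noise; may later become a border point
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point: reachable but not core
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neighbors = region_query(points, j, eps)
            if len(j_neighbors) >= min_pts:  # j is a core point: keep expanding
                seeds.extend(j_neighbors)
    return labels
```

Two tight groups plus a distant point yield two clusters and one noise label, with no cluster count specified in advance.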

  15. EFFECTIVENESS OF TEST CASE PRIORITIZATION TECHNIQUES BASED ON REGRESSION TESTING

    Directory of Open Access Journals (Sweden)

    Thillaikarasi Muthusamy

    2014-12-01

    Full Text Available Regression testing concentrates on finding defects after a major code change has occurred. Specifically, it exposes software regressions, or old bugs that have reappeared. It is an expensive testing process that has been estimated to account for almost half of the cost of software maintenance. To improve the regression testing process, test case prioritization techniques organize the execution order of test cases. Further, prioritization gives an improved rate of fault identification when test suites cannot run to completion.

  16. A Novel Histogram Based Robust Image Registration Technique

    OpenAIRE

    Karthikeyan, V.

    2014-01-01

    In this paper, a method for Automatic Image Registration (AIR) through histogram is proposed. Automatic image registration is one of the crucial steps in the analysis of remotely sensed data. A new acquired image must be transformed, using image registration techniques, to match the orientation and scale of previous related images. This new approach combines several segmentations of the pair of images to be registered. A relaxation parameter on the histogram modes delineation is introduced. I...

  17. Failure Mechanism of Rock Bridge Based on Acoustic Emission Technique

    OpenAIRE

    Guoqing Chen; Yan Zhang; Runqiu Huang; Fan Guo; Guofeng Zhang

    2015-01-01

    Acoustic emission (AE) technique is widely used in various fields as a reliable nondestructive examination technology. Two experimental tests were carried out in a rock mechanics laboratory, which include (1) small scale direct shear tests of rock bridge with different lengths and (2) large scale landslide model with locked section. The relationship of AE event count and record time was analyzed during the tests. The AE source location technology and comparative analysis with its actual failu...

  18. FUZZY ENTROPY BASED OPTIMAL THRESHOLDING TECHNIQUE FOR IMAGE ENHANCEMENT

    OpenAIRE

    U.Sesadri; B. Siva Sankar; C. Nagaraju

    2015-01-01

    Soft computing is likely to play a progressively important role in many applications, including image enhancement. The paradigm for soft computing is the human mind. The soft computing critique has been particularly strong with fuzzy logic. Fuzzy logic represents facts as rules for the management of uncertainty. In this paper the multi-dimensional optimization problem is addressed by discussing optimal thresholding using fuzzy entropy for image enhancement. This technique is compared with bi...

  19. An Automatic Technique for MRI Based Murine Abdominal Fat Measurement

    Directory of Open Access Journals (Sweden)

    R. A. Moats

    2011-12-01

    Full Text Available Because of the well-known relationship between obesity and the high incidence of disease, fat-related research using mouse models is being widely pursued in preclinical experiments. In the present study, we developed a technique to automatically measure murine abdominal adipose volume and determine the depot locations using Magnetic Resonance Imaging (MRI). Our technique includes an innovative method to detect fat tissues from MR images which not only utilizes the T1-weighted intensity information, but also takes advantage of the transverse relaxation time (T2) calculated from the multiple-echo data. The technique comprises both a fat-optimized MRI acquisition protocol that works well at 7T and a newly designed post-processing methodology that can automatically accomplish the fat extraction and depot recognition without user intervention in the segmentation procedure. The post-processing methodology has been integrated into easy-to-use software that we have made available via free download. The method was validated by comparing automated results with two independent manual analyses in 26 mice exhibiting different fat ratios from the obesity research project. The comparison confirms a close agreement between the results in total adipose tissue size and voxel-by-voxel overlaps.

  20. A Novel Threshold Estimation Based Face Recognition Technique

    Directory of Open Access Journals (Sweden)

    Aparna Tiwari

    2015-12-01

    Full Text Available In recent times biometric-based authentication has gained a lot of attention due to its advantages. Traditional approaches are PIN- or password-based, which are now not so secure due to hacking etc.; moreover, a password or PIN can be stolen. To counteract such problems, a biometric identifier can be used which is unique to each user. In this context, face and fingerprint are the most preferred biometrics. In this paper, a face recognition algorithm based on Principal Component Analysis (PCA), which is very well known in face recognition, is discussed. This method is based on eigenvalues and eigenvectors: only the principal components are considered, and components away from the principal axes are discarded, so the dimensionality reduces significantly. A major problem with biometrics is the selection of a proper threshold, which is in general chosen heuristically. In this paper a formula based on the eigenvalues is derived, and the obtained results improve significantly.
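
The PCA dimension-reduction step described above can be sketched as follows (the paper's eigenvalue-based threshold formula is not reproduced here; this shows only the standard projection step, computed via SVD of the centered data):

```python
import numpy as np

def pca_fit(X, k):
    """Return the sample mean and the top-k principal axes of X (rows = samples)."""
    mu = X.mean(axis=0)
    # Rows of Vt are the principal directions, ordered by singular value.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def pca_project(X, mu, axes):
    """Project samples onto the retained principal axes (dimension reduction)."""
    return (X - mu) @ axes.T
```

For face recognition the rows of X are vectorized face images and the retained axes are the "eigenfaces"; a probe face is accepted when its distance in the projected space falls below the chosen threshold.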

  1. Applying Knowledge-Based Techniques to Software Development.

    Science.gov (United States)

    Harandi, Mehdi T.

    1986-01-01

    Reviews overall structure and design principles of a knowledge-based programming support tool, the Knowledge-Based Programming Assistant, which is being developed at University of Illinois Urbana-Champaign. The system's major units (program design, program coding, and intelligent debugging) and additional functions are described. (MBR)

  2. A Secured Communication Based On Knowledge Engineering Technique

    OpenAIRE

    M. W. Youssef; Hazem El-Gendy

    2012-01-01

    Communication security has become the keynote of the "e" world. Industries like eComm, eGov were built on the technology of computer networks. Those industries cannot afford security breaches. This paper presents a methodology of securing computer communication based on identifying typical communication behavior of each system user based on the dominant set of protocols utilized between the network nodes.

  3. FP-Growth Based New Normalization Technique for Subgraph Ranking

    Directory of Open Access Journals (Sweden)

    E.R.Naganathan

    2011-03-01

    Full Text Available The various problems in large-volume data areas have been solved using frequent itemset discovery algorithms. As data mining techniques are being introduced and widely applied to non-traditional itemsets, existing approaches for finding frequent itemsets are out of date, as they cannot satisfy the requirements of these domains. Hence, an alternative way of modeling the objects in such data sets is the graph. Modeling objects using graphs allows us to represent arbitrary relations among entities. The graph is used to model the database objects; within that model, the problem of finding frequent patterns becomes that of finding subgraphs that occur frequently over the entire set of graphs. In this paper, we present an efficient algorithm for ranking such frequent subgraphs. The proposed ranking method is applied to the FP-growth method for discovering frequent subgraphs. In order to rank the subgraphs, we present a new normalization technique: a modified normalization applied at each position for a chosen value of the Discounted Cumulative Gain (DCG) of a subgraph. Instead of DCG, a modified approach called Modified Discounted Cumulative Gain (MDCG) is introduced. The MDCG alone cannot be used to compare performance from one query to the next in a search engine's algorithm; to obtain the new normalization, an ideal ordering of the MDCG (IMDCG) at each position is found. The MDCG is calculated using "lift" as a new approach, and the IMDCG is also evaluated. The normalized values are then computed. Finally, the values for all rules can be averaged to give the average performance of a ranking algorithm, and the ordering of the obtained values at each position provides the order of evaluation of rules, which in turn gives an efficient ranking of mined subgraphs.
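
The paper's MDCG/IMDCG variants (built on "lift") are not specified in full above, but the standard DCG and its ideal-normalized form that they modify can be sketched as:

```python
import math

def dcg(rels):
    """Discounted cumulative gain of a ranked list of relevance scores."""
    # Rank i (0-based) is discounted by log2(i + 2), i.e. log2(rank + 1).
    return sum(r / math.log2(i + 2) for i, r in enumerate(rels))

def ndcg(rels):
    """DCG normalized by the ideal (descending) ordering, giving a value in [0, 1]."""
    ideal = dcg(sorted(rels, reverse=True))
    return dcg(rels) / ideal if ideal > 0 else 0.0
```

The paper replaces the relevance score with lift and normalizes the MDCG by the IMDCG analogously to how nDCG normalizes by the ideal DCG.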

  4. Indirect Fluorescent Antibody Technique based Prevalence of Surra in Equines

    Directory of Open Access Journals (Sweden)

    Ahsan Nadeem, Asim Aslam, Zafar Iqbal Chaudhary, Kamran Ashraf, Khalid Saeed, Nisar Ahmad, Ishtiaq Ahmed and Habib ur Rehman

    2011-04-01

    Full Text Available This project was carried out to find the prevalence of trypanosomiasis in equines in District Gujranwala by using the indirect fluorescent antibody technique and the thin-smear method. Blood samples were collected from a total of 200 horses and donkeys of different ages and either sex. Duplicate thin blood smears were prepared from each sample and the remaining blood was centrifuged to separate the serum. Smears from each animal were processed for Giemsa staining and the indirect fluorescent antibody test (IFAT). Giemsa-stained smears revealed Trypanosoma infection in 4/200 (2.0%) samples and IFAT in 12/200 (6.0%) animals.

  5. HPS Electronic Ballast Based on CIC-CPPFC Technique

    Institute of Scientific and Technical Information of China (English)

    王卫; 苏勤; 高国安

    2002-01-01

    Investigates the application of CIC-CPPFC techniques to high-pressure sodium (HPS) lamp electronic ballast. In order to ensure a unity power factor, different power electronic ballasts are studied by PSpice simulation. A dynamic model of HPS lamp with simple and accurate features is proposed for further study of characteristics. Experimental results verify the feasibility of HPS lamp operating at high frequency. It is shown that the presented electronic ballast has 0.99 power factor and 9% total harmonic distortion (THD).

  6. GPU-Based Techniques for Global Illumination Effects

    CERN Document Server

    Szirmay-Kalos, László; Sbert, Mateu

    2008-01-01

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. This book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make this book self-contained, the most important c

  7. Hiding of Speech based on Chaotic Steganography and Cryptography Techniques

    Directory of Open Access Journals (Sweden)

    Abbas Salman Hameed

    2015-04-01

    Full Text Available Steganography is the technique of embedding secret information into cover media, such as image, video, audio and text, so that only the sender and the authorized recipient who have a key can detect the presence of the secret information. In this paper, steganography and cryptography techniques for speech are presented with chaos. Fractional-order Lorenz and Chua systems, which provide an expanded key space, are used to encrypt the speech message. The large key space, in addition to the randomness and nonlinearity possessed by these chaotic systems, ensures high robustness and security for the cryptography process. As well, a Modified Arnold Cat Map (MACM) offers additional space and security for the steganography process. The irregular outputs of the MACM are used in this paper to embed a secret message in a digital cover image. The results show a large key sensitivity to a small change in the secret key or the parameters of the MACM. Therefore, highly secure hiding of speech is guaranteed by using this system

  8. New techniques in ground-based ionospheric sounding and studies

    Science.gov (United States)

    Reinisch, Bodo W.

    1986-05-01

    Rapid progress in the integrated circuit market has led to new advanced techniques in the remote probing of the ionosphere with HF radio waves. The classical ionosonde, which measured virtual height as a function of frequency, expanded into a geophysical research tool by measuring all the observables contained in the electromagnetic signals reflected from the ionosphere: amplitude, phase, Doppler, incidence angle, and polarization. A receiving antenna array and high speed digital processing provide the desired spatial and temporal resolution. The current emphasis is on both the on-line and off-line postprocessing of the multiparameter ionogram data to extract the geophysically important ionospheric characteristics: the vertical electron density profiles, horizontal gradients (tilts and waves), plasma drift, the mid-latitude F region trough, and auroral and equatorial spread F. Digital ionosondes deployed in the polar cap and the auroral zone have helped to obtain a better understanding of some of the high-latitude features, and measurements of the equatorial spread F have shown the development and motion of the F region bubbles. HF coherent radar techniques for studying ionospheric irregularity structures measure the velocity of irregularities with scale sizes of one half the radio wavelength. They have mainly been used in the northern auroral zone.

  9. Computer-vision-based registration techniques for augmented reality

    Science.gov (United States)

    Hoff, William A.; Nguyen, Khoi; Lyon, Torsten

    1996-10-01

    Augmented reality is a term used to describe systems in which computer-generated information is superimposed on top of the real world; for example, through the use of a see-through head-mounted display. A human user of such a system could still see and interact with the real world, but have valuable additional information, such as descriptions of important features or instructions for performing physical tasks, superimposed on the world. For example, the computer could identify real-world objects and overlay them with graphic outlines, labels, and schematics. The graphics are registered to the real-world objects and appear to be 'painted' onto those objects. Augmented reality systems can be used to make productivity aids for tasks such as inspection, manufacturing, and navigation. One of the most critical requirements for augmented reality is to recognize and locate real-world objects with respect to the person's head. Accurate registration is necessary in order to overlay graphics accurately on top of the real-world objects. At the Colorado School of Mines, we have developed a prototype augmented reality system that uses head-mounted cameras and computer vision techniques to accurately register the head to the scene. The current system locates and tracks a set of pre-placed passive fiducial targets placed on the real-world objects. The system computes the pose of the objects and displays graphics overlays using a see-through head-mounted display. This paper describes the architecture of the system and outlines the computer vision techniques used.

  10. A novel fast full inversion based breast ultrasound elastography technique

    International Nuclear Information System (INIS)

    Cancer detection and classification have been the focus of many imaging and therapeutic research studies. Elastography is a non-invasive technique to visualize suspicious soft tissue areas where tissue stiffness is used as image contrast mechanism. In this study, a breast ultrasound elastography system including software and hardware is proposed. Unlike current elastography systems that image the tissue strain and present it as an approximation to relative tissue stiffness, this system is capable of imaging the breast absolute Young’s modulus in fast fashion. To improve the quality of elastography images, a novel system consisting of two load cells has been attached to the ultrasound probe. The load cells measure the breast surface forces to be used for calculating the tissue stress distribution throughout the breast. To facilitate fast imaging, this stress calculation is conducted by an accelerated finite element method. Acquired tissue displacements and surface force data are used as input to the proposed Young’s modulus reconstruction technique. Numerical and tissue mimicking phantom studies were conducted for validating the proposed system. These studies indicated that fast imaging of breast tissue absolute Young’s modulus using the proposed ultrasound elastography system is feasible. The tissue mimicking phantom study indicated that the system is capable of providing reliable absolute Young’s modulus values for both normal tissue and tumour as the maximum Young’s modulus reconstruction error was less than 6%. This demonstrates that the proposed system has a good potential to be used for clinical breast cancer assessment. (paper)

  11. Study of systems and techniques for data base management

    Science.gov (United States)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low-cost computer systems for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  12. An Empirical Analysis Over the Four Different Feature-Based Face and Iris Biometric Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Deepak Sharma

    2012-10-01

    Full Text Available Recently, multimodal biometric systems have been widely accepted; they show increased accuracy and population coverage while reducing vulnerability to spoofing. The main feature of multimodal biometrics is the amalgamation of data from different biometric modalities at the feature extraction, matching score, or decision level. Recently, many works have been presented in the literature on multi-modal biometric recognition. In this paper, we present a comparative analysis of four different feature extraction approaches: LBP, LGXP, EMD and PCA. The main steps involved in these four approaches are: (1) feature extraction from the face image, (2) feature extraction from the iris image and (3) fusion of the face and iris features. The performance of the feature extraction methods in multi-modal recognition is analyzed using FMR and FNMR to study the recognition behavior of these approaches. An extensive analysis is then carried out to find the effectiveness of the different approaches using two different databases. The experimental results show the equal error rates of the different feature extraction approaches in multi-modal biometric recognition. From the ROC curves plotted, the performance of the LBP and LGXP methods is better than that of the PCA-based technique.
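
Of the four feature extractors compared, LBP is the simplest to sketch. A basic 3×3 local binary pattern operator (illustrative only; not the paper's exact variant, and LGXP adds Gabor phase on top of this idea) compares each pixel's eight neighbors to the center and packs the results into an 8-bit code:

```python
import numpy as np

def lbp(img):
    """Basic 3x3 local binary pattern codes for the interior pixels of img."""
    # Neighbor offsets, clockwise starting from the top-left neighbor.
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            center = img[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offs):
                if img[i + di, j + dj] >= center:
                    code |= 1 << bit  # set bit when neighbor >= center
            codes[i - 1, j - 1] = code
    return codes
```

In face recognition, histograms of these codes over image sub-regions are concatenated into the feature vector that is then matched or fused with the iris features.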

  13. Soft Computing Technique Based Enhancement of Transmission System Loadability Incorporating FACTS

    Directory of Open Access Journals (Sweden)

    T. Vara Prasad

    2014-07-01

    Full Text Available Due to the growth of electricity demand and transactions in power markets, existing power networks need to be enhanced in order to increase their loadability. The problem of determining the best locations for network reinforcement can be formulated as a mixed discrete-continuous nonlinear optimization problem (MDCP). The complexity of the problem makes extensive simulations necessary, and the computational requirement is high. This paper compares the effectiveness of Evolutionary Programming (EP) with an ordinal optimization (OO) technique proposed to solve the MDCP involving two types of flexible AC transmission system (FACTS) devices, namely the static var compensator (SVC) and the thyristor-controlled series compensator (TCSC), for system loadability enhancement. In this approach, crude models are proposed to cope with the complexity of the problem and speed up the simulations with high alignment confidence. Testing and validation of the proposed algorithm are conducted on the IEEE 14-bus system and a 22-bus Indian system. Simulation results show that the proposed models permit the use of the OO-based approach for finding good-enough solutions with less computational effort.

  14. Surgical technique for repair of complex anterior skull base defects

    Directory of Open Access Journals (Sweden)

    Kevin Reinard

    2015-01-01

    Conclusion: The layered reconstruction of large anterior cranial fossa defects resulted in postoperative CSF leak in only 5% of the patients and represents a simple and effective closure option for skull base surgeons.

  15. Hybrid-Based Compressed Domain Video Fingerprinting Technique

    OpenAIRE

    Abbass S. Abbass; Aliaa A. A. Youssif; Atef Z. Ghalwash

    2012-01-01

    Video fingerprinting is a newer research area. It is also called “content-based video copy detection” or “content-based video identification” in literature. The goal is to locate videos with segments substantially identical to segments of a query video while tolerating common artifacts in video processing. Its value as a tool to curb piracy and legally monetize contents becomes more and more apparent in recent years with the wide spread of Internet videos through user generated content (UGC)...

  16. A Secured Communication Based On Knowledge Engineering Technique

    Directory of Open Access Journals (Sweden)

    M. W. Youssef

    2012-10-01

    Full Text Available Communication security has become the keynote of the "e" world. Industries like eComm, eGov were built on the technology of computer networks. Those industries cannot afford security breaches. This paper presents a methodology of securing computer communication based on identifying typical communication behavior of each system user based on the dominant set of protocols utilized between the network nodes.

  17. A Robust Non-Blind Watermarking Technique for Color Video Based on Combined DWT-DFT Transforms and SVD Technique

    Directory of Open Access Journals (Sweden)

    Nandeesh B

    2014-08-01

    Full Text Available The rise in popularity of digital video in the past decade has been tremendous, leading to malicious copying and distribution. The need to preserve ownership and tackle copyright issues has therefore become imminent, and digital video watermarking has existed as a solution for this. The paper proposes a non-blind watermarking technique based on combined DWT-DFT transforms using the singular values of the SVD matrix in the YCbCr color space. The technique uses the Fibonacci series for the selection of frames to enhance security while maintaining the quality of the original video. Watermark encryption is done by scrambling the watermark using the Arnold transform. Geometric and non-geometric attacks on the watermarked video have been performed to test the robustness of the proposed technique. The quality of the watermarked video is measured using PSNR, while NC gives the similarity between the extracted and the original watermark.
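
The Arnold-transform scrambling step mentioned above can be sketched for a square watermark image (classical Arnold cat map; a minimal illustration, not the paper's full embedding pipeline). The map is a bijection, the iteration count acts as part of the secret key, and because the map is periodic the watermark is recovered by completing the remaining iterations of the period:

```python
import numpy as np

def arnold_cat(img):
    """One iteration of the Arnold cat map on a square array (a bijection)."""
    n = img.shape[0]
    out = np.empty_like(img)
    for x in range(n):
        for y in range(n):
            # (x, y) -> (x + y, x + 2y) mod n; det = 1, so pixels are permuted.
            out[(x + y) % n, (x + 2 * y) % n] = img[x, y]
    return out

def scramble(img, k):
    """Apply k cat-map iterations; k serves as a secret scrambling key."""
    for _ in range(k):
        img = arnold_cat(img)
    return img
```

Descrambling applies the map (period − k) more times, so only a holder of k can restore the watermark cheaply.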

  18. A Comparative Analysis of Density Based Clustering Techniques for Outlier Mining

    OpenAIRE

    R.Prabahari*,; Dr.V.Thiagarasu

    2014-01-01

    Density based Clustering Algorithms such as Density Based Spatial Clustering of Applications with Noise (DBSCAN), Ordering Points to Identify the Clustering Structure (OPTICS) and DENsity based CLUstering (DENCLUE) are designed to discover clusters of arbitrary shape. DBSCAN grows clusters according to a density based connectivity analysis. OPTICS, which is an extension of DBSCAN used to produce clusters ordering obtained by setting range of parameter. DENCLUE clusters object ...

  19. An Automated Sorting System Based on Virtual Instrumentation Techniques

    Directory of Open Access Journals (Sweden)

    Rodica Holonec

    2008-07-01

    Full Text Available The application presented in this paper is an experimental model of an automated sorting system for pieces of the same shape but different sizes and/or colors. The classification is made according to two features: the color and the weight of the pieces. The system is a complex combination of NI Vision hardware and software tools, strain gauge transducers, signal conditioning connected to data acquisition boards, and motion and control elements. The system is very useful for students to learn and experiment with different virtual instrumentation techniques in order to be able to develop a large field of applications, from inspection and process control to sorting and assembly.

  20. Tornado wind-loading requirements based on risk assessment techniques

    International Nuclear Information System (INIS)

    Regulations require that nuclear power plants be protected from tornado winds. If struck by a tornado, a plant must be capable of safely shutting down and removing decay heat. Probabilistic techniques are used to show that risk to the public from the US Department of Energy (DOE) SP-100 reactor is acceptable without tornado hardening parts of the secondary system. Relaxed requirements for design wind loadings will result in significant cost savings. To demonstrate an acceptable level of risk, this document examines tornado-initiated accidents. The two tornado-initiated accidents examined in detail are loss of cooling resulting in core damage and loss of secondary system boundary integrity leading to sodium release. Loss of core cooling is analyzed using fault/event tree models. Loss of secondary system boundary integrity is analyzed by comparing the consequences to acceptance criteria for the release of radioactive material or alkali metal aerosol

  1. Tornado wind-loading requirements based on risk assessment techniques

    International Nuclear Information System (INIS)

    Regulations require that nuclear power plants be protected from tornado winds. If struck by a tornado, a plant must be capable of safely shutting down and removing decay heat. Probabilistic techniques are used to show that risk to the public from the US Department of Energy (DOE) SP-100 reactor is acceptable without tornado hardening parts of the secondary system. Relaxed requirements for design wind loadings will result in significant cost savings. To demonstrate an acceptable level of risk, this document examines tornado-initiated accidents. The two tornado-initiated accidents examined in detail are loss of cooling resulting in core damage and loss of secondary system boundary integrity leading to sodium release. Loss of core cooling is analyzed using fault/event tree models. Loss of secondary system boundary integrity is analyzed by comparing the consequences to acceptance criteria for the release of radioactive material or alkali metal aerosol. 4 refs., 4 figs

  2. Quartile Clustering: A quartile based technique for Generating Meaningful Clusters

    CERN Document Server

    Goswami, Saptarsi

    2012-01-01

    Clustering is one of the main tasks in exploratory data analysis and descriptive statistics where the main objective is partitioning observations in groups. Clustering has a broad range of application in varied domains like climate, business, information retrieval, biology, psychology, to name a few. A variety of methods and algorithms have been developed for clustering tasks in the last few decades. We observe that most of these algorithms define a cluster in terms of value of the attributes, density, distance etc. However these definitions fail to attach a clear meaning/semantics to the generated clusters. We argue that clusters having understandable and distinct semantics defined in terms of quartiles/halves are more appealing to business analysts than the clusters defined by data boundaries or prototypes. On the same premise, we propose our new algorithm named as quartile clustering technique. Through a series of experiments we establish efficacy of this algorithm. We demonstrate that the quartile clusteri...
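
The quartile-defined cluster semantics argued for above can be sketched on a single attribute (a minimal illustration, not the authors' full algorithm): each observation falls into one of four bins bounded by the attribute's quartiles, so the resulting group labels read directly as "bottom quartile", "second quartile", and so on.

```python
import statistics

def quartile_bins(values):
    """Assign each value to a quartile-defined bin 0..3 of its attribute."""
    # Three cut points splitting the data into four quartile groups.
    q1, q2, q3 = statistics.quantiles(values, n=4)

    def bin_of(v):
        if v <= q1:
            return 0
        if v <= q2:
            return 1
        if v <= q3:
            return 2
        return 3

    return [bin_of(v) for v in values]
```

Combining such per-attribute bins across attributes yields clusters like "bottom quartile of income, top quartile of spend", which carry direct business meaning, unlike a centroid.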

  3. Development of the accelerator-based technique for hadron therapy

    International Nuclear Information System (INIS)

    Hadron therapy with protons and carbon ions is one of the most effective branches of radiation oncology. It has advantages over therapy using gamma radiation and electron beams. Fifty thousand patients per year in Russia need such treatment. A review is given of the main modern trends in the development of accelerators for therapy and of treatment techniques concerned with respiratory-gated irradiation and scanning with intensity-modulated pencil beams. The main stages of beam forming, the time structure and the main parameters of the beams used in proton therapy, as well as the requirements on medical accelerators, are considered. The main results of testing with the beam of the C235-V3 cyclotron for the first Russian specialized hospital proton therapy center in Dimitrovgrad are presented. The use of superconducting accelerators and gantry systems for hadron therapy is considered

  4. Innovative instrumentation for VVERs based on non-invasive techniques

    International Nuclear Information System (INIS)

    Nuclear power plants such as VVERs can greatly benefit from innovative instrumentation to improve plant safety and efficiency. In recent years innovative instrumentation has been developed for PWRs with the aim of providing additional measurements of physical parameters on the primary and secondary circuits: the addition of new instrumentation is made possible by using non-invasive techniques such as ultrasonics and radiation detection. These innovations can be adapted for upgrading VVERs presently in operation and also in future VVERs. The following innovative instrumentation for the control, monitoring or testing at VVERs is described: 1. instrumentation for more accurate primary side direct measurements (for a better monitoring of the primary circuit); 2. instrumentation to monitor radioactivity leaks (for a safer plant); 3. instrumentation-related systems to improve the plant efficiency (for a cheaper kWh)

  5. A fast Stokes inversion technique based on quadratic regression

    Science.gov (United States)

    Teng, Fei; Deng, Yuan-Yong

    2016-05-01

    Stokes inversion calculation is a key process in resolving polarization information on radiation from the Sun and obtaining the associated vector magnetic fields. Even in the cases of simple local thermodynamic equilibrium (LTE) and where the Milne-Eddington approximation is valid, the inversion problem may not be easy to solve. The initial values for the iterations are important in handling cases with multiple minima. In this paper, we develop a fast inversion technique without iterations. The computation takes only 1/100 of the time required by the iterative algorithm. In addition, it can provide usable initial values even in cases with lower spectral resolution. This strategy is useful for a filter-type Stokes spectrograph, such as SDO/HMI and the developed two-dimensional real-time spectrograph (2DS).
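    The closed-form, non-iterative flavour of the approach can be illustrated with a direct least-squares quadratic fit (this shows only the regression machinery; the paper's actual mapping from Stokes profiles to magnetic field parameters is not reproduced here):

```python
def quadratic_fit(xs, ys):
    """Direct least-squares fit of y = a*x^2 + b*x + c -- no iterations.

    Solves the 3x3 normal equations in closed form via Cramer's rule.
    """
    n = len(xs)
    # Power sums for the normal-equation matrix.
    s1 = sum(xs); s2 = sum(x ** 2 for x in xs)
    s3 = sum(x ** 3 for x in xs); s4 = sum(x ** 4 for x in xs)
    t0 = sum(ys)
    t1 = sum(x * y for x, y in zip(xs, ys))
    t2 = sum(x * x * y for x, y in zip(xs, ys))
    M = [[s4, s3, s2], [s3, s2, s1], [s2, s1, n]]
    rhs = [t2, t1, t0]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    def col_replaced(j):
        return [[rhs[i] if k == j else M[i][k] for k in range(3)]
                for i in range(3)]

    d = det3(M)
    a = det3(col_replaced(0)) / d
    b = det3(col_replaced(1)) / d
    c = det3(col_replaced(2)) / d
    return a, b, c

# Recover a noise-free parabola exactly: y = 2x^2 - 3x + 1
xs = [-2.0, -1.0, 0.0, 1.0, 2.0, 3.0]
ys = [2 * x * x - 3 * x + 1 for x in xs]
a, b, c = quadratic_fit(xs, ys)
```

    Because the solution is a single closed-form solve, there is no dependence on initial values, which is the property the paper exploits.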

  6. Development, improvement and calibration of neutronic reaction rate measurements: elaboration of a base of standard techniques

    International Nuclear Information System (INIS)

    In order to improve and validate neutronic calculation schemes, refining integral measurements of neutronic parameters is necessary. This thesis focuses on the conception, improvement and development of neutronic reaction rate measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first deals with direct measurements by fission chambers. A short presentation of the different usual techniques is given; these are then applied through the example of doubling-time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices for fission chambers are developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high-flux neutron generator and based on discrimination of neutron energy with a time-of-flight method. This second device will soon allow the mass of fission chambers to be measured with a precision of about 1 %. Finally, the necessity of these calibrations is shown through spectral index measurements in the MISTRAL 1 (UO2) and MISTRAL 2 (MOX) cores of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, are validated. The second subject aims at developing a method for measuring the modified conversion ratio of 238U (defined as the ratio of the 238U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1 %. The extension of this technique to future modified conversion ratio measurements for 242Pu (on MOX rods) and 232Th (on Thorium rods

  7. Practical Network-Based Techniques for Mobile Positioning in UMTS

    Directory of Open Access Journals (Sweden)

    Borkowski Jakub

    2006-01-01

    Full Text Available This paper presents results of research on network-based positioning for UMTS (universal mobile telecommunication system). Two new applicable network-based cellular location methods are proposed and assessed by field measurements and simulations. The obtained results indicate that estimation of the position at a sufficient accuracy for most of the location-based services does not have to involve significant changes in the terminals and in the network infrastructure. In particular, regular UMTS terminals can be used in the presented PCM (pilot correlation method), while the other proposed method - the ECID+RTT (cell identification + round trip time) - requires only minor software updates in the network and user equipment. The performed field measurements of the PCM reveal that in an urban network, of users can be located with an accuracy of m. In turn, simulations of the ECID+RTT report accuracy of m– m for of the location estimates in an urban scenario.

  8. Wavelet packet transform-based robust video watermarking technique

    Indian Academy of Sciences (India)

    Gaurav Bhatnagar; Balasubrmanian Raman

    2012-06-01

    In this paper, a wavelet packet transform (WPT)-based robust video watermarking algorithm is proposed. A visible meaningful binary image is used as the watermark. First, frames are extracted from the video clip in sequence. Then, WPT is applied to each frame, and from each orientation one sub-band, called the robust sub-band, is selected based on its block mean intensity value. The watermark is embedded in the robust sub-bands based on the relationship between each wavelet packet coefficient and its 8-neighbour $(D_8)$ coefficients, considering both robustness and invisibility. Experimental results and comparison with existing algorithms show the robustness and the better performance of the proposed algorithm.
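    The embedding idea described (comparing a coefficient with its 8 neighbours) can be sketched as follows; the margin `delta` and the simple greater/less-than rule are our illustrative assumptions, not the paper's exact WPT-domain scheme:

```python
def neighbour_mean(coeffs, i, j):
    """Mean of the 8 neighbours (D8) of coefficient (i, j)."""
    vals = [coeffs[i + di][j + dj]
            for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if not (di == 0 and dj == 0)]
    return sum(vals) / 8.0

def embed_bit(coeffs, i, j, bit, delta=2.0):
    """Push the coefficient above/below its neighbour mean to encode one bit."""
    m = neighbour_mean(coeffs, i, j)
    coeffs[i][j] = m + delta if bit else m - delta

def extract_bit(coeffs, i, j):
    """Recover the bit by repeating the neighbour-mean comparison."""
    return 1 if coeffs[i][j] > neighbour_mean(coeffs, i, j) else 0

# Toy 3x3 "sub-band" of coefficients 0..8
band = [[float(r * 3 + c) for c in range(3)] for r in range(3)]
embed_bit(band, 1, 1, 1)
```

    A larger `delta` makes the mark more robust to later distortion of the coefficients, at the cost of visibility, which is the robustness/invisibility trade-off the abstract mentions.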

  9. Efficient Identification Using a Prime-Feature-Based Technique

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Haq, Shaiq A.; Valente, Andrea

    2011-01-01

    , which are called minutiae points. Designing a reliable automatic fingerprint matching algorithm for minimal platform is quite challenging. In real-time systems, efficiency of the matching algorithm is of utmost importance. To achieve this goal, a prime-feature-based indexing algorithm is proposed in...

  10. Problem-Based Learning Supported by Semantic Techniques

    Science.gov (United States)

    Lozano, Esther; Gracia, Jorge; Corcho, Oscar; Noble, Richard A.; Gómez-Pérez, Asunción

    2015-01-01

    Problem-based learning has been applied over the last three decades to a diverse range of learning environments. In this educational approach, different problems are posed to the learners so that they can develop different solutions while learning about the problem domain. When applied to conceptual modelling, and particularly to Qualitative…

  11. Key techniques for space-based solar pumped semiconductor lasers

    Science.gov (United States)

    He, Yang; Xiong, Sheng-jun; Liu, Xiao-long; Han, Wei-hua

    2014-12-01

    In space, laser transmission is free from atmospheric turbulence, absorption, dispersion and aerosol effects. Therefore, space-based lasers have important applications in satellite communication, satellite attitude control, space debris clearing, long-distance energy transmission, etc. On the other hand, solar energy is a clean and renewable resource; the average intensity of solar irradiation on the earth is 1353 W/m2, and it is even higher in space. Therefore, space-based solar pumped lasers have attracted much research in recent years; most research focuses on solar pumped solid-state lasers and solar pumped fiber lasers. Both lasing principles are based on stimulated emission of rare earth ions such as Nd, Yb and Cr. The rare earth ions absorb light only in narrow bands. This leads to inefficient absorption of the broad-band solar spectrum and increases the system heat load, which makes the solar-to-laser power conversion efficiency very low. In contrast, a solar pumped semiconductor laser can absorb all photons with energy greater than the bandgap. Thus, solar pumped semiconductor lasers could have considerably higher efficiencies than other solar pumped lasers. Besides, a solar pumped semiconductor laser has a smaller chip volume, simpler structure and better heat dissipation; it can be mounted on a small satellite platform and combined into a satellite array, which can greatly improve the output power of the system, and it offers flexibility. This paper summarizes the research progress of space-based solar pumped semiconductor lasers and analyses the key technologies for several application areas, including the processing of the semiconductor chip, the design of a small and efficient solar condenser, and the laser cooling system. We conclude that solar pumped vertical cavity surface-emitting semiconductor lasers will have wide application prospects in space.

  12. Orientation precision of TEM-based orientation mapping techniques

    International Nuclear Information System (INIS)

    Automatic orientation mapping is an important addition to standard capabilities of conventional transmission electron microscopy (TEM) as it facilitates investigation of crystalline materials. A number of different such mapping systems have been implemented. One of their crucial characteristics is the orientation resolution. The precision in determination of orientations and misorientations reached in practice by TEM-based automatic mapping systems is the main subject of the paper. The analysis is focused on two methods: first, using spot diffraction patterns and ‘template matching’, and second, using Kikuchi patterns and detection of reflections. In simple terms, for typical mapping conditions, their precisions in orientation determination with the confidence of 95% are, respectively, 1.1° and 0.3°. The results are illustrated by example maps of cellular structure in deformed Al, the case for which high orientation sensitivity matters. For more direct comparison, a novel approach to mapping is used: the same patterns are solved by each of the two methods. Proceeding from a classification of the mapping systems, the obtained results may serve as indicators of precisions of other TEM-based orientation mapping methods. The findings are of significance for selection of methods adequate to investigated materials. - Highlights: • Classification of the existing TEM-based orientation mapping systems. • Reliable data on orientation precision in TEM-based orientation maps. • Orientation precisions in spot and Kikuchi based maps estimated to be 1.1° and 0.3°. • New method of mapping by using spot and Kikuchi components of the same patterns
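    The quoted precisions (1.1° and 0.3°) are misorientation angles. Computing the misorientation between two orientations given as rotation matrices is a standard calculation, sketched here (an illustrative calculation of ours, not the mapping systems' implementation):

```python
import math

def rotation_z(deg):
    """3x3 rotation matrix about the z axis by `deg` degrees."""
    t = math.radians(deg)
    return [[math.cos(t), -math.sin(t), 0.0],
            [math.sin(t),  math.cos(t), 0.0],
            [0.0,          0.0,         1.0]]

def misorientation_deg(R1, R2):
    """Angle of the rotation carrying orientation R1 onto R2.

    theta = arccos((trace(R1^T R2) - 1) / 2)
    """
    # trace(R1^T R2) = sum over k, i of R1[k][i] * R2[k][i]
    trace = sum(R1[k][i] * R2[k][i] for k in range(3) for i in range(3))
    cos_t = max(-1.0, min(1.0, (trace - 1.0) / 2.0))  # clamp rounding error
    return math.degrees(math.acos(cos_t))

theta = misorientation_deg(rotation_z(0.0), rotation_z(1.1))
```

    For cubic crystals the angle would additionally be minimized over the 24 symmetry operators; that step is omitted here.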

  13. Prediction Method of Speech Recognition Performance Based on HMM-based Speech Synthesis Technique

    Science.gov (United States)

    Terashima, Ryuta; Yoshimura, Takayoshi; Wakita, Toshihiro; Tokuda, Keiichi; Kitamura, Tadashi

    We describe an efficient method that uses an HMM-based speech synthesis technique as a test pattern generator for evaluating the word recognition rate. With this method, the recognition rate of each word and speaker can be evaluated on synthesized speech. The parameter generation technique can be formulated as an algorithm that determines the speech parameter vector sequence O by maximizing P(O|Q,λ) given the model parameter λ and the state sequence Q, under a dynamic acoustic feature constraint. We conducted recognition experiments to illustrate the validity of the method. Approximately 100 speakers were used to train the speaker-dependent models for the speech synthesis used in these experiments, and the synthetic speech was generated as the test patterns for the target speech recognizer. As a result, the recognition rate of the HMM-based synthesized speech shows a good correlation with the recognition rate of the actual speech. Furthermore, we find that our method can predict the speaker recognition rate with approximately 2% error on average. Therefore the evaluation of the speaker recognition rate can be performed automatically by using the proposed method.
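    The maximization of P(O|Q,λ) under a dynamic-feature constraint has a well-known closed-form solution in HMM-based synthesis, sketched here in our own notation (the paper's exact formulation may differ): write the full observation sequence as O = Wc, where c stacks the static features and the window matrix W appends the dynamic (delta) features, and collect the state-output Gaussian means μ and covariances Σ along Q. Setting the gradient of the log-likelihood with respect to c to zero gives

```latex
\frac{\partial}{\partial \boldsymbol{c}} \log P(\boldsymbol{W}\boldsymbol{c} \mid Q, \lambda) = 0
\quad\Longrightarrow\quad
\boldsymbol{W}^{\top} \boldsymbol{\Sigma}^{-1} \boldsymbol{W}\, \boldsymbol{c}
= \boldsymbol{W}^{\top} \boldsymbol{\Sigma}^{-1} \boldsymbol{\mu}
```

    so the optimal static-feature trajectory is obtained by solving one banded linear system rather than by sampling or iterating, which is what makes the generator efficient as a test-pattern source.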

  14. Detection and sizing of cracks using potential drop techniques based on electromagnetic induction

    International Nuclear Information System (INIS)

    The potential drop techniques based on electromagnetic induction are classified into induced current focused potential drop (ICFPD) technique and remotely induced current potential drop (RICPD) technique. The possibility of numerical simulation of the techniques is investigated and the applicability of these techniques to the measurement of defects in conductive materials is presented. Finite element analysis (FEA) for the RICPD measurements on the plate specimen containing back wall slits is performed and calculated results by FEA show good agreement with experimental results. Detection limit of the RICPD technique in depth of back wall slits can also be estimated by FEA. Detection and sizing of artificial defects in parent and welded materials are successfully performed by the ICFPD technique. Applicability of these techniques to detection of cracks in field components is investigated, and most of the cracks in the components investigated are successfully detected by the ICFPD and RICPD techniques. (author)

  15. New Density-Based Clustering Technique: GMDBSCAN-UR

    OpenAIRE

    Mohammed A. Alhanjouri; Rwand D. Ahmed

    2012-01-01

    Density-Based Spatial Clustering of Applications with Noise (DBSCAN) is one of the most popular algorithms for cluster analysis. It can discover clusters with arbitrary shape and separate noise. But this algorithm cannot choose its parameters according to the distribution of the dataset. It simply uses a global minimum number of points (MinPts) parameter, so the clustering result on a multi-density database is inaccurate. In addition, when it is used to cluster large databases, it will cost too much ...
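    The global-parameter limitation described here is visible in a textbook DBSCAN (a plain reference implementation of ours, not GMDBSCAN-UR itself): one (eps, MinPts) pair governs every region of the dataset.

```python
def dbscan(points, eps, min_pts):
    """Textbook DBSCAN with a single global eps / min_pts pair.

    Returns one label per point: 0, 1, ... for clusters, -1 for noise.
    """
    def neighbours(p):
        px, py = points[p]
        return [q for q, (qx, qy) in enumerate(points)
                if (px - qx) ** 2 + (py - qy) ** 2 <= eps * eps]

    UNVISITED, NOISE = -2, -1
    labels = [UNVISITED] * len(points)
    cluster = 0
    for p in range(len(points)):
        if labels[p] != UNVISITED:
            continue
        seeds = neighbours(p)
        if len(seeds) < min_pts:
            labels[p] = NOISE          # may later become a border point
            continue
        labels[p] = cluster
        while seeds:
            q = seeds.pop()
            if labels[q] == NOISE:     # noise reachable from a core point
                labels[q] = cluster    # ... becomes a border point
            if labels[q] != UNVISITED:
                continue
            labels[q] = cluster
            q_neigh = neighbours(q)
            if len(q_neigh) >= min_pts:  # q is a core point: expand
                seeds.extend(q_neigh)
        cluster += 1
    return labels

pts = [(0, 0), (0, 0.1), (0.1, 0), (0.1, 0.1),   # dense blob A
       (5, 5), (5, 5.1), (5.1, 5), (5.1, 5.1),   # dense blob B
       (10, 0)]                                  # isolated point
labels = dbscan(pts, eps=0.5, min_pts=3)
```

    With one global parameter pair, a genuine but sparser cluster whose local density falls below MinPts within eps is labelled noise; that is the multi-density inaccuracy the abstract points out.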

  16. A novel Communication Technique for Nanobots based on acoustic signals

    OpenAIRE

    Loscri, Valeria; Natalizio, Enrico; Mannara, Valentina; Gianluca ALOI

    2012-01-01

    In this work we present the simulation of a swarm of nanobots that behave in a distributed fashion and communicate through vibrations, permitting decentralized control to treat endogenous diseases of the brain. Each nanobot is able to recognize a cancer cell, eliminate it, and announce the presence of the cancer to the other nanobots through communication based on acoustic signals. We assume that our nano-devices vibrate and these vibrations cause acoustic waves ...

  17. Evolutionary Computing Based Area Integration PWM Technique for Multilevel Inverters

    OpenAIRE

    S. Jeevananthan

    2007-01-01

    Existing multilevel carrier-based pulse width modulation (PWM) strategies have no special provisions to offer quality output; moreover, lower-order harmonics are introduced in the spectrum, especially at low switching frequencies. This paper proposes a novel multilevel PWM strategy to combine the advantages of low-frequency switching and reduced total harmonic distortion (THD). The basic idea of the proposed area integration PWM (AIPWM) method is that the area of the required sinusoidal (fun...

  18. DATA MINING BASED TECHNIQUE FOR IDS ALERT CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    Hany Nashat Gabra

    2015-06-01

    Full Text Available Intrusion detection systems (IDSs have become a widely used measure for security systems. The main problem for such systems is the irrelevant alerts. We propose a data mining based method for classification to distinguish serious and irrelevant alerts with a performance of 99.9%, which is better in comparison with the other recent data mining methods that achieved 97%. A ranked alerts list is also created according to the alert’s importance to minimize human interventions.

  19. DATA MINING BASED TECHNIQUE FOR IDS ALERT CLASSIFICATION

    OpenAIRE

    Hany Nashat Gabra; Bahaa-Eldin, Ayman M.; Hoda Korashy Mohammed

    2015-01-01

    Intrusion detection systems (IDSs) have become a widely used measure for security systems. The main problem for such systems is the irrelevant alerts. We propose a data mining based method for classification to distinguish serious and irrelevant alerts with a performance of 99.9%, which is better in comparison with the other recent data mining methods that achieved 97%. A ranked alerts list is also created according to the alert’s importance to minimize human interventions.

  20. Data Mining Based Technique for IDS Alerts Classification

    OpenAIRE

    Gabra, Hany N.; Bahaa-Eldin, Ayman M.; Mohamed, Hoda K.

    2012-01-01

    Intrusion detection systems (IDSs) have become a widely used measure for security systems. The main problem for those systems is the irrelevant alerts in their results. We propose a data mining based method for classification to distinguish serious alerts from irrelevant ones with a performance of 99.9%, which is better in comparison with the other recent data mining methods that have reached a performance of 97%. A ranked alerts list is also created according to alert importance to...

  1. Design and implementation techniques for location-based learning games

    OpenAIRE

    Melero Gallardo, Javier

    2014-01-01

    Over the past few years the use of computer-supported games for learning purposes has reported many educational benefits in terms of students’ motivation and engagement towards learning. However, the comprehensive integration of Game-Based Learning (GBL) environments in formal learning settings is still a challenge that shapes several interdisciplinary research problems in the domain of GBL. A main problem is that for games to be relevant in formal education they need to be aligned with the c...

  2. Shorter window DFT based technique for fault current filtering

    Energy Technology Data Exchange (ETDEWEB)

    Yu, C.S. [National Defence Univ., Taiwan (China); Lee, S.Y. [Northern Taiwan Inst. of Science and Technology, Taipei, Taiwan (China); Wang, S.C. [Lung Hwa Univ. of Science and Technology, Taoyuan, Taiwan (China); Chen, Y.L. [MingChi Univ. of Technology, Taipei, Taiwan (China)

    2006-07-01

    In computer protection relaying design, fault current filtering is one of the most important considerations. To overcome the reach problem caused by the decaying direct current (DC) component, researchers have focused on finding useful algorithms to remove this effect. However, for series compensated lines, the algorithms developed for the decaying DC component are not suitable for the subsynchronous frequency component. In addition, several accurate fault location algorithms have been proposed based on an accurate fundamental frequency phasor, but their slow convergence severely reduces the accuracy and response time of the relaying scheme. An accurate fundamental frequency phasor is therefore essential. This paper presented a damping filter design based on a reiterative discrete Fourier transform (DFT) algorithm for fault current filtering in series compensated lines. To damp the measurement, a shorter-window DFT based mimic filter was developed. To reconstruct the damped measurement and achieve further damping, a reiterative scheme was then proposed. A recursive form was developed to reduce the computational burden. It was concluded that the algorithm significantly reduced the time needed to obtain the accurate fundamental phasor and provided better performance than the conventional DFT algorithm. 9 refs., 14 figs.
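    For context, the conventional full-cycle DFT phasor estimate that such schemes refine can be sketched as follows (a generic textbook illustration; the paper's mimic filter and reiterative damping are not reproduced):

```python
import cmath
import math

def fundamental_phasor(samples):
    """Full-cycle DFT estimate of the fundamental phasor from exactly one
    cycle of N samples: X = (2/N) * sum_n x[n] * exp(-j*2*pi*n/N)."""
    n_samp = len(samples)
    acc = sum(x * cmath.exp(-2j * math.pi * k / n_samp)
              for k, x in enumerate(samples))
    return 2.0 * acc / n_samp

N = 32
# One cycle of x[n] = 10 * cos(2*pi*n/N + 30 deg)
phase = math.radians(30.0)
wave = [10.0 * math.cos(2 * math.pi * n / N + phase) for n in range(N)]
ph = fundamental_phasor(wave)
```

    On a pure fundamental-frequency input the estimate is exact (magnitude 10, angle 30° here); it is the decaying DC and subsynchronous components that corrupt it and motivate the damping filter described above.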

  3. [A Terahertz Spectral Database Based on Browser/Server Technique].

    Science.gov (United States)

    Zhang, Zhuo-yong; Song, Yue

    2015-09-01

    With the solution of key scientific and technical problems and the development of instrumentation, the application of terahertz technology in various fields has received more and more attention. Owing to its unique advantages, terahertz technology is showing a broad future in fast, non-damaging detection, as well as many other fields. Terahertz technology combined with other complementary methods can be used to cope with many difficult practical problems which could not be solved before. One of the critical points for further development of practical terahertz detection methods is a good and reliable terahertz spectral database. We recently developed a browser/server (BS) based terahertz spectral database. We designed the main structure and main functions to fulfill practical requirements. The terahertz spectral database now includes more than 240 items, and the spectral information was collected from three sources: (1) collection and citation from other terahertz spectral databases abroad; (2) data collected from published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the terahertz spectral database developed in our laboratory. One of the key functions of this THz database is the calculation of optical parameters: absorption coefficient, refractive index, etc. can be calculated from the input THz time-domain spectra. The other main functions and searching methods of the browser/server-based terahertz spectral database are also discussed. The database search system provides users with convenient functions including user registration, inquiry, display of spectral figures and molecular structures, spectral matching, etc. The THz database system provides an on-line searching function for registered users. Registered users can compare the input THz spectrum with the spectra in the database, according to
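    The optical-parameter calculation mentioned above is commonly done with the thick-sample THz-TDS relations; the sketch below uses these textbook formulas as an assumption (sign conventions vary, and the database's own routine is not specified in the abstract):

```python
import math

C = 299792458.0  # speed of light, m/s

def thz_optical_params(freq_hz, amp_ratio, phase_shift_rad, thickness_m):
    """Refractive index n and absorption coefficient alpha [1/m] from a
    THz-TDS amplitude ratio rho and phase shift dphi (sample vs reference),
    thick-sample approximation:
        n     = 1 + c * dphi / (omega * d)
        alpha = (2 / d) * ln( 4n / (rho * (n + 1)^2) )
    """
    omega = 2 * math.pi * freq_hz
    n = 1.0 + C * phase_shift_rad / (omega * thickness_m)
    alpha = (2.0 / thickness_m) * math.log(4 * n / (amp_ratio * (n + 1) ** 2))
    return n, alpha

# Hypothetical measurement: 1 THz, amplitude ratio 0.5, 1.5 cycles of
# extra phase through a 1 mm pellet (illustrative numbers only).
n, alpha = thz_optical_params(1.0e12, 0.5, 2 * math.pi * 1.5, 1.0e-3)
```

    The second factor in alpha removes the Fresnel reflection losses at the two faces, so only true absorption remains.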

  4. Development of high performance structure and ligand based virtual screening techniques

    OpenAIRE

    Shave, Steven R.

    2010-01-01

    Virtual Screening (VS) is an in silico technique for drug discovery. An overview of VS methods is given; VS is seen to be approachable from two sides: structure based and ligand based. Structure based virtual screening uses explicit knowledge of the target receptor to suggest candidate receptor-ligand complexes. Ligand based virtual screening can infer required characteristics of binders from known ligands. A consideration for all virtual screening techniques is the amount of co...

  5. Risk-based evaluation of allowed outage time and surveillance test interval extensions for nuclear power plants

    International Nuclear Information System (INIS)

    The main goal of this work is, through the use of Probabilistic Safety Analysis (PSA), to evaluate Technical Specification (TS) Allowed Outage Times (AOT) and Surveillance Test Intervals (STI) extensions for the Angra 1 nuclear power plant. PSA has been incorporated as an additional tool, required as part of the NPP licensing process. The risk measure used in this work is the Core Damage Frequency (CDF), obtained from the Angra 1 PSA Level 1. AOT and STI extensions are calculated for the Safety Injection System (SIS), Service Water System (SAS) and Auxiliary Feedwater System (AFS) through the use of the SAPHIRE code. In order to compensate for the risk increase caused by the extensions, compensatory measures such as testing the redundant train prior to entering maintenance and a staggered test strategy are proposed. Results have shown that the proposed AOT extensions are acceptable for the SIS and SAS with the implementation of compensatory measures. The proposed AOT extension is not acceptable for the AFS. The STI extensions are acceptable for all three systems. (author)

  6. Uncertainty and sensitivity analysis of TMI-2 accident scenario using simulation based techniques

    International Nuclear Information System (INIS)

    The Three Mile Island Unit 2 (TMI-2) accident has been studied extensively, as part of both post-accident technical assessment and follow-up computer code calculations. The models used in computer codes for severe accidents have improved significantly over the years due to better understanding. It was decided to reanalyze the severe accident scenario using current state of the art codes and methodologies. This reanalysis was adopted as a part of the joint standard problem exercise for the Atomic Energy Regulatory Board (AERB) - United States Regulatory Commission (USNRC) bilateral safety meet. The accident scenario was divided into four phases for analysis viz., Phase 1 covers from the accident initiation to the shutdown of the last Reactor Coolant Pumps (RCPs) (0 to 100 min), Phase 2 covers initial fuel heat up and core degradation (100 to 174 min), Phase 3 is the period of recovery of the core water level by operating the reactor coolant pump, and the core reheat that followed (174 to 200 min) and Phase 4 covers refilling of the core by high pressure injection (200 to 300 min). The base case analysis was carried out for all four phases. The majority of the predicted parameters are in good agreement with the observed data. However, some parameters have significant deviations compared to the observed data. These discrepancies have arisen from uncertainties in boundary conditions, such as makeup flow, flow during the RCP 2B transient (Phase 3), models used in the code, the adopted nodalisation schemes, etc. In view of this, uncertainty and sensitivity analyses are carried out using simulation based techniques. The paper deals with uncertainty and sensitivity analyses carried out for the first three phases of the accident scenario.

  7. A DCT And SVD based Watermarking Technique To Identify Tag

    OpenAIRE

    Ji, Ke; Lin, Jianbiao; Li, Hui; Wang, Ao; Tang, Tianjing

    2015-01-01

    With the rapid development of multimedia, the security of multimedia content has received growing attention. As far as we know, digital watermarking is an effective way to protect copyright. The watermark must generally be hidden so that it does not affect the quality of the original image. In this paper, a novel method based on the discrete cosine transform (DCT) and singular value decomposition (SVD) is proposed. In the proposed method, we decompose the image into 8*8 blocks, next we use the DCT to get the transformed block, then we c...

  8. Vision-based techniques for following paths with mobile robots

    OpenAIRE

    Cherubini, Andrea

    2008-01-01

    In this thesis, we focus on the use of robot vision sensors to solve the path following problem, which in recent years, has been a popular target for engaged researchers around the world. Initially, we shall present the working assumptions that we used, along with the relevant kinematic and sensor models of interest, and with the main characteristics of visual servoing. Then, we shall attempt to survey the research carried out in the field of vision-based path following. Following this, we sh...

  9. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    OpenAIRE

    T. Hamsapriya; D. Karthika Renuka; M. Raja Chakkaravarthi

    2011-01-01

    E-mail is one of the most popular and frequently used ways of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of the today’s Internet, bringing financial damage to companies and annoying individual users. Spam emails are invadin...

  10. Choice probability for apple juice based on novel processing techniques

    DEFF Research Database (Denmark)

    Olsen, Nina Veflen; Menichelli, E.; Grunert, Klaus G.;

    2011-01-01

    , within the core of academic consumer research, MEC has been almost ignored. One plausible explanation for this lack of interest may be that studies linking MEC data to choice have been few. In this study, we are to investigate how values and consequences generated from a previous MEC study structure can...... be linked to likelihood of choice. Hypotheses about European consumers’ likelihood of choice for novel processed juice are stated and tested in a rating based conjoint study in Norway, Denmark, Hungary and Slovakia. In the study, consumers probability of choice for high pressure processed (HPP) juice...

  11. Image processing technique based on image understanding architecture

    Science.gov (United States)

    Kuvychko, Igor

    2000-12-01

    The effectiveness of image applications is directly based on their ability to resolve ambiguity and uncertainty in real images. That requires tight integration of low-level image processing with high-level knowledge-based reasoning, which is the solution of the image understanding problem. This article presents a generic computational framework necessary for the solution of the image understanding problem -- the Spatial Turing Machine. Instead of a tape of symbols, it works with hierarchical networks dually represented as discrete and continuous structures. Dual representation provides natural transformation of the continuous image information into discrete structures, making it available for analysis. Such structures are data and algorithms at the same time and are able to perform graph and diagrammatic operations, which are the basis of intelligence. They can create derivative structures that play the role of context, or 'measurement device', giving the ability to analyze and to run top-down algorithms. Symbols naturally emerge there, and symbolic operations work in combination with new simplified methods of computational intelligence. That makes images and scenes self-describing, and provides flexible ways of resolving uncertainty. Classification of images truly invariant to any transformation could be done via matching their derivative structures. The proposed architecture does not require supercomputers, opening ways to new image technologies.

  12. Sensitivity- and Uncertainty-Based Criticality Safety Validation Techniques

    International Nuclear Information System (INIS)

    The theoretical basis for the application of sensitivity and uncertainty (S/U) analysis methods to the validation of benchmark data sets for use in criticality safety applications is developed. Sensitivity analyses produce energy-dependent sensitivity coefficients that give the relative change in the system multiplication factor keff value as a function of relative changes in the cross-section data by isotope, reaction, and energy. Integral indices are then developed that utilize the sensitivity information to quantify similarities between pairs of systems, typically a benchmark experiment and design system. Uncertainty analyses provide an estimate of the uncertainties in the calculated values of the system keff due to cross-section uncertainties, as well as correlation in the keff uncertainties between systems. These uncertainty correlations provide an additional measure of system similarity. The use of the similarity measures from both S/U analyses in the formal determination of areas of applicability for benchmark experiments is developed. Furthermore, the use of these similarity measures as a trending parameter for the estimation of the computational bias and uncertainty is explored. The S/U analysis results, along with the calculated and measured keff values and estimates of uncertainties in the measurements, were used in this work to demonstrate application of the generalized linear-least-squares methodology (GLLSM) to data validation for criticality safety studies. An illustrative example is used to demonstrate the application of these S/U analysis procedures to actual criticality safety problems. Computational biases, uncertainties, and the upper subcritical limit for the example applications are determined with the new methods and compared to those obtained through traditional criticality safety analysis validation techniques. The GLLSM procedure is also applied to determine cutoff values for the similarity indices such that applicability of a benchmark
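    A relative sensitivity coefficient of the kind described gives the fractional change in keff per fractional change in a cross section. A minimal finite-difference sketch on a one-group toy model (our illustration only, not the S/U methodology itself):

```python
def k_inf(nu_sigma_f, sigma_a):
    """Infinite-medium multiplication factor for a one-group toy model."""
    return nu_sigma_f / sigma_a

def sensitivity(f, x0, rel_step=1e-6):
    """Relative sensitivity S = (dk/k) / (dx/x) by central finite difference."""
    h = x0 * rel_step
    k0 = f(x0)
    return (f(x0 + h) - f(x0 - h)) / (2 * h) * (x0 / k0)

# Sensitivity of k_inf to the absorption cross section sigma_a.
# Since k = nu_sigma_f / sigma_a, the coefficient is exactly -1.
s = sensitivity(lambda sa: k_inf(0.05, sa), 0.04)
```

    The analytic value of -1 here is a convenient sanity check; real S/U tools compute these coefficients per isotope, reaction and energy group with perturbation theory rather than finite differences.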

  13. Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.

    Science.gov (United States)

    Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José

    2016-04-01

    The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators-e.g. health appliances, routers-in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators-e.g. smartphones, tablets-, the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), IHE profiles related with these applications and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value. PMID:26883877

  14. The efficacy and toxicity of individualized intensity-modulated radiotherapy based on the tumor extension patterns of nasopharyngeal carcinoma

    Science.gov (United States)

    Zhou, Guan-Qun; Guo, Rui; Zhang, Fan; Zhang, Yuan; Xu, Lin; Zhang, Lu-Lu; Lin, Ai-Hua; Ma, Jun; Sun, Ying

    2016-01-01

    Background To evaluate the efficacy and toxicity of intensity-modulated radiotherapy (IMRT) using individualized clinical target volumes (CTVs) based on the loco-regional extension patterns of nasopharyngeal carcinoma (NPC). Methods From December 2009 to February 2012, 220 patients with histologically-proven, non-disseminated NPC were prospectively treated with IMRT according to an individualized delineation protocol. CTV1 encompassed the gross tumor volume, entire nasopharyngeal mucosa and structures within the pharyngobasilar fascia with a margin. CTV2 encompassed bilateral high-risk anatomic sites and downstream anatomic sites adjacent to the primary tumor, bilateral retropharyngeal regions, levels II, III and Va, and prophylactic irradiation was given to one or two levels beyond clinically involved lymph node levels. Clinical outcomes and toxicities were evaluated. Results Median follow-up was 50.8 (range, 1.3–68.0) months; four-year local relapse-free, regional relapse-free, distant metastasis-free, disease-free and overall survival rates were 94.7%, 97.0%, 91.7%, 87.2% and 91.9%, respectively. Acute severe (≥ grade 3) mucositis, dermatitis and xerostomia were observed in 27.6%, 3.6% and zero patients, respectively. At 1 year, xerostomia was mild, with frequencies of Grade 0, 1, 2 and 3 xerostomia of 27.9%, 63.3%, 8.3% and 0.5%, respectively. Conclusions IMRT using individualized CTVs provided high rates of local and regional control and a favorable toxicity profile in NPC. The individualized CTV delineation strategy is a promising one that may effectively avoid unnecessary or missed irradiation, and deserves optimization to define more precise individualized CTVs. PMID:26980744

  15. SPEECH/MUSIC CLASSIFICATION USING WAVELET BASED FEATURE EXTRACTION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Thiruvengatanadhan Ramalingam

    2014-01-01

    Full Text Available Audio classification is a fundamental step in coping with the rapid growth in audio data volume. Due to the increasing size of multimedia sources, speech/music classification is one of the most important issues for multimedia information retrieval. In this work a speech/music discrimination system is developed which utilizes the Discrete Wavelet Transform (DWT) as the acoustic feature. Multi-resolution analysis is a significant statistical way to extract features from the input signal, and in this study a method is deployed to model the extracted wavelet features. Support Vector Machines (SVM) are based on the principle of structural risk minimization. SVM is applied to classify audio into its classes, namely speech and music, by learning from training data. The proposed method then extends the application of Gaussian Mixture Models (GMM) to estimate the probability density function using maximum likelihood decision methods. The system shows significant results with an accuracy of 94.5%.
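The DWT feature-extraction stage can be sketched with a plain Haar wavelet decomposition. This is a minimal stand-in for the paper's pipeline: the wavelet family, number of levels, and energy features below are illustrative assumptions, and the SVM/GMM modelling stage (which would typically use a library) is omitted.

```python
import math

def haar_step(signal):
    # One level of the Haar DWT: pairwise sums (approximation) and
    # pairwise differences (detail), each scaled by 1/sqrt(2).
    s2 = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def dwt_energy_features(signal, levels=3):
    # Feature vector: energy of the detail sub-band at each decomposition
    # level; speech and music distribute energy differently across scales.
    feats, approx = [], list(signal)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        feats.append(sum(d * d for d in detail))
    return feats
```

A constant frame yields zero detail energy at every level, while a rapidly alternating frame concentrates its energy in the first detail sub-band.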

  16. Quadrant Based WSN Routing Technique By Shifting Of Origin

    Directory of Open Access Journals (Sweden)

    Nandan Banerji

    2013-04-01

    Full Text Available A sensor is a miniaturized, low powered (basically battery powered), limited storage device which can sense natural phenomena and convert them into electrical energy, or vice versa, using a transduction process. A Wireless Sensor Network (WSN) is a wireless network built from such sensors, which communicate with each other over a wireless medium. They can be deployed in environments inaccessible or difficult for humans to reach. There is a vast range of applications in automated domains such as robotics, avionics, oceanographic study, space, satellites, etc. The routing of a packet from a source node to a destination should be efficient in terms of energy, communication overhead, and the number of intermediate hops. The proposed scheme helps to route a packet through fewer intermediate nodes, as the neighbors are selected based on their quadrant position.
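The quadrant idea can be sketched as follows: shift the origin to the current node, keep only neighbours that fall in the same quadrant as the destination, and forward greedily. This is an illustrative simplification under assumed 2-D coordinates; the paper's exact tie-breaking and selection rules are not reproduced.

```python
def quadrant(point, origin):
    # Shift the origin to `origin` and report which quadrant (1..4)
    # `point` falls in within the shifted coordinate system.
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    if dx >= 0 and dy >= 0:
        return 1
    if dx < 0 and dy >= 0:
        return 2
    if dx < 0:
        return 3
    return 4

def next_hop(current, destination, neighbours):
    # Keep only neighbours lying in the destination's quadrant relative to
    # the current node, then greedily pick the one closest to the destination.
    target_q = quadrant(destination, current)
    candidates = [n for n in neighbours if quadrant(n, current) == target_q]
    if not candidates:
        return None  # no neighbour available in the forwarding quadrant
    return min(candidates,
               key=lambda n: (n[0] - destination[0]) ** 2 + (n[1] - destination[1]) ** 2)
```

Restricting candidates to one quadrant prunes roughly three quarters of the neighbour set before the distance comparison, which is the source of the reduced hop count the abstract claims.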

  17. Symbolic document image compression based on pattern matching techniques

    Science.gov (United States)

    Shiah, Chwan-Yi; Yen, Yun-Sheng

    2011-10-01

    In this paper, a novel compression algorithm for Chinese document images is proposed. Initially, documents are segmented into readable components such as characters and punctuation marks. Similar patterns within the text are found by shape context matching and grouped to form a set of prototype symbols. Text redundancies can be removed by replacing repeated symbols by their corresponding prototype symbols. To keep the compression visually lossless, we use a multi-stage symbol clustering procedure to group similar symbols and to ensure that there is no visible error in the decompressed image. In the encoding phase, the resulting data streams are encoded by adaptive arithmetic coding. Our results show that the average compression ratio is better than the international standard JBIG2 and the compressed form of a document image is suitable for a content-based keyword searching operation.
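The symbol-grouping step can be sketched with a greedy prototype matcher. This is a deliberate simplification: the paper uses shape context matching and a multi-stage clustering procedure, whereas this toy version compares binary bitmaps (encoded here as '0'/'1' strings, an assumption for brevity) by pixel mismatch count.

```python
def hamming(a, b):
    # Pixel mismatch count between two equal-size binary bitmaps.
    return sum(x != y for x, y in zip(a, b))

def cluster_symbols(symbols, max_mismatch):
    # Greedy prototype clustering: each symbol joins the first prototype
    # within `max_mismatch` pixels, otherwise it becomes a new prototype.
    # The compressed file then stores each prototype bitmap once and,
    # per symbol, only its prototype index and position.
    prototypes, assignment = [], []
    for s in symbols:
        for idx, proto in enumerate(prototypes):
            if hamming(s, proto) <= max_mismatch:
                assignment.append(idx)
                break
        else:
            prototypes.append(s)
            assignment.append(len(prototypes) - 1)
    return prototypes, assignment
```

The `max_mismatch` tolerance plays the role of the paper's visually-lossless criterion: a stricter threshold yields more prototypes and a larger, but more faithful, compressed stream.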

  18. Mapping technique based on elemental hair composition data

    International Nuclear Information System (INIS)

    The content of 24 elements has been determined in 2000 hair samples of the inhabitants of Uzbekistan. INAA was used for analysis. Reference material IAEA HH-1 and laboratory standards were used. Measurements were done using a Ge(Li) detector and a multi-channel analyzer. No correlation was found between the element content and hair color and ethnic group, whereas the content of some elements depended on sex, hemoglobin content, blood group, and occupation. Arithmetical and geometrical means and medians were calculated. Cumulative histograms show mainly lognormal distributions. Maps of selected regions, based on the elemental hair composition data, were made. The element content was shown to depend on the location of large cities and biogeochemical anomalies. The relationship between the death rate and the element content in hair in some countries has been shown

  19. Image restoration techniques based on fuzzy neural networks

    Institute of Scientific and Technical Information of China (English)

    刘普寅; 李洪兴

    2002-01-01

    By establishing some suitable partitions of input and output spaces, a novel fuzzy neural network (FNN) which is called selection type FNN is developed. Such a system is a multilayer feedforward neural network, which can be a universal approximator with maximum norm. Based on a family of fuzzy inference rules that are of real senses, a simple and useful inference type FNN is constructed. As a result, the fusion of selection type FNN and inference type FNN results in a novel filter-FNN filter. It is simple in structure. And also it is convenient to design the learning algorithm for structural parameters. Further, the FNN filter can efficiently suppress impulse noise superimposed on an image and preserve fine image structure, simultaneously. Some examples are simulated to confirm the advantages of the FNN filter over other filters, such as the median filter and the adaptive weighted fuzzy mean (AWFM) filter and so on, in suppression of noises and preservation of image structure.

  20. Ultrasound-based technique for intrathoracic surgical guidance

    Science.gov (United States)

    Huang, Xishi; Hill, Nicholas A.; Peters, Terry M.

    2005-04-01

    Image-guided procedures within the thoracic cavity require accurate registration of a pre-operative virtual model to the patient. Currently, surface landmarks are used for thoracic cavity registration; however, this approach is unreliable due to skin movement relative to the ribs. An alternative method for providing surgeons with image feedback in the operating room is to integrate images acquired during surgery with images acquired pre-operatively. This integration process is required to be automatic, fast, accurate and robust; however, inter-modal image registration is difficult due to the lack of a direct relationship between the intensities of the two image sets. To address this problem, Computed Tomography (CT) was used to acquire pre-operative images and Ultrasound (US) was used to acquire peri-operative images. Since bone has a high electron density and is highly echogenic, the rib cage is visualized as a bright white boundary in both datasets. The proposed approach utilizes the ribs as the basis for an intensity-based registration method -- mutual information. We validated this approach using a thorax phantom. Validation results demonstrate that this approach is accurate and shows little variation between operators. The fiducial registration error, the registration error between the US and CT images, was < 1.5 mm. We propose this registration method as a basis for precise tracking of minimally invasive thoracic procedures. This method will permit the planning and guidance of image-guided minimally invasive procedures for the lungs, as well as for both catheter-based and direct trans-mural interventions within the beating heart.
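The mutual information similarity measure used for the US/CT alignment can be estimated from a joint intensity histogram. The sketch below is a minimal illustration under assumed 8-bit greyscale images flattened to lists; bin count and intensity range are arbitrary choices, and a real registration would maximize this value over rigid-transform parameters.

```python
import math
from collections import Counter

def mutual_information(img_a, img_b, bins=8, max_val=256):
    # Estimate mutual information (in bits) between two equally sized
    # greyscale images via their joint intensity histogram.
    scale = max_val / bins
    pairs = [(int(a // scale), int(b // scale)) for a, b in zip(img_a, img_b)]
    n = len(pairs)
    joint = Counter(pairs)
    marg_a = Counter(a for a, _ in pairs)
    marg_b = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), count in joint.items():
        # p(a,b) * log2( p(a,b) / (p(a) * p(b)) )
        mi += (count / n) * math.log(count * n / (marg_a[a] * marg_b[b]), 2)
    return mi
```

Two perfectly co-registered copies of a two-valued image share one bit of information, while an image carries no information about a constant one; a registration optimizer drives the transform toward the peak of this measure.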

  1. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS, and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, due to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied for islanding detection of distributed generation. Moreover, the paper compares the accuracies of computational intelligence based techniques over existing techniques to provide a handful of information for industries and utility researchers to determine the best method for their respective system

  2. Study of hydrogen in coals, polymers, oxides, and muscle water by nuclear magnetic resonance; extension of solid-state high-resolution techniques. [Hydrogen molybdenum bronze

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, L.M.

    1981-10-01

    Nuclear magnetic resonance (NMR) spectroscopy has been an important analytical and physical research tool for several decades. One area of NMR which has undergone considerable development in recent years is high resolution NMR of solids. In particular, high resolution solid state ¹³C NMR spectra exhibiting features similar to those observed in liquids are currently achievable using sophisticated pulse techniques. The work described in this thesis develops analogous methods for high resolution ¹H NMR of rigid solids. Applications include characterization of hydrogen aromaticities in fossil fuels, and studies of hydrogen in oxides and bound water in muscle.

  3. A content-based image retrieval method for optical colonoscopy images based on image recognition techniques

    Science.gov (United States)

    Nosato, Hirokazu; Sakanashi, Hidenori; Takahashi, Eiichi; Murakawa, Masahiro

    2015-03-01

    This paper proposes a content-based image retrieval method for optical colonoscopy images that can find images similar to ones being diagnosed. Optical colonoscopy is a method of direct observation for colons and rectums to diagnose bowel diseases. It is the most common procedure for screening, surveillance and treatment. However, diagnostic accuracy for intractable inflammatory bowel diseases, such as ulcerative colitis (UC), is highly dependent on the experience and knowledge of the medical doctor, because there is considerable variety in the appearances of colonic mucosa within inflammations with UC. In order to solve this issue, this paper proposes a content-based image retrieval method based on image recognition techniques. The proposed retrieval method can find similar images from a database of images diagnosed as UC, and can potentially furnish the medical records associated with the retrieved images to assist the UC diagnosis. Within the proposed method, color histogram features and higher order local auto-correlation (HLAC) features are adopted to represent the color information and geometrical information of optical colonoscopy images, respectively. Moreover, considering various characteristics of UC colonoscopy images, such as vascular patterns and the roughness of the colonic mucosa, we also propose an image enhancement method to highlight the appearances of colonic mucosa in UC. In an experiment using 161 UC images from 32 patients, we demonstrate that our method improves the accuracy of retrieving similar UC images.
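The colour-information half of the retrieval scheme can be sketched with a joint RGB histogram and histogram-intersection ranking. This is a hedged illustration only: the bin count and the intersection measure are assumptions for the sketch, and the paper's HLAC geometric features and mucosa-enhancement step are omitted.

```python
def color_histogram(pixels, bins=4):
    # Normalised joint RGB histogram: `bins` levels per channel (4^3 = 64 bins).
    step = 256 // bins
    hist = [0.0] * (bins ** 3)
    for r, g, b in pixels:
        hist[(r // step) * bins * bins + (g // step) * bins + (b // step)] += 1.0
    return [h / len(pixels) for h in hist]

def intersection(h1, h2):
    # Histogram intersection similarity in [0, 1]; 1.0 means identical histograms.
    return sum(min(a, b) for a, b in zip(h1, h2))

def retrieve(query_hist, database):
    # Rank database entries (name -> histogram) by similarity to the query.
    return sorted(database, key=lambda name: intersection(query_hist, database[name]),
                  reverse=True)
```

In a full system each retrieved image would carry its associated medical record, which is what makes the ranking clinically useful for supporting a UC diagnosis.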

  4. Measurement of particle size based on digital imaging technique

    Institute of Scientific and Technical Information of China (English)

    CHEN Hong; TANG Hong-wu; LIU Yun; WANG Hao; LIU Gui-ping

    2013-01-01

    To improve the analysis methods for the measurement of sediment particle sizes with a wide distribution and of irregular shapes, a sediment particle image measurement and analysis system, and an extraction algorithm of the optimal threshold based on the gray histogram peak values, are proposed. Recording the pixels of the sediment particles by labeling them, the algorithm can effectively separate the sediment particle images from the background images, using equivalent pixel circles with the same diameters to represent the sediment particles. Compared with a laser analyzer for the case of blue plastic sands, the measurement results of the system are shown to be reasonably similar. The errors are mainly due to the small size of the particles and the limitation of the apparatus. The measurement accuracy can be improved by increasing the Charge-Coupled Device (CCD) camera resolution. The analysis method of the sediment particle images can provide technical support for the rapid measurement of the sediment particle size and its distribution.
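Two pieces of the pipeline lend themselves to a short sketch: picking a threshold at the valley between the two main grey-histogram peaks, and reporting each labelled particle as an equivalent pixel circle. Both functions are illustrative assumptions about the method, not the paper's exact algorithm.

```python
import math

def valley_threshold(hist):
    # Pick the grey level at the histogram minimum (valley) between the two
    # highest non-adjacent peaks, i.e. between background and particles.
    p1 = max(range(len(hist)), key=lambda i: hist[i])
    p2 = max((i for i in range(len(hist)) if abs(i - p1) > 1), key=lambda i: hist[i])
    lo, hi = sorted((p1, p2))
    return min(range(lo, hi + 1), key=lambda i: hist[i])

def equivalent_diameter(pixel_count, pixel_size=1.0):
    # Diameter of the circle whose area equals the particle's pixel area,
    # matching the equivalent-pixel-circle representation in the abstract.
    return 2.0 * math.sqrt(pixel_count / math.pi) * pixel_size
```

With a calibrated `pixel_size` (physical length per pixel), the equivalent diameters of all labelled particles directly give the size distribution.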

  5. PELAN - a transportable, neutron-based UXO identification technique

    International Nuclear Information System (INIS)

    An elemental characterization method is used to differentiate between inert projectiles and UXOs. This method identifies, in a non-intrusive, nondestructive manner, the elemental composition of the projectile contents. Most major and minor chemical elements within the interrogated object (hydrogen, carbon, nitrogen, oxygen, fluorine, phosphorus, chlorine, arsenic, etc.) are identified and quantified. The method is based on PELAN - Pulsed Elemental Analysis with Neutrons. PELAN uses pulsed neutrons produced from a compact, sealed-tube neutron generator. Using an automatic analysis computer program, the quantities of each major and minor chemical element are determined. A decision-making tree identifies the object by comparing its elemental composition with stored elemental composition libraries of substances that could be contained within the projectile. In a series of blind tests, PELAN was able to identify without failure the contents of each shell placed in front of it. The PELAN probe does not need to be in contact with the interrogated projectile. If the object is buried, the interrogation can take place in situ provided the probe can be inserted a few centimeters from the object's surface. (author)

  6. Damage identification in beams by a response surface based technique

    Directory of Open Access Journals (Sweden)

    Teidj S.

    2014-01-01

    Full Text Available In this work, identification of damage in uniform homogeneous metallic beams was considered through the propagation of non-dispersive elastic torsional waves. The proposed damage detection procedure consisted of the following sequence. Given a localized torque excitation, having the form of a short half-sine pulse, the first step was calculating the transient solution of the resulting torsional wave. This torque could be generated in practice by means of asymmetric laser irradiation of the beam surface. Then, a localized defect, assumed to be characterized by an abrupt reduction of beam section area with a given height and extent, was placed at a known location of the beam. Next, the response in terms of transverse section rotation rate was obtained for a point situated after the defect, where the sensor was positioned; in practice this sensing could rely on laser vibrometry. A parametric study was then conducted by using a full factorial design of experiments table and numerical simulations based on a finite difference characteristic scheme. This enabled the derivation of a response surface model that was shown to represent adequately the response of the system in terms of the following factors: defect extent and severity. The final step was performing the inverse problem solution in order to identify the defect characteristics by using measurements.
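The response-surface step can be sketched for the simplest case: a two-level full factorial design in the two factors named in the abstract (defect extent and severity), with coded levels ±1. The model form and the synthetic coefficients in the test are illustrative assumptions; the paper's actual surface may include quadratic terms.

```python
def factorial_effects(runs):
    # Two-level full factorial fit (coded levels x = ±1) for the model
    #   y = b0 + b1*x1 + b2*x2 + b12*x1*x2
    # With an orthogonal design, each coefficient is a signed average of y.
    n = len(runs)
    b0 = sum(y for _, _, y in runs) / n
    b1 = sum(x1 * y for x1, _, y in runs) / n
    b2 = sum(x2 * y for _, x2, y in runs) / n
    b12 = sum(x1 * x2 * y for x1, x2, y in runs) / n
    return b0, b1, b2, b12

def predict(coeffs, x1, x2):
    # Evaluate the fitted response surface at coded factor levels (x1, x2).
    b0, b1, b2, b12 = coeffs
    return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2
```

The inverse problem in the abstract then amounts to searching the fitted surface for the (extent, severity) pair whose predicted response matches the measured one.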

  7. Automated tree crown delineation from imagery based on morphological techniques

    International Nuclear Information System (INIS)

    In current tree crown delineation from imagery, treetops and three-dimensional (3D) radiometric shapes of tree crowns are frequently extracted from a spectral band or a brightness component of the image and taken as references to localize and delineate tree crowns. However, color components of the image are rarely used together with the brightness component to facilitate localizing and delineating crowns. The 3D radiometric shape of a crown can be derived from a brightness or color component and may be taken as a half-ellipsoid. From top to bottom of such a half-ellipsoid, multiple horizontal slices can be drawn that contain the treetop and indicate both the location and the horizontal extent of the crown. Based on this concept of horizontal slices of crowns, a novel multi-scale method for individual tree crown delineation from imagery was proposed in this study. In this method, the brightness and color components of the image are morphologically opened within the scale range of target crowns, horizontal slices of target crowns are extracted from the resulting opened images and integrated together to localize crowns, and one component is segmented using the watershed approach with reference to the integrated slices. In an experiment on high spatial resolution aerial imagery over natural closed-canopy forests, the proposed method correctly delineated approximately 74% of mixedwood tree crowns and 59% of deciduous crowns in the natural forests
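The scale-selective behaviour of morphological opening is the key mechanism here. A one-dimensional grey-scale sketch (a simplification: the paper opens 2-D image components with structuring elements matched to crown size) shows how features narrower than the structuring element are suppressed while wider ones survive:

```python
def erode(signal, size=3):
    # Grey-scale erosion: minimum over a sliding window of width `size`.
    h = size // 2
    return [min(signal[max(0, i - h):i + h + 1]) for i in range(len(signal))]

def dilate(signal, size=3):
    # Grey-scale dilation: maximum over the same sliding window.
    h = size // 2
    return [max(signal[max(0, i - h):i + h + 1]) for i in range(len(signal))]

def opening(signal, size=3):
    # Opening = erosion followed by dilation: bright features narrower than
    # the structuring element are removed, wider ones are restored intact.
    return dilate(erode(signal, size), size)
```

Along a brightness profile, an isolated one-pixel peak (noise or a sub-crown detail) is flattened, while a plateau at least as wide as the window (a crown at the target scale) is returned unchanged.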

  8. Whole genome sequencing-based characterization of extensively drug resistant (XDR) strains of Mycobacterium tuberculosis from Pakistan

    KAUST Repository

    Hasan, Zahra

    2015-03-01

    Objectives: The global increase in drug resistance in Mycobacterium tuberculosis (MTB) strains increases the focus on improved molecular diagnostics for MTB. Extensively drug-resistant (XDR) TB is caused by MTB strains resistant to rifampicin, isoniazid, fluoroquinolone and aminoglycoside antibiotics. Resistance to anti-tuberculous drugs has been associated with single nucleotide polymorphisms (SNPs) in particular MTB genes. However, there is regional variation between MTB lineages and the SNPs associated with resistance. Therefore, there is a need to identify common resistance-conferring SNPs so that effective molecular-based diagnostic tests for MTB can be developed. This study used whole genome sequencing (WGS) to characterize 37 XDR MTB isolates from Pakistan and investigated SNPs related to drug resistance. Methods: XDR-TB strains were selected. DNA was extracted from MTB strains, and samples underwent WGS with 76-base paired-end reads using Illumina HiSeq2000 technology. Raw sequence data were mapped uniquely to the H37Rv reference genome. The mappings allowed SNPs and small indels to be called using SAMtools/BCFtools. Results: This study found that in all XDR strains, rifampicin resistance was attributable to SNPs in the rpoB RDR region. Isoniazid resistance-associated mutations were primarily related to katG codon 315 followed by inhA S94A. Fluoroquinolone resistance was attributable to gyrA codons 91-94 in most strains, while one did not have SNPs in either gyrA or gyrB. Aminoglycoside resistance was mostly associated with SNPs in rrs, except in 6 strains. Ethambutol-resistant strains had embB codon 306 mutations, but many resistant strains did not. The SNPs were compared with those present in commercial assays such as LiPA Hain MDRTBsl, and the sensitivity of the assays for these strains was evaluated. Conclusions: If common drug resistance associated with SNPs evaluated the concordance between phenotypic and

  9. Influence of an extensive inquiry-based field experience on pre-service elementary student teachers' science teaching beliefs

    Science.gov (United States)

    Bhattacharyya, Sumita

    This study examined the effects of an extensive inquiry-based field experience on pre-service elementary teachers' personal agency beliefs (PAB) about teaching science and their ability to effectively implement science instruction. The research combined quantitative and qualitative approaches within an ethnographic research tradition. A comparison was made between the pre- and posttest scores for two groups. The experimental group utilized the inquiry method; the control group did not. The experimental group had the stronger PAB pattern. The field experience caused no significant differences to the context beliefs of either group, but did to the capability beliefs. The number of college science courses taken by pre-service elementary teachers was positively related to their post capability belief (p = .0209). Qualitative information was collected through case studies which included observation of classrooms, assessment of lesson plans and open-ended, extended interviews of the participants about their beliefs in their teaching abilities (efficacy beliefs) and in teaching environments (context beliefs). The interview data were analyzed by the analytic induction method to look for themes. The emerging themes were then grouped under several attributes. Following a review of the attributes a number of hypotheses were formulated. Each hypothesis was then tested across all the cases by the constant comparative method. The pattern of relationships that emerged from the hypothesis testing clearly suggests a new hypothesis: that there is a spiral relationship among the ability to establish communicative relationships with students, the desire for personal growth and improvement, and greater content knowledge. The study concluded that inquiry-based student teaching should be encouraged to train school science teachers. But the meaning and the practice of the inquiry method should be clearly delineated to ensure its correct implementation in the classroom. A survey should be

  10. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  11. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  12. Arithmetic and Frequency Filtering Methods of Pixel-Based Image Fusion Techniques

    CERN Document Server

    Al-Wassai, Firouz Abdullah; Al-Zuky, Ali A

    2011-01-01

    In remote sensing, image fusion is a useful tool used to fuse high spatial resolution panchromatic images (PAN) with lower spatial resolution multispectral images (MS) to create a high spatial resolution multispectral fused image (F) while preserving the spectral information in the multispectral image (MS). There are many PAN-sharpening, or pixel-based image fusion, techniques that have been developed to try to enhance the spatial resolution while preserving the spectral properties of the MS. This paper undertakes a study of image fusion using two types of pixel-based image fusion techniques, i.e. Arithmetic Combination and Frequency Filtering Methods. The first type includes the Brovey Transform (BT), Color Normalized Transformation (CN) and Multiplicative Method (MLT). The second type includes the High-Pass Filter Additive Method (HPFA), High-Frequency-Addition Method (HFA), High Frequency Modulation Method (HFM) and The Wavelet transform-base...
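Of the arithmetic-combination methods listed, the Brovey Transform is the simplest to state per pixel: each fused band takes the PAN intensity redistributed in the proportions of the MS bands. The sketch below is a minimal per-pixel illustration, ignoring the resampling of MS to the PAN grid and any radiometric calibration a real fusion would need.

```python
def brovey(ms_pixel, pan):
    # Brovey transform for one pixel: each fused band is
    #   F_i = MS_i * PAN / (sum of all MS bands)
    # so the fused pixel inherits PAN's spatial detail while keeping the
    # MS band ratios (and thus most of the spectral information).
    total = sum(ms_pixel)
    if total == 0:
        return [0.0] * len(ms_pixel)  # guard against division by zero
    return [band * pan / total for band in ms_pixel]
```

Because only the band ratios are preserved, Brovey fusion is known to distort absolute radiometry, which is one motivation for the frequency-filtering alternatives the paper also studies.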

  13. On-line hydrogen-isotope measurements of organic samples using elemental chromium: an extension for high temperature elemental-analyzer techniques.

    Science.gov (United States)

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B; Meijer, Harro A J; Brand, Willi A; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ(2)H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ(2)H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while

  14. PDE-based Non-Linear Diffusion Techniques for Denoising Scientific and Industrial Images: An Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Weeratunga, S K; Kamath, C

    2001-12-20

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, they focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. They complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. They explore the effects of various parameters, such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. They also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. The empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
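    The Perona-Malik scheme referenced above can be sketched in a few lines. This is a minimal explicit-scheme illustration; the parameter values and the periodic boundary handling via np.roll are simplifying assumptions, not the paper's implementation:

    ```python
    import numpy as np

    def perona_malik(img, n_iter=20, kappa=15.0, dt=0.2):
        """Explicit-scheme Perona-Malik diffusion: smooth noise while preserving edges."""
        u = img.astype(float).copy()
        for _ in range(n_iter):
            # differences to the four neighbours (np.roll gives a periodic boundary)
            dn = np.roll(u, -1, axis=0) - u
            ds = np.roll(u,  1, axis=0) - u
            de = np.roll(u, -1, axis=1) - u
            dw = np.roll(u,  1, axis=1) - u
            # diffusivity g(|grad u|) = exp(-(|grad u|/kappa)^2): near zero across
            # strong edges, near one in flat noisy regions
            g = lambda d: np.exp(-(d / kappa) ** 2)
            u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
        return u
    ```

    On a noisy step image, the flat regions are smoothed while the step itself is left essentially intact, which is the behaviour that distinguishes this operator from linear (Gaussian) filtering.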

  15. Maternal mortality in rural south Ethiopia: outcomes of community-based birth registration by health extension workers.

    Directory of Open Access Journals (Sweden)

    Yaliso Yaya

    Full Text Available Rural communities in low-income countries lack vital registrations to track birth outcomes. We aimed to examine the feasibility of community-based birth registration and measure the maternal mortality ratio (MMR) in rural south Ethiopia. In 2010, health extension workers (HEWs) registered births and maternal deaths among 421,639 people in three districts (Derashe, Bonke, and Arba Minch Zuria). One nurse-supervisor per district provided administrative and technical support to HEWs. The primary outcomes were the feasibility of registration of a high proportion of births and measuring MMR. The secondary outcome was the proportion of skilled birth attendance. We validated the completeness of the registry and the MMR by conducting a house-to-house survey in 15 randomly selected villages in Bonke. We registered 10,987 births (81·4% of the expected 13,492 births) with an annual crude birth rate of 32 per 1,000 population. The validation study showed that, of 2,401 births that occurred in the surveyed households within eight months of the initiation of the registry, 71·6% (1,718) were registered, with similar MMRs (474 vs. 439) between the registered and unregistered births. Overall, we recorded 53 maternal deaths; the MMR was 489 per 100,000 live births and 83% (44 of 53) of maternal deaths occurred at home. Ninety percent (9,863) of births were at home, 4% (430) at health posts, 2·5% (282) at health centres, and 3·5% (412) in hospitals. MMR increased if the male partners were illiterate (609 vs. 346; p = 0·051) and if the villages had no road access (946 vs. 410; p = 0·039). The validation helped to increase the registration coverage by 10% through feedback discussions. It is possible to obtain high-coverage birth registration and measure MMR in rural communities where a functional system of community health workers exists. The MMR was high in rural south Ethiopia and most births and maternal deaths occurred at home.

  16. Arrayed primer extension in the "array of arrays" format: a rational approach for microarray-based SNP genotyping

    DEFF Research Database (Denmark)

    Klitø, Niels G F; Tan, Qihua; Nyegaard, Mette;

    2007-01-01

    This study provides a new version of the arrayed primer extension (APEX) protocol adapted to the 'array of arrays' platform using an instrumental setup for microarray processing not previously described. The primary aim of the study is to implement a system for rational cost-efficient genotyping...

  17. Differential Cyclic Voltammetry - a Novel Technique for Selective and Simultaneous Detection using Redox Cycling Based Sensors

    OpenAIRE

    Odijk, M.; Wiedemair, J.; Megen, M.J.J; Olthuis, W.; Van den Berg, A.

    2010-01-01

    Redox cycling (RC) is an effect that is used to amplify electrochemical signals. However, traditional techniques such as cyclic voltammetry (CV) do not provide clear insight for a mixture of multiple redox couples while RC is applied. Thus, we have developed a new measurement technique which delivers electrochemical spectra of all reversible redox couples present based on concentrations and standard potentials. This technique has been named differential cyclic voltammetry (DCV). We have fabri...

  18. MVClustViz: A Novice Yet Simple Multivariate Cluster Visualization Technique for Centroid-based Clusters

    OpenAIRE

    Sagar S. De; Minati Mishra; Satchidananda Dehuri

    2013-01-01

    In visual data mining, visualization of clusters is a challenging task. Although many techniques have already been developed, challenges still remain in representing large volumes of data with multiple dimensions and overlapped clusters. In this paper, a multivariate cluster visualization technique (MVClustViz) is presented to visualize centroid-based clusters. The geographic projection technique supports multi-dimensional, large-volume, and both crisp and fuzzy clusters visual...

  19. Image based techniques for determining spread patterns of centrifugal fertilizer spreaders

    OpenAIRE

    Cool, Simon; Jan G Pieters; Koen C. Mertens; Nuyttens, D.; Hijazi, Bilal; Dubois, Julien; Cointault, Frédéric; Vangeyte, Jürgen

    2015-01-01

    Precision fertilization requires new techniques for determining the spread pattern of fertilizer spreaders. Because of the accuracy and non-intrusive nature, techniques based on digital image processing are most promising. Using image processing, dynamics of particles leaving the spreader can be determined. Combined with a ballistic flight model, this allows predicting the landing position of individual fertilizer particles. In a first approach, a two-dimensional imaging technique was used wi...

  20. Intrusion Detection Systems Based on Artificial Intelligence Techniques in Wireless Sensor Networks

    OpenAIRE

    Nabil Ali Alrajeh; Lloret, J

    2013-01-01

    Intrusion detection system (IDS) is regarded as the second line of defense against network anomalies and threats. IDS plays an important role in network security. There are many techniques which are used to design IDSs for specific scenario and applications. Artificial intelligence techniques are widely used for threats detection. This paper presents a critical study on genetic algorithm, artificial immune, and artificial neural network (ANN) based IDSs techniques used in wireless sensor netw...

  1. Wood lens design philosophy based on a binary additive manufacturing technique

    Science.gov (United States)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  2. INTELLIGENT CAR STYLING TECHNIQUE AND SYSTEM BASED ON A NEW AERODYNAMIC-THEORETICAL MODEL

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A car styling technique based on a new theoretical model of automotive aerodynamics is introduced, which is proved to be feasible and effective by wind tunnel tests. The development of a multi-module software system from this technique, including modules for knowledge processing, referential styling and ANN-based aesthetic evaluation, capable of assisting car styling work in an intelligent way, is also presented and discussed.

  3. A new approach of binary addition and subtraction by non-linear material based switching technique

    Indian Academy of Sciences (India)

    Archan Kumar Das; Partha Partima Das; Sourangshu Mukhopadhyay

    2005-02-01

    Here, we present a new proposal for binary addition as well as subtraction in the all-optical domain by exploiting a suitable non-linear material-based switching technique. In this communication, the authors extend this technique to both an adder and a subtractor accommodating the spatial input encoding system.

  4. Application of USP inlet extensions to the TSI impactor system 3306/3320 using HFA 227 based solution metered dose inhalers.

    Science.gov (United States)

    Mogalian, Erik; Myrdal, Paul Brian

    2005-12-01

    The objective of this study was to further evaluate the need for a vertical inlet extension when testing solution metered dose inhalers using the TSI Model 3306 Impactor Inlet in conjunction with the TSI Model 3320 Aerodynamic Particle Sizer (APS). The configurations tested using the TSI system were compared to baseline measurements that were performed using the Andersen Mark II 8-stage cascade impactor (ACI). Seven pressurized solution metered dose inhalers were tested using varied concentrations of beclomethasone dipropionate (BDP), ethanol, and HFA 227 propellant. The inhalers were tested with the cascade impactor and with the TSI system. The TSI system had three different configurations: as provided by the manufacturer (0 cm) or with inlet extensions of 20 and 40 cm. The extensions were located between the USP inlet and the Model 3306 Impactor Inlet. There were no practical differences between the systems for the stem, actuator, or USP inlet. The fine particle mass (aerodynamic mass < 4.7 microm) was affected by extension length and correlated well with the ACI when an extension was present. APS particle size measurements were unaffected by the extension lengths and correlated well to the particle size determined from the ACI analysis. It has been confirmed that an inlet extension may be necessary for the TSI system in order to give mass results that correlate to the ACI, especially for formulations having significant concentrations of low volatility excipients. Additionally, the results generated from this study were used to evaluate the product performance of HFA 227 based solution formulations that contain varying concentrations of ethanol as a cosolvent. PMID:16316853

  5. Measuring glioma volumes: A comparison of linear measurement based formulae with the manual image segmentation technique

    Directory of Open Access Journals (Sweden)

    Sanjeev A Sreenivasan

    2016-01-01

    Conclusions: Manual region of interest-based image segmentation is the standard technique for measuring glioma volumes. For routine clinical use, the simple formula v = abc/2 (or the formula for the volume of an ellipsoid) could be used as an alternative.
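    The two formulae compared in the study differ only in the constant (1/2 versus π/6 ≈ 0.524). A minimal sketch with a hypothetical lesion, taking a, b, c as the three orthogonal linear measurements:

    ```python
    import math

    def volume_abc_over_2(a, b, c):
        """Bedside approximation: v = a*b*c/2 from three orthogonal diameters (cm)."""
        return a * b * c / 2.0

    def volume_ellipsoid(a, b, c):
        """Ellipsoid volume with a, b, c as full diameters: v = (pi/6)*a*b*c."""
        return math.pi / 6.0 * a * b * c

    # A hypothetical 4 cm x 3 cm x 2 cm lesion:
    print(volume_abc_over_2(4, 3, 2))           # 12.0 cm^3
    print(round(volume_ellipsoid(4, 3, 2), 2))  # 12.57 cm^3
    ```

    The closeness of the two results (abc/2 simply rounds π/6 down to 0.5) is why the simpler formula is an acceptable clinical substitute.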

  6. Novel Metaknowledge-based Processing Technique for Multimedia Big Data clustering challenges

    OpenAIRE

    Bari, Nima; Vichr, Roman; Kowsari, Kamran; Berkovich, Simon Y.

    2015-01-01

    Past research has challenged us with the task of showing relational patterns between text-based data and then clustering for predictive analysis using Golay Code technique. We focus on a novel approach to extract metaknowledge in multimedia datasets. Our collaboration has been an on-going task of studying the relational patterns between datapoints based on metafeatures extracted from metaknowledge in multimedia datasets. Those selected are significant to suit the mining technique we applied, ...

  7. Finite element modelling of non-bonded piezo sensors for biomedical health monitoring of bones based on EMI technique

    Science.gov (United States)

    Srivastava, Shashank; Bhalla, Suresh; Madan, Alok; Gupta, Ashok

    2016-04-01

    Extensive research is currently underway across the world on employing piezo sensors for biomedical health monitoring in view of their obvious advantages such as low cost, fast dynamic response and bio-compatibility. However, one of the limitations of the piezo sensor in bonded mode based on the electro-mechanical impedance (EMI) technique is that it can cause harmful effects to humans in terms of irritation, bone and skin disease. This paper, which is in continuation of the recent demonstration of the non-bonded configuration, is a step towards simulating and analyzing the non-bonded configuration of the piezo sensor for gauging its effectiveness using FEA software. It has been noted that the conductance signatures obtained in non-bonded mode are significantly close to those of the conventional bonded configuration, thus giving a positive indication of its field use.

  8. DEVELOPMENT OF OBSTACLE AVOIDANCE TECHNIQUE IN WEB-BASED GEOGRAPHIC INFORMATION SYSTEM FOR TRAFFIC MANAGEMENT USING OPEN SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    Nik Mohd Ramli Nik Yusoff

    2014-01-01

    Full Text Available The shortest path routing is one of the well-known network analysis techniques implemented in road management systems. pgRouting, an extension of the PostgreSQL/PostGIS database, is an open source library that implements the Dijkstra shortest path algorithm. However, the functionality to avoid obstacles in that analysis is still limited. Therefore, this study was conducted to enable an obstacle avoidance function in the existing pgRouting algorithm using the OpenStreetMap road network. By implementing this function, it enhances the Dijkstra algorithm's ability in network analysis. In this study a dynamic restriction feature is added at the program level to represent obstacles on the road. With this modification the algorithm is now able to generate an alternative route by avoiding obstacles on the roads. By using OpenLayers and PHP, a web-based GIS platform was developed to ease the system's usability.
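    The dynamic-restriction idea (drop blocked edges at query time, then run Dijkstra) can be sketched outside the pgRouting/PostGIS stack. The road graph and obstacle below are hypothetical:

    ```python
    import heapq

    def dijkstra(edges, start, goal, blocked=frozenset()):
        """Shortest path over an undirected weighted graph, skipping 'blocked'
        edges: the dynamic-restriction idea, where obstacles remove edges
        at query time rather than being baked into the stored network."""
        adj = {}
        for u, v, w in edges:
            adj.setdefault(u, []).append((v, w))
            adj.setdefault(v, []).append((u, w))
        dist, prev = {start: 0.0}, {}
        pq = [(0.0, start)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == goal:
                break
            if d > dist.get(u, float("inf")):
                continue  # stale queue entry
            for v, w in adj.get(u, []):
                if (u, v) in blocked or (v, u) in blocked:
                    continue  # obstacle: edge unusable for this query
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(pq, (nd, v))
        if goal not in dist:
            return None, float("inf")
        path, node = [goal], goal
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1], dist[goal]

    roads = [("A", "B", 1), ("B", "D", 1), ("A", "C", 2), ("C", "D", 2)]
    print(dijkstra(roads, "A", "D"))                       # shortest: A-B-D
    print(dijkstra(roads, "A", "D", blocked={("B", "D")})) # detour via C
    ```

    Blocking an edge forces the detour without rebuilding the graph, which is the behaviour the study adds to pgRouting at the program level.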

  9. A quality control technique based on UV-VIS absorption spectroscopy for tequila distillery factories

    Science.gov (United States)

    Barbosa Garcia, O.; Ramos Ortiz, G.; Maldonado, J. L.; Pichardo Molina, J.; Meneses Nava, M. A.; Landgrave, Enrique; Cervantes, M. J.

    2006-02-01

    A low cost technique based on UV-VIS absorption spectroscopy is presented for the quality control of the spirit drink known as tequila. It is shown that such spectra offer enough information to discriminate a given spirit drink from a group of bottled commercial tequilas. The technique was applied to white tequilas. Contrary to reference analytic methods, such as chromatography, this technique requires neither special personnel training nor sophisticated instrumentation. By using hand-held instrumentation the technique can be applied in situ during the production process.

  10. Identification of a single base-pair mutation of TAA (Stop codon) → GAA (Glu) that causes light chain extension in a CHO cell derived IgG1

    Science.gov (United States)

    Zhang, Taylor; Huang, Yungfu; Chamberlain, Scott; Romeo, Tony; Zhu-Shimoni, Judith; Hewitt, Daniel; Zhu, Mary; Katta, Viswanatham; Mauger, Brad; Kao, Yung-Hsiang

    2012-01-01

    We describe here the identification of a stop codon TAA (Stop) → GAA (Glu) = Stop221E mutation on the light chain of a recombinant IgG1 antibody expressed in a Chinese hamster ovary (CHO) cell line. The extended light chain variants, which were caused by translation beyond the mutated stop codon to the next alternative in-frame stop codon, were observed by mass spectra analysis. The abnormal peptide peaks present in tryptic and chymotryptic LC–MS peptide mapping were confirmed by N-terminal sequencing as C-terminal light chain extension peptides. Furthermore, LC-MS/MS of Glu-C peptide mapping confirmed the stop221E mutation, which is consistent with a single base-pair mutation in TAA (stop codon) to GAA (Glu). The light chain variants were approximately 13.6% of wild type light chain as estimated by RP-HPLC analysis. DNA sequencing techniques determined a single base pair stop codon mutation, instead of a stop codon read-through, as the cause of this light chain extension. To our knowledge, the stop codon mutation has not been reported for IgGs expressed in CHO cells. These results demonstrate orthogonal techniques should be implemented to characterize recombinant proteins and select appropriate cell lines for production of therapeutic proteins because modifications could occur at unexpected locations. PMID:23018810
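    The effect of the single-base TAA → GAA change can be illustrated with a toy translation sketch. The sequence and the minimal codon table are hypothetical, chosen only to show read-through to the next in-frame stop:

    ```python
    # Minimal codon table covering this illustration only (not the full genetic code).
    CODONS = {"GCT": "A", "AAA": "K", "GAA": "E", "GGT": "G", "TAA": "*", "TGA": "*"}

    def translate(dna):
        """Translate in-frame codons until a stop codon ('*') is reached."""
        protein = []
        for i in range(0, len(dna) - 2, 3):
            aa = CODONS[dna[i:i + 3]]
            if aa == "*":
                break
            protein.append(aa)
        return "".join(protein)

    wild_type = "GCTAAATAAGGTGCTTGA"            # codons: GCT AAA TAA GGT GCT TGA
    mutant = wild_type[:6] + "GAA" + wild_type[9:]  # single base T->G in the stop codon
    print(translate(wild_type))  # AK     -- stops at the normal TAA
    print(translate(mutant))     # AKEGA  -- reads through to the next in-frame stop
    ```

    The mutated chain is extended by exactly the residues encoded between the original stop position and the next alternative in-frame stop codon, mirroring the light chain extension characterized in the study.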

  11. Image Stitching System Based on ORB Feature-Based Technique and Compensation Blending

    Directory of Open Access Journals (Sweden)

    Ebtsam Adel

    2015-09-01

    Full Text Available The construction of a high-resolution panoramic image from a sequence of overlapping input images of the same scene is called image stitching/mosaicing. It is considered an important, challenging topic in computer vision, multimedia, and computer graphics. The quality of the mosaic image and the time cost are the two primary parameters for measuring stitching performance. Therefore, the main objective of this paper is to introduce a high-quality image stitching system with the least computation time. First, we compare many different feature detectors. We test the Harris corner detector, SIFT, SURF, FAST, GoodFeaturesToTrack, MSER, and ORB techniques to measure the detection rate of correctly detected keypoints and the processing time. Second, we evaluate the implementation of different common categories of image blending methods to increase the quality of the stitching process. From the experimental results, we conclude that the ORB algorithm is the fastest and most accurate and offers the highest performance. In addition, Exposure Compensation is the highest-quality blending method. Finally, we have generated an image stitching system based on ORB using the Exposure Compensation blending method.
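    The exposure-compensation idea can be illustrated with a simplified gain-plus-feathering blend of two pre-aligned overlapping strips. This is a sketch only; the paper's pipeline registers the images first (via ORB features) and uses a full exposure compensator, neither of which is reproduced here:

    ```python
    import numpy as np

    def blend_pair(left, right, overlap):
        """Gain-compensate 'right' to match 'left' over the overlap region,
        then feather-blend across the seam. A simplified stand-in for the
        exposure-compensation blending step of a stitching pipeline."""
        l_ov = left[:, -overlap:].astype(float)
        r_ov = right[:, :overlap].astype(float)
        gain = l_ov.mean() / r_ov.mean()        # single global gain (assumption)
        right = right.astype(float) * gain
        alpha = np.linspace(1.0, 0.0, overlap)  # feathering ramp across the seam
        seam = alpha * l_ov + (1 - alpha) * right[:, :overlap]
        return np.hstack([left[:, :-overlap].astype(float), seam, right[:, overlap:]])
    ```

    With two constant-intensity strips that differ only in exposure (say 100 vs. 80), the gain step equalizes them and the seam disappears entirely, which is the visual artifact exposure compensation is meant to remove.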

  12. Multi technique amalgamation for enhanced information identification with content based image data.

    Science.gov (United States)

    Das, Rik; Thepade, Sudeep; Ghosh, Saurav

    2015-01-01

    Image data has emerged as a resourceful foundation for information with proliferation of image capturing devices and social media. Diverse applications of images in areas including biomedicine, military, commerce, education have resulted in huge image repositories. Semantically analogous images can be fruitfully recognized by means of content based image identification. However, the success of the technique has been largely dependent on extraction of robust feature vectors from the image content. The paper has introduced three different techniques of content based feature extraction based on image binarization, image transform and morphological operator respectively. The techniques were tested with four public datasets namely, Wang Dataset, Oliva Torralba (OT Scene) Dataset, Corel Dataset and Caltech Dataset. The multi technique feature extraction process was further integrated for decision fusion of image identification to boost up the recognition rate. Classification result with the proposed technique has shown an average increase of 14.5 % in Precision compared to the existing techniques and the retrieval result with the introduced technique has shown an average increase of 6.54 % in Precision over state-of-the art techniques. PMID:26798574

  13. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    Science.gov (United States)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, due to the evaluation of EBIC equipment performance and the numerical optimization of equipment items, the constant acquisition of high contrast images has become possible, improving the reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  14. Android Access Control Extension

    Directory of Open Access Journals (Sweden)

    Anton Baláž

    2015-12-01

    Full Text Available The main objective of this work is to analyze and extend the security model of mobile devices running Android OS. The provided security extension is a Linux kernel security module that allows the system administrator to restrict a program's capabilities with per-program profiles. Profiles can allow capabilities like network access, raw socket access, and the permission to read, write, or execute files on matching paths. The module supplements the traditional Android capability access control model by providing mandatory access control (MAC) based on path. This extension increases the security of access to system objects in a device and allows creating security sandboxes per application.
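    The path-based MAC idea can be sketched in user-space terms. The profile contents and the check_access helper below are hypothetical, for illustration only; the actual module enforces such rules inside the Linux kernel:

    ```python
    import fnmatch

    # A hypothetical per-program profile: glob path patterns with allowed modes.
    PROFILE = {
        "rules": [
            ("/data/data/com.example.app/*", "rw"),  # app's own data: read/write
            ("/system/lib/*", "r"),                  # system libraries: read-only
        ],
    }

    def check_access(profile, path, mode):
        """Mandatory access control by path: deny unless some rule
        matches the path and permits every requested mode character."""
        for pattern, allowed in profile["rules"]:
            if fnmatch.fnmatch(path, pattern) and all(m in allowed for m in mode):
                return True
        return False  # default deny

    print(check_access(PROFILE, "/data/data/com.example.app/db", "rw"))  # True
    print(check_access(PROFILE, "/system/lib/libc.so", "w"))             # False
    ```

    The default-deny structure is what makes this mandatory rather than discretionary: anything a profile does not explicitly allow is refused, regardless of classic file permissions.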

  15. Simulation-driven design by knowledge-based response correction techniques

    CERN Document Server

    Koziel, Slawomir

    2016-01-01

    Focused on efficient simulation-driven multi-fidelity optimization techniques, this monograph on simulation-driven optimization covers simulations utilizing physics-based low-fidelity models, often based on coarse-discretization simulations or other types of simplified physics representations, such as analytical models. The methods presented in the book exploit as much as possible any knowledge about the system or device of interest embedded in the low-fidelity model with the purpose of reducing the computational overhead of the design process. Most of the techniques described in the book are of response correction type and can be split into parametric (usually based on analytical formulas) and non-parametric, i.e., not based on analytical formulas. The latter, while more complex in implementation, tend to be more efficient. The book presents a general formulation of response correction techniques as well as a number of specific methods, including those based on correcting the low-fidelity model response (out...

  16. Nasal base narrowing of the caucasian nose through the cerclage technique

    Directory of Open Access Journals (Sweden)

    Mocellin, Marcos

    2010-06-01

    Full Text Available Introduction: Several techniques can be performed to reduce (narrow) the nasal base, such as vestibular and columellar skin resection, elliptical skin resection at the narial rim, skin undermining and advancement (V-Y technique of Bernstein), and the use of cerclage sutures at the nasal base. Objective: To evaluate the cerclage technique performed at the nasal base, through endonasal rhinoplasty without delivery of the basic technique, in the Caucasian nose, reducing the inter-alar distance and correcting alar flare, with consequent improvement in nasal harmony across the whole face. Methods: A retrospective study based on analysis of clinical documents and photos of 43 patients in whom cerclage of the nasal base was performed by resecting a skin ellipse in the region of the vestibule and the nasal base (modified technique of Weir), using colorless mononylon® 4 "0" with a straight cutting needle. The study was conducted in 2008 and 2009 at the Hospital of the Paraná Institute of Otolaryngology - IPO in Curitiba, Paraná - Brazil. Patients had a follow-up ranging from 7-12 months. Results: In 100% of cases an improvement in nasal harmony was achieved by decreasing the inter-alar distance. Conclusion: Cerclage with minimal resection of vestibular skin and the nasal base is an effective method for narrowing of the nasal base in the Caucasian nose, with predictable results, and is easy to perform.

  17. A Novel Graph Based Fuzzy Clustering Technique For Unsupervised Classification Of Remote Sensing Images

    Science.gov (United States)

    Banerjee, B.; Krishna Moohan, B.

    2014-11-01

    This paper addresses the problem of unsupervised land-cover classification of multi-spectral remotely sensed images in the context of self-learning by exploring different graph based clustering techniques hierarchically. The only assumption used here is that the number of land-cover classes is known a priori. Object based image analysis paradigm which processes a given image at different levels, has emerged as a popular alternative to the pixel based approaches for remote sensing image segmentation considering the high spatial resolution of the images. A graph based fuzzy clustering technique is proposed here to obtain a better merging of an initially oversegmented image in the spectral domain compared to conventional clustering techniques. Instead of using Euclidean distance measure, the cumulative graph edge weight is used to find the distance between a pair of points to better cope with the topology of the feature space. In order to handle uncertainty in assigning class labels to pixels, which is not always a crisp allocation for remote sensing data, fuzzy set theoretic technique is incorporated to the graph based clustering. Minimum Spanning Tree (MST) based clustering technique is used to over-segment the image at the first level. Furthermore, considering that the spectral signature of different land-cover classes may overlap significantly, a self-learning based Maximum Likelihood (ML) classifier coupled with the Expectation Maximization (EM) based iterative unsupervised parameter retraining scheme is used to generate the final land-cover classification map. Results on two medium resolution images establish the superior performance of the proposed technique in comparison to the traditional fuzzy c-means clustering technique.
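    The MST-based over-segmentation step can be illustrated with a minimal Kruskal-style sketch: build the Euclidean minimum spanning tree, then cut the heaviest edges so k components remain. The points below are hypothetical:

    ```python
    def mst_clusters(points, k):
        """Cluster by building a Euclidean MST (Kruskal with union-find)
        and cutting the k-1 heaviest tree edges."""
        n = len(points)
        d2 = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
        edges = sorted((d2(points[i], points[j]), i, j)
                       for i in range(n) for j in range(i + 1, n))
        parent = list(range(n))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        mst = []
        for w, i, j in edges:                  # Kruskal: lightest edges first
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj
                mst.append((w, i, j))
        # keep only the n-k lightest MST edges, i.e. cut the k-1 heaviest,
        # then read off connected components as cluster labels
        parent = list(range(n))
        for w, i, j in sorted(mst)[: n - k]:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj
        return [find(i) for i in range(n)]

    pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11)]
    labels = mst_clusters(pts, 2)
    ```

    Cutting tree edges by weight is what lets MST clustering follow the topology of the feature space rather than assuming compact, spherical clusters, which matches the paper's use of cumulative graph edge weight instead of plain Euclidean distance.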

  18. A GENERIC APPROACH TO CONTENT BASED IMAGE RETRIEVAL USING DCT AND CLASSIFICATION TECHNIQUES

    OpenAIRE

    RAMESH BABU DURAI C; Dr.V.DURAISAMY

    2010-01-01

    With the rapid development of technology, traditional information retrieval techniques based on keywords are not sufficient; content-based image retrieval (CBIR) has been an active research topic. Content Based Image Retrieval (CBIR) technologies provide a method to find images in large databases by using unique descriptors from a trained image. The ability of the system to classify images based on the training set feature extraction is quite challenging. In this paper we propose to extra...

  19. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  20. An Analysis of the Risk in Discretely Rebalanced Option Hedges and Delta-Based Techniques

    OpenAIRE

    Russell P. Robins; Barry Schachter

    1994-01-01

    The stochastic properties of discretely rebalanced option hedges have been studied extensively beginning with Black and Scholes (1973). In each analysis hedges were "delta-neutral" after rebalancing. We argue that the distributional properties of discretely rebalanced hedges are such that delta-based hedging is not the variance minimizing strategy. This paper obtains analytical expressions for the variance minimizing option hedge ratios. We also evaluate the hedge variance to assess the magni...
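    The variance of a discretely rebalanced delta hedge can be estimated by Monte Carlo. This is a sketch under textbook Black-Scholes assumptions (zero rates, hypothetical parameters), not the authors' analytical expressions:

    ```python
    import math, random

    def bs_delta(S, K, T, sigma, r=0.0):
        """Black-Scholes call delta, N(d1)."""
        d1 = (math.log(S / K) + (r + 0.5 * sigma * sigma) * T) / (sigma * math.sqrt(T))
        return 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))

    def hedge_pnl_variance(n_rebalance, n_paths=2000, S0=100.0, K=100.0,
                           T=0.25, sigma=0.2, seed=1):
        """Monte-Carlo variance of a short-call P&L hedged by discrete
        delta rebalancing (the option premium, a constant, is ignored)."""
        rng = random.Random(seed)
        dt = T / n_rebalance
        pnls = []
        for _ in range(n_paths):
            S, cash, delta = S0, 0.0, 0.0
            for step in range(n_rebalance):
                t = step * dt
                new_delta = bs_delta(S, K, T - t, sigma)
                cash -= (new_delta - delta) * S       # trade shares to the new delta
                delta = new_delta
                z = rng.gauss(0.0, 1.0)               # zero-drift GBM step
                S *= math.exp(-0.5 * sigma**2 * dt + sigma * math.sqrt(dt) * z)
            pnl = cash + delta * S - max(S - K, 0.0)  # liquidate hedge, pay payoff
            pnls.append(pnl)
        m = sum(pnls) / len(pnls)
        return sum((p - m) ** 2 for p in pnls) / (len(pnls) - 1)
    ```

    Running this for increasing rebalancing frequencies shows the residual hedge variance shrinking roughly in proportion to 1/n, the discreteness effect the literature above analyzes; whether delta-neutral rebalancing actually minimizes that variance is the question the paper takes up.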

  1. Rational extensions of the trigonometric Darboux-Pöschl-Teller potential based on para-Jacobi polynomials

    Science.gov (United States)

    Bagchi, B.; Grandati, Y.; Quesne, C.

    2015-06-01

    The possibility for the Jacobi equation to admit, in some cases, general solutions that are polynomials has been recently highlighted by Calogero and Yi, who termed them para-Jacobi polynomials. Such polynomials are used here to build seed functions of a Darboux-Bäcklund transformation for the trigonometric Darboux-Pöschl-Teller potential. As a result, one-step regular rational extensions of the latter depending both on an integer index n and on a continuously varying parameter λ are constructed. For each n value, the eigenstates of these extended potentials are associated with a novel family of λ-dependent polynomials, which are orthogonal on [-1,1].

  2. Effectiveness of Agricultural Extension Activities

    Directory of Open Access Journals (Sweden)

    Ali AL-Sharafat

    2012-01-01

    Full Text Available Problem statement: Jordan's agricultural extension service is seriously under-staffed and its effectiveness is consequently compromised. Reservations are being expressed about the performance and capability of the agricultural extension system in Jordan. The performance of this sector has been disappointing and has failed to transfer agricultural technology to the farmers. The main objective of this study is to assess the effectiveness of Jordan's agricultural extension services. Approach: The effect of extension services on olive productivity in the study area was investigated. A total number of 60 olive producers were selected to be interviewed for this study. This number was enough to achieve the study objectives. The interviewed producers were distributed almost equally within olive production locations in the study area. The sample was obtained through the simple random sampling technique. The two groups were chosen and distributed randomly into an experimental group (30 farmers; 10 for each source of extension service) and a control group (30 farmers). The experimental group received extension services and the control group received no extension services. Two interview-cum-structured questionnaires were designed and used to collect information and data for this study. The first instrument was designed for farmers who received extension services and the second for farmers who received no extension services. Another questionnaire was designed for administrators of extension organizations concerned with providing extension services to farmers. To find the differences that may exist between the two studied groups, One Way Analysis of Variance (ANOVA), t-test and LSD test via the Statistical Package for Social Sciences software (SPSS) were used. The average net profit obtained from an area of one dunum of olive farm was the main item considered in determining the effectiveness of agricultural extension activities. Results and Conclusion: The results of
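    The group comparison described above (a t-test on net profit between served and control farmers) can be sketched with a pooled-variance t statistic. The profit figures below are hypothetical, for illustration only:

    ```python
    from math import sqrt
    from statistics import mean, variance

    def two_sample_t(a, b):
        """Pooled-variance two-sample t statistic (equal-variance assumption),
        as used to compare mean profits of extension vs. control groups."""
        na, nb = len(a), len(b)
        sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
        return (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

    # Hypothetical net-profit samples (per dunum) for illustration only:
    served = [520, 480, 510, 495, 530, 505]
    control = [450, 430, 470, 440, 455, 445]
    t = two_sample_t(served, control)
    print(round(t, 2))
    ```

    A large positive t against na + nb - 2 degrees of freedom is the evidence pattern a study like this would report for an extension-service effect; in practice the statistic and its p-value come straight from an SPSS or similar ANOVA/t-test routine.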

  3. Advanced trap spectroscopy using the glow rate technique based on bleaching of color centers

    International Nuclear Information System (INIS)

    The glow rate technique (GRT) is the extension of the known heating rate method to the full glow curve. The GRT, like the fractional glow technique (FGT), offers a procedure for evaluation of the mean activation energy as a function of temperature in the case of arbitrary thermostimulated relaxation kinetics represented by a trap distribution function. The experimental procedure involves at least two subsequent measurements of thermostimulated recombination kinetics at different heating rates. The extension of the GRT to direct measurements of thermostimulated bleaching of radiation-induced color centers is presented. The experimental procedure involves measurements of the decay of radiation-induced absorption spectra of color centers in preliminarily irradiated materials during linear heating. A procedure for evaluation of the trap energy and frequency factor spectrum is considered in the paper. Results of the application of the GRT to analysis of the parameters of thermostimulated decay of color centers are presented for the decay of radiation-induced defects in LiBaF3 crystals irradiated by X-rays. It is shown that decay of F-type centers occurs in two steps, the activation energy slightly decreasing from 0.660±0.003 eV in the first step (300-370 K) to 0.615±0.003 eV (400-480 K) in the second step.
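    The heating-rate idea underlying the GRT can be illustrated with the classical two-heating-rate (Kissinger-type) estimate of activation energy from the shift of a glow peak; the peak temperatures and rates below are hypothetical:

    ```python
    from math import log

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def activation_energy(beta1, tm1, beta2, tm2):
        """Kissinger-style two-heating-rate estimate:
        ln(beta/Tm^2) = -E/(k*Tm) + const, so E follows from two (beta, Tm) pairs."""
        lhs = log((beta1 / tm1**2) / (beta2 / tm2**2))
        return K_B * lhs / (1.0 / tm2 - 1.0 / tm1)

    # Hypothetical glow peaks: 0.5 K/s peaking at 350 K, 2.0 K/s peaking at 362 K
    print(round(activation_energy(0.5, 350.0, 2.0, 362.0), 3))  # ~1.2 eV
    ```

    The GRT generalizes this two-point idea to the full glow curve, yielding the mean activation energy as a continuous function of temperature rather than a single number per peak.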

  4. Applications of synchrotron-based X-ray techniques in environmental science

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Synchrotron-based X-ray techniques have been widely applied in the fields of environmental science due to their element-specific and nondestructive properties and their unique spectral and spatial resolution advantages. The techniques are capable of in situ investigation of chemical speciation, microstructure and the mapping of the elements in question at the molecular or nanometer scale, and thus provide direct evidence of reaction mechanisms for various environmental processes. In this contribution, the applications of three types of techniques commonly used in environmental research are reviewed, namely X-ray absorption spectroscopy (XAS), X-ray fluorescence (XRF) spectroscopy and scanning transmission X-ray microscopy (STXM). In particular, the recent advances of these techniques in China are elaborated, and a selection of applied examples in the field of environmental science is provided. Finally, the perspectives of synchrotron-based X-ray techniques are discussed. With their great progress and wide application, the techniques have revolutionized our understanding of significant geo- and biochemical processes. It is anticipated that synchrotron-based X-ray techniques will continue to play a significant role in these fields and that significant advances will be obtained in the decades ahead.

  5. A Comparative Analysis of Density Based Clustering Techniques for Outlier Mining

    Directory of Open Access Journals (Sweden)

    R. Prabahari

    2014-11-01

    Full Text Available Density-based clustering algorithms such as Density-Based Spatial Clustering of Applications with Noise (DBSCAN), Ordering Points To Identify the Clustering Structure (OPTICS) and DENsity-based CLUstering (DENCLUE) are designed to discover clusters of arbitrary shape. DBSCAN grows clusters according to a density-based connectivity analysis. OPTICS, an extension of DBSCAN, produces a cluster ordering obtained over a range of parameter settings. DENCLUE clusters objects based on a set of density distribution functions. The algorithms are compared in terms of essential parameters such as complexity, cluster shape, input parameters, noise handling, cluster quality and run time. The analysis is useful in finding which density-based clustering algorithm is suitable under different criteria.
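    The density-based connectivity analysis that DBSCAN performs can be sketched in a few lines. This is a minimal illustrative implementation, not code from the paper; `eps` and `min_pts` are the usual neighborhood-radius and density-threshold parameters:

    ```python
    import math

    def dbscan(points, eps, min_pts):
        """Minimal DBSCAN: grow clusters from core points via density
        connectivity; points reachable from no core point are noise (-1)."""
        def neighbors(i):
            return [j for j, q in enumerate(points)
                    if math.dist(points[i], q) <= eps]

        labels = [None] * len(points)   # None = unvisited
        cluster = 0
        for i in range(len(points)):
            if labels[i] is not None:
                continue
            seeds = neighbors(i)
            if len(seeds) < min_pts:
                labels[i] = -1          # provisionally noise
                continue
            labels[i] = cluster         # i is a core point: start a cluster
            queue = [j for j in seeds if j != i]
            while queue:
                j = queue.pop()
                if labels[j] == -1:
                    labels[j] = cluster # border point reclaimed from noise
                if labels[j] is not None:
                    continue
                labels[j] = cluster
                js = neighbors(j)
                if len(js) >= min_pts:  # j is itself a core point: expand
                    queue.extend(js)
            cluster += 1
        return labels
    ```

    Points whose eps-neighborhood contains at least `min_pts` points are core points; a cluster is the set of points density-reachable from them, and everything else is labeled noise (-1).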

  6. Repeat Customer Success in Extension

    Science.gov (United States)

    Bess, Melissa M.; Traub, Sarah M.

    2013-01-01

    Four multi-session research-based programs were offered by two Extension specialists in one rural Missouri county. Eleven participants who came to multiple Extension programs could be called "repeat customers." Based on the total number of participants for all four programs, 25% could be deemed repeat customers. Repeat customers had…

  7. Extension Worker as a Leader to Farmers: Influence of Extension Leadership Competencies and Organisational Commitment on Extension Workers’ Performance in Yemen

    Directory of Open Access Journals (Sweden)

    Prof. Dr. Turiman SUANDI

    2008-08-01

    Full Text Available Agricultural extension primarily deals with human resource development (HRD) and the transfer of technology and knowledge from agricultural research centres to farmers. Improving HRD within rural communities is essential for agriculture and community development. Extension workers are the professionals in the extension system responsible for developing individuals in the community. Hence, as the profession of extension work continues to expand, it is necessary to identify the leadership skills possessed by agricultural extension workers in order to gauge their performance in the extension system, and to determine the predictors of that performance. This article examined the relationship of leadership competencies and organizational commitment with performance as perceived by agricultural extension workers, and the predictors of performance. The leadership competency variables and commitment are important attributes of a leader. Data were collected from 290 respondents selected using a stratified sampling technique, with stratification based on the highland, coastal and desert regions. The findings showed that four variables contributed significantly to the level of extension workers' performance: competencies in program implementation, program evaluation and program planning, as well as organizational commitment. These predictor variables explain 45.3% of the variance in the job performance of extension workers. The study suggests that relevant ministries, such as the Ministry of Agriculture, should take into account the leadership characteristics of extension workers and how to improve those competencies, as well as organizational commitment, in order to upgrade their performance in developing rural communities through extension services.

  8. Arithmetic and Frequency Filtering Methods of Pixel-Based Image Fusion Techniques

    Directory of Open Access Journals (Sweden)

    Firouz Abdullah Al-Wassai

    2011-05-01

    Full Text Available In remote sensing, image fusion is a useful tool used to fuse high-spatial-resolution panchromatic images (PAN) with lower-spatial-resolution multispectral images (MS) to create a high-spatial-resolution multispectral fused image (F) while preserving the spectral information in the multispectral image (MS). Many PAN-sharpening, or pixel-based image fusion, techniques have been developed to try to enhance the spatial resolution and spectral-property preservation of the MS. This paper undertakes a study of image fusion using two types of pixel-based image fusion techniques: arithmetic combination and frequency filtering methods. The first type includes the Brovey Transform (BT), Color Normalized Transformation (CN) and the Multiplicative Method (MLT). The second type includes the High-Pass Filter Additive method (HPFA), High-Frequency Addition method (HFA), High-Frequency Modulation method (HFM) and the wavelet-transform-based fusion method (WT). The paper also concentrates on analytical techniques for evaluating the quality of the fused image (F) quantitatively, using various measures including Standard Deviation (SD), Entropy (En), Correlation Coefficient (CC), Signal-to-Noise Ratio (SNR), Normalized Root Mean Square Error (NRMSE) and Deviation Index (DI) to estimate the quality and degree of information improvement of a fused image.
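    Of the arithmetic-combination methods listed, the Brovey Transform is the simplest to state: each multispectral band is scaled by the ratio of the PAN image to the sum of the MS bands. A minimal numpy sketch (the array shapes and the `eps` guard are assumptions, not taken from the paper):

    ```python
    import numpy as np

    def brovey_fusion(ms, pan, eps=1e-12):
        """Brovey transform pan-sharpening.

        ms:  array of shape (bands, H, W), MS bands resampled to the PAN grid
        pan: array of shape (H, W), high-resolution panchromatic image
        Each band is multiplied by pan / sum(ms bands); eps avoids division
        by zero in dark pixels."""
        ratio = pan / (ms.sum(axis=0) + eps)
        return ms * ratio              # broadcast over the band axis
    ```

    The spectral ratios between bands are preserved at every pixel, which is the property the Brovey method is chosen for.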

  9. Laser-direct-write technique for rapid prototyping of multiplexed paper-based diagnostic sensors

    OpenAIRE

    He, Peijun; Katis, Ioannis; Eason, Robert; Sones, Collin

    2015-01-01

    We report the successful demonstration of a laser-based direct-write technique for patterning various porous materials to fabricate more diversified and multifunctional paper-based microfluidic devices that find applications in affordable point-of-care medical diagnostics.

  10. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    Science.gov (United States)

    Festa, G.; Pietropaolo, A.; Grazzi, F.; Sutton, L. F.; Scherillo, A.; Bognetti, L.; Bini, A.; Barzagli, E.; Schooneveld, E.; Andreani, C.

    2013-09-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation of materials. The aim of this study is to show the potential of applying multiple, integrated neutron-based techniques to musical instruments. Such samples, in the broad scenario of cultural heritage, represent an exciting research field and an interesting link between disciplines such as nuclear physics, metallurgy and acoustics.

  11. Improvement of medical education using web-based lecture repetition and extension: e-learning experiences of the Department of Obstetrics and Gynecology, University of Tuebingen

    OpenAIRE

    Wallwiener, Markus; Lammerding-Köppel, Maria; Schneider, Schneider; Schauf, Burkhard

    2006-01-01

    In order to improve the education of its medical students, the Department of Obstetrics and Gynecology of the University of Tuebingen established e-learning in terms of web-based lecture repetition and extension. Subsequent to lectures, questions are provided online. The participation is voluntary, but requires registration. The results of the analysed period (winter term 2004/2005, summer term 2005 and winter term 2005/2006) including more than 380 e-learning users are encouraging. An averag...

  12. A Predicate Based Fault Localization Technique Based On Test Case Reduction

    OpenAIRE

    Rohit Mishra; Dr.Raghav Yadav

    2015-01-01

    ABSTRACT In today's world, software testing with statistical fault localization techniques is among the most tedious, expensive and time-consuming activities. In a faulty program, the contrasting dynamic spectra of program elements are used to estimate the location of the fault. Coincidental correctness can have a negative impact on these techniques, because the fault can also be triggered in a non-failed run and, if so, disturb the assessment of the fault location. Now eliminating of confounding rules on the recognizing t...
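    As an illustration of how dynamic spectra are contrasted to rank program elements, a generic spectrum-based suspiciousness score such as Tarantula can be sketched (this is a standard score used in statistical fault localization, not necessarily the scoring this particular paper uses):

    ```python
    def tarantula(passed, failed, total_passed, total_failed):
        """Tarantula suspiciousness of a program element, given how many
        passed/failed runs covered it and the total counts of each kind.
        Elements covered mostly by failing runs score close to 1.0."""
        if passed == 0 and failed == 0:
            return 0.0                       # never executed: no evidence
        fail_ratio = failed / total_failed if total_failed else 0.0
        pass_ratio = passed / total_passed if total_passed else 0.0
        if fail_ratio + pass_ratio == 0:
            return 0.0
        return fail_ratio / (fail_ratio + pass_ratio)
    ```

    Coincidental correctness, as the abstract notes, skews exactly these counts: a passed run that nonetheless executed the fault inflates `passed` for the faulty element and lowers its score.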

  13. Review of Fluorescence-Based Velocimetry Techniques to Study High-Speed Compressible Flows

    Science.gov (United States)

    Bathel, Brett F.; Johansen, Craig; Inman, Jennifer A.; Jones, Stephen B.; Danehy, Paul M.

    2013-01-01

    This paper reviews five laser-induced fluorescence-based velocimetry techniques that have been used to study high-speed compressible flows at NASA Langley Research Center. The techniques discussed in this paper include nitric oxide (NO) molecular tagging velocimetry (MTV), nitrogen dioxide photodissociation (NO2-to-NO) MTV, and NO and atomic oxygen (O-atom) Doppler-shift-based velocimetry. Measurements of both single-component and two-component velocity have been performed using these techniques. This paper details the specific application and experiment for which each technique has been used, the facility in which the experiment was performed, the experimental setup, sample results, and a discussion of the lessons learned from each experiment.

  14. Interferometric Dynamic Measurement: Techniques Based on High-Speed Imaging or a Single Photodetector

    Directory of Open Access Journals (Sweden)

    Yu Fu

    2014-01-01

    Full Text Available In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured at any instant, and the Nyquist sampling theorem has to be satisfied along the time axis at each measurement point. Two types of techniques were developed for such measurements: one based on high-speed cameras and the other using a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capture rate, while photodetector-based technology can only measure at a single point. In this paper, several aspects of these two technologies are discussed. For camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For detector-based interferometry, the discussion mainly focuses on single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the efforts made by researchers to improve the measurement capabilities of interferometry-based techniques to meet the requirements of industrial applications.
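    For the laser Doppler vibrometers mentioned above, the surface velocity follows from the Doppler shift of the back-scattered light, Δf = 2v/λ, and the detector signal must still be sampled fast enough to satisfy the Nyquist criterion along the time axis. A minimal sketch (the numeric values in the usage below are purely illustrative):

    ```python
    def ldv_velocity(doppler_shift_hz, wavelength_m):
        """Surface velocity from the Doppler shift of back-scattered light:
        delta_f = 2*v/lambda  =>  v = delta_f * lambda / 2."""
        return doppler_shift_hz * wavelength_m / 2.0

    def nyquist_ok(doppler_shift_hz, sample_rate_hz):
        """The detector signal must be sampled above twice the highest
        Doppler frequency to avoid aliasing."""
        return sample_rate_hz > 2.0 * doppler_shift_hz
    ```

    For example, a 2 MHz shift at a 633 nm laser wavelength corresponds to a surface velocity of 0.633 m/s, and resolving it requires sampling above 4 MHz.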

  15. Novel anti-jamming technique for OCDMA network through FWM in SOA based wavelength converter

    Science.gov (United States)

    Jyoti, Vishav; Kaler, R. S.

    2013-06-01

    In this paper, we propose a novel anti-jamming technique for optical code division multiple access (OCDMA) networks based on four-wave mixing (FWM) in a semiconductor optical amplifier (SOA) wavelength converter. An OCDMA signal can easily be jammed by a high-power jamming signal. It is shown that wavelength conversion through four-wave mixing in an SOA has improved jamming resistance. It is observed that, using the proposed technique, the jammer has no effect on the OCDMA network even at high jamming powers.
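    In the degenerate FWM configuration commonly used in SOA wavelength converters, the converted wave appears at f_conv = 2·f_pump − f_signal. A small sketch of that frequency bookkeeping (the wavelengths in the usage below are illustrative assumptions, not values from the record):

    ```python
    C = 299_792_458.0  # speed of light, m/s

    def fwm_converted_wavelength(pump_nm, signal_nm):
        """Degenerate four-wave mixing: the converted (idler) wave sits at
        f_conv = 2*f_pump - f_signal; return its wavelength in nm."""
        f_pump = C / (pump_nm * 1e-9)
        f_signal = C / (signal_nm * 1e-9)
        f_conv = 2.0 * f_pump - f_signal
        return C / f_conv * 1e9
    ```

    The converted wave is mirrored about the pump: a signal 2 nm above a 1550 nm pump yields an idler roughly 2 nm below it.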

  16. On-line fault diagnosis of industrial processes based on artificial intelligence techniques

    OpenAIRE

    Calado, J. M. F.

    1996-01-01

    In this research, the application of artificial intelligence techniques to on-line process control and fault detection and diagnosis is investigated. The majority of the research concerns the use of artificial intelligence techniques in on-line fault detection and diagnosis of industrial processes. Several on-line approaches, including a rule-based controller and several fault detection and diagnosis systems, have been developed, implemented and described throughout this thesis. The research ...

  17. Post-Processing Techniques to Enhance Reliability of Assignment Algorithm Based Performance Measures

    OpenAIRE

    Peeta, Srinivas; Kumar, Amit; Sharma, Sushant

    2011-01-01

    This study develops an enhanced transportation planning framework by augmenting the sequential four-step planning process with post-processing techniques. The post-processing techniques are incorporated through a feedback mechanism and aim to improve the stability and convergence properties of the solution, thereby improving the reliability of the planning process. The proposed post-processing module has three building blocks: the slope-based multi-path algorithm (SMPA), perturbation as...

  18. Design Technique of I2L Circuits Based on Multi—Valued Logic

    Institute of Scientific and Technical Information of China (English)

    吴训威; 杭国强

    1996-01-01

    This paper proposes the use of current signals to express logic values and establishes the theory of grounded current switches suitable for I2L circuits. Based on the advantage that current signals are easy to add, a design technique for I2L circuits by means of multi-valued current signals is proposed. It is shown that a simpler structure of I2L circuits can be obtained with this technique.

  19. A Low Cost Vision Based Hybrid Fiducial Mark Tracking Technique for Mobile Industrial Robots

    Directory of Open Access Journals (Sweden)

    Mohammed Y Aalsalem

    2012-07-01

    Full Text Available The field of robotic vision is developing rapidly. Robots can react intelligently and provide assistance to user activities through sentient computing. Since industrial applications pose complex requirements that cannot be handled by humans, an efficient, low-cost and robust technique is required for the tracking of mobile industrial robots. The existing sensor-based techniques for mobile robot tracking are expensive and complex to deploy, configure and maintain, and some of them demand dedicated and often expensive hardware. This paper presents a low-cost vision-based technique called "Hybrid Fiducial Mark Tracking" (HFMT) for tracking a mobile industrial robot. The HFMT technique requires off-the-shelf hardware (CCD cameras) and printable 2-D circular marks used as fiducials for tracking a mobile industrial robot on a pre-defined path. The technique allows the robot to follow a predefined path by using the fiducials to detect right and left turns and a white strip to track the path itself. The HFMT technique was implemented and tested on an indoor mobile robot in our laboratory. Experimental results from the robot navigating in real environments have confirmed that the approach is simple and robust and can be adopted in any hostile industrial environment where humans are unable to work.

  20. Development of IR-Based Short-Range Communication Techniques for Swarm Robot Applications

    Directory of Open Access Journals (Sweden)

    RAMLI, A. R.

    2010-11-01

    Full Text Available This paper proposes several designs for reliable infrared-based communication techniques for swarm robotic applications. The communication system was deployed on an autonomous miniature mobile robot (AMiR), a swarm robotic platform developed earlier. In swarm applications, all participating robots must be able to communicate and share data, hence a suitable communication medium and a reliable technique are required. This work uses infrared radiation for the transmission of swarm robots' messages. Infrared transmission methods such as amplitude and frequency modulation are presented along with experimental results. Finally, the effects of the modulation techniques and other parameters on the collective behavior of swarm robots are analyzed.
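    In its simplest form, the amplitude-modulation option mentioned above amounts to on-off keying of the IR carrier. A minimal sketch (the 38 kHz carrier, bit rate and sample rate are common IR-link values assumed here, not taken from the record):

    ```python
    import math

    def ook_modulate(bits, carrier_hz=38_000, bit_rate=1_000, sample_rate=1_000_000):
        """On-off keying: gate an IR carrier with the bit stream.
        A '1' bit transmits samples of the carrier; a '0' bit transmits silence."""
        samples_per_bit = sample_rate // bit_rate
        out = []
        for i, bit in enumerate(bits):
            for n in range(samples_per_bit):
                t = (i * samples_per_bit + n) / sample_rate
                out.append(bit * math.sin(2 * math.pi * carrier_hz * t))
        return out
    ```

    A receiver tuned to the carrier frequency (e.g. a 38 kHz IR demodulator module) then recovers the bit stream by detecting the presence or absence of the carrier in each bit period.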