WorldWideScience

Sample records for previous techniques based

  1. An automated patient recognition method based on an image-matching technique using previous chest radiographs in the picture archiving and communication system environment

    International Nuclear Information System (INIS)

    Morishita, Junji; Katsuragawa, Shigehiko; Kondo, Keisuke; Doi, Kunio

    2001-01-01

    An automated patient recognition method for correcting 'wrong' chest radiographs being stored in a picture archiving and communication system (PACS) environment has been developed. The method is based on an image-matching technique that uses previous chest radiographs. For identification of a 'wrong' patient, the correlation value was determined for a previous image of a patient and a new, current image of the presumed corresponding patient. The current image was shifted horizontally and vertically and rotated, so that we could determine the best match between the two images. The results indicated that the correlation values between the current and previous images for the same, 'correct' patients were generally greater than those for different, 'wrong' patients. Although the two histograms for the same patient and for different patients overlapped at correlation values greater than 0.80, most parts of the histograms were separated. The correlation value was compared with a threshold value that was determined based on an analysis of the histograms of correlation values obtained for the same patient and for different patients. If the current image is considered potentially to belong to a 'wrong' patient, then a warning sign with the probability for a 'wrong' patient is provided to alert radiology personnel. Our results indicate that at least half of the 'wrong' images in our database can be identified correctly with the method described in this study. The overall performance in terms of a receiver operating characteristic curve showed a high performance of the system. The results also indicate that some readings of 'wrong' images for a given patient in the PACS environment can be prevented by use of the method we developed. Therefore an automated warning system for patient recognition would be useful in correcting 'wrong' images being stored in the PACS environment
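
    The core of the method is a single decision rule: compute the maximum correlation between the previous and current radiograph over small shifts and rotations, then compare it against a threshold (around 0.80 in the histograms above). The sketch below is a minimal illustration of that rule in Python with NumPy/SciPy; the search ranges, threshold value and warning logic are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy import ndimage

        def correlation(a, b):
            """Pearson correlation coefficient between two equally sized images."""
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float((a * b).mean())

        def best_match(previous, current, shifts=range(-8, 9, 2), angles=(-4, -2, 0, 2, 4)):
            """Maximum correlation over horizontal/vertical shifts and small rotations."""
            best = -1.0
            for angle in angles:
                rotated = ndimage.rotate(current, angle, reshape=False, mode="nearest")
                for dy in shifts:
                    for dx in shifts:
                        moved = ndimage.shift(rotated, (dy, dx), mode="nearest")
                        best = max(best, correlation(previous, moved))
            return best

        def check_patient(previous, current, threshold=0.80):
            """Flag a potentially 'wrong' patient when the best correlation is low."""
            value = best_match(previous, current)
            return value, value < threshold  # True -> raise a warning to radiology personnel

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            prev_img = rng.random((64, 64))
            curr_img = ndimage.shift(prev_img, (2, -3), mode="nearest")  # same patient, slightly shifted
            print(check_patient(prev_img, curr_img))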

  2. Multispecies Coevolution Particle Swarm Optimization Based on Previous Search History

    Directory of Open Access Journals (Sweden)

    Danping Wang

    2017-01-01

    Full Text Available A hybrid coevolution particle swarm optimization algorithm with dynamic multispecies strategy based on K-means clustering and nonrevisit strategy based on Binary Space Partitioning fitness tree (called MCPSO-PSH is proposed. Previous search history memorized into the Binary Space Partitioning fitness tree can effectively restrain the individuals’ revisit phenomenon. The whole population is partitioned into several subspecies and cooperative coevolution is realized by an information communication mechanism between subspecies, which can enhance the global search ability of particles and avoid premature convergence to local optimum. To demonstrate the power of the method, comparisons between the proposed algorithm and state-of-the-art algorithms are grouped into two categories: 10 basic benchmark functions (10-dimensional and 30-dimensional, 10 CEC2005 benchmark functions (30-dimensional, and a real-world problem (multilevel image segmentation problems. Experimental results show that MCPSO-PSH displays a competitive performance compared to the other swarm-based or evolutionary algorithms in terms of solution accuracy and statistical tests.
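
    As a rough illustration of the multispecies idea (dynamic subspecies formed by K-means plus information sharing between them), the following Python sketch partitions a particle swarm with K-means each generation and lets every particle learn from its own subspecies' best as well as the global best. It is a simplified toy under assumed parameter values, and it omits the Binary Space Partitioning non-revisit tree described in the abstract.

        import numpy as np
        from sklearn.cluster import KMeans

        def sphere(x):                      # simple benchmark function to minimise
            return float(np.sum(x ** 2))

        def multispecies_pso(fn, dim=10, particles=40, species=4, iters=200, seed=1):
            rng = np.random.default_rng(seed)
            pos = rng.uniform(-5, 5, (particles, dim))
            vel = np.zeros_like(pos)
            pbest, pbest_val = pos.copy(), np.array([fn(p) for p in pos])
            gbest = pbest[pbest_val.argmin()].copy()

            for _ in range(iters):
                # dynamic multispecies strategy: regroup particles with K-means
                labels = KMeans(n_clusters=species, n_init=5, random_state=0).fit_predict(pos)
                for k in range(species):
                    members = np.where(labels == k)[0]
                    sbest = pbest[members[pbest_val[members].argmin()]]   # subspecies best
                    r1, r2, r3 = rng.random((3, len(members), dim))
                    vel[members] = (0.72 * vel[members]
                                    + 1.2 * r1 * (pbest[members] - pos[members])
                                    + 1.2 * r2 * (sbest - pos[members])
                                    + 0.6 * r3 * (gbest - pos[members]))
                    pos[members] += vel[members]
                vals = np.array([fn(p) for p in pos])
                improved = vals < pbest_val
                pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
                gbest = pbest[pbest_val.argmin()].copy()
            return gbest, pbest_val.min()

        print(multispecies_pso(sphere))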

  3. Insertion of central venous catheters for hemodialysis using angiographic techniques in patients with previous multiple catheterizations

    International Nuclear Information System (INIS)

    Kotsikoris, Ioannis; Zygomalas, Apollon; Papas, Theofanis; Maras, Dimitris; Pavlidis, Polyvios; Andrikopoulou, Maria; Tsanis, Antonis; Alivizatos, Vasileios; Bessias, Nikolaos

    2012-01-01

    Introduction: Central venous catheter placement is an effective alternative vascular access for dialysis in patients with chronic renal failure. The purpose of this study was to evaluate the insertion of central venous catheters for hemodialysis using angiographic techniques in patients with previous multiple catheterizations in terms of efficacy of the procedure and early complications. Materials and methods: Between 2008 and 2010, the vascular access team of our hospital placed 409 central venous catheters in patients with chronic renal failure. The procedure was performed using the Seldinger blind technique. In 18 (4.4%) cases it was impossible to advance the guidewire, and so the patients were transported to the angiography suite. Results: Using the angiographic technique, the guidewire was advanced in order to position the central venous catheter. The latter was inserted into the subclavian vein in 12 (66.6%) cases, into the internal jugular vein in 4 (22.2%) and into the femoral vein in 2 (11.1%) cases. There was only one complicated case with severe arrhythmia in 1 (5.5%) patient. Conclusion: Our results suggest that insertion of central venous catheters using angiographic techniques in hemodialysis patients with previous multiple catheterizations is a safe and effective procedure with few complications and high success rates

  4. Insertion of central venous catheters for hemodialysis using angiographic techniques in patients with previous multiple catheterizations

    Energy Technology Data Exchange (ETDEWEB)

    Kotsikoris, Ioannis, E-mail: gkotsikoris@gmail.com [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Zygomalas, Apollon, E-mail: azygomalas@upatras.gr [Department of General Surgery, University Hospital of Patras (Greece); Papas, Theofanis, E-mail: pfanis@otenet.gr [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Maras, Dimitris, E-mail: dimmaras@gmail.com [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Pavlidis, Polyvios, E-mail: polpavlidis@yahoo.gr [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Andrikopoulou, Maria, E-mail: madric@gmail.com [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Tsanis, Antonis, E-mail: atsanis@gmail.com [Department of Interventional Radiology, “Erythros Stauros” General Hospital (Greece); Alivizatos, Vasileios, E-mail: valiviz@hol.gr [Department of General Surgery and Artificial Nutrition Unit, “Agios Andreas” General Hospital of Patras (Greece); Bessias, Nikolaos, E-mail: bessias@otenet.gr [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece)

    2012-09-15

    Introduction: Central venous catheter placement is an effective alternative vascular access for dialysis in patients with chronic renal failure. The purpose of this study was to evaluate the insertion of central venous catheters for hemodialysis using angiographic techniques in patients with previous multiple catheterizations in terms of efficacy of the procedure and early complications. Materials and methods: Between 2008 and 2010, the vascular access team of our hospital placed 409 central venous catheters in patients with chronic renal failure. The procedure was performed using the Seldinger blind technique. In 18 (4.4%) cases it was impossible to advance the guidewire, and so the patients were transported to the angiography suite. Results: Using the angiographic technique, the guidewire was advanced in order to position the central venous catheter. The latter was inserted into the subclavian vein in 12 (66.6%) cases, into the internal jugular vein in 4 (22.2%) and into the femoral vein in 2 (11.1%) cases. There was only one complicated case with severe arrhythmia in 1 (5.5%) patient. Conclusion: Our results suggest that insertion of central venous catheters using angiographic techniques in hemodialysis patients with previous multiple catheterizations is a safe and effective procedure with few complications and high success rates.

  5. Technique for sparing previously irradiated critical normal structures in salvage proton craniospinal irradiation

    International Nuclear Information System (INIS)

    McDonald, Mark W; Wolanski, Mark R; Simmons, Joseph W; Buchsbaum, Jeffrey C

    2013-01-01

    Cranial reirradiation is clinically appropriate in some cases but cumulative radiation dose to critical normal structures remains a practical concern. The authors developed a simple technique in 3D conformal proton craniospinal irradiation (CSI) to block organs at risk (OAR) while minimizing underdosing of adjacent target brain tissue. Two clinical cases illustrate the use of proton therapy to provide salvage CSI when a previously irradiated OAR required sparing from additional radiation dose. The prior radiation plan was coregistered to the treatment planning CT to create a planning organ at risk volume (PRV) around the OAR. Right and left lateral cranial whole brain proton apertures were created with a small block over the PRV. Then right and left lateral “inverse apertures” were generated, creating an aperture opening in the shape of the area previously blocked and blocking the area previously open. The inverse aperture opening was made one millimeter smaller than the original block to minimize the risk of dose overlap. The inverse apertures were used to irradiate the target volume lateral to the PRV, selecting a proton beam range to abut the 50% isodose line against either lateral edge of the PRV. Together, the 4 cranial proton fields created a region of complete dose avoidance around the OAR. Comparative photon treatment plans were generated with opposed lateral X-ray fields with custom blocks and coplanar intensity modulated radiation therapy optimized to avoid the PRV. Cumulative dose volume histograms were evaluated. Treatment plans were developed and successfully implemented to provide sparing of previously irradiated critical normal structures while treating target brain lateral to these structures. The absence of dose overlapping during irradiation through the inverse apertures was confirmed by film. Compared to the lateral X-ray and IMRT treatment plans, the proton CSI technique improved coverage of target brain tissue while providing the least

  6. Determination of sulfate in thorium salts using gravimetric technique with previous thorium separation

    International Nuclear Information System (INIS)

    Silva, C.M. da; Pires, M.A.F.

    1994-01-01

Available as short communication only. A simple analytical method to analyze sulfates in thorium salts is presented. The method is based on separating the thorium as hydroxide. The gravimetric technique is then used to determine the sulfate in the filtrate as barium sulfate. Using this method, the separation of sulfate from thorium reaches a 99.9% yield with 0.1% precision. The method is applied to thorium salts, specifically thorium sulfate, carbonate and nitrate. (author). 5 refs, 2 tabs

  7. [A brief history of resuscitation - the influence of previous experience on modern techniques and methods].

    Science.gov (United States)

    Kucmin, Tomasz; Płowaś-Goral, Małgorzata; Nogalski, Adam

    2015-02-01

Cardiopulmonary resuscitation (CPR) is a relatively novel branch of medical science; however, the first descriptions of mouth-to-mouth ventilation are to be found in the Bible, and the literature is full of descriptions of different resuscitation methods - from flagellation and ventilation with bellows, through hanging victims upside down and compressing the chest to stimulate ventilation, to rectal fumigation with tobacco smoke. The modern history of CPR starts with Kouwenhoven et al., who in 1960 published a paper on heart massage through chest compressions. Shortly after that, in 1961, Peter Safar presented a paradigm promoting opening the airway, performing rescue breaths and chest compressions. The first CPR guidelines were published in 1966. Since that time the guidelines have been modified and improved numerous times by the two leading world expert organizations, the ERC (European Resuscitation Council) and the AHA (American Heart Association), and published in a new version every 5 years; at the time of writing, the 2010 guidelines apply. In this paper the authors attempt to present the history of the development of resuscitation techniques and methods and to assess the influence of earlier lifesaving methods on today's technologies, equipment and guidelines, which help save the lives of those women and men whose life is in danger due to sudden cardiac arrest. © 2015 MEDPRESS.

  8. Intralesional Osteophyte Regrowth Following Autologous Chondrocyte Implantation after Previous Treatment with Marrow Stimulation Technique.

    Science.gov (United States)

    Demange, Marco Kawamura; Minas, Tom; von Keudell, Arvind; Sodha, Sonal; Bryant, Tim; Gomoll, Andreas H

    2017-04-01

Objective: Bone marrow stimulation surgeries are frequent in the treatment of cartilage lesions. Autologous chondrocyte implantation (ACI) may be performed after failed microfracture surgery. Alterations to the subchondral bone, such as intralesional osteophytes, are commonly seen after previous microfracture and are removed during ACI; there have been no reports on potential recurrence. Our purpose was to evaluate the incidence of intralesional osteophyte development in 2 cohorts: patients with existing intralesional osteophytes and patients without intralesional osteophytes at the time of ACI. Study Design: We identified 87 patients (157 lesions) with intralesional osteophytes among a cohort of 497 ACI patients. Osteophyte regrowth was analyzed on magnetic resonance imaging and categorized as small or large (less or more than 50% of the cartilage thickness). Twenty patients (24 defects) without intralesional osteophytes at the time of ACI acted as controls. Results: Osteophyte regrowth was observed in 39.5% of lesions (small osteophytes in 34.4% and large osteophytes in 5.1%). In subgroup analyses, regrowth was observed in 45.8% of periosteal-covered defects and in 18.9% of collagen membrane-covered defects. Large osteophyte regrowth occurred in less than 5% in either group. Periosteal defects showed a significantly higher incidence of regrowth of small osteophytes. In the control group, intralesional osteophytes developed in 16.7% of the lesions. Conclusions: Even though intralesional osteophytes may regrow after removal during ACI, most of them are small. Small osteophyte regrowth occurs almost twice as often in periosteum-covered ACI. Large osteophytes occur in only 5% of patients. Intralesional osteophyte formation is not significantly different between the preexisting intralesional osteophyte and control groups.

  9. [Estimating child mortality using the previous child technique, with data from health centers and household surveys: methodological aspects].

    Science.gov (United States)

    Aguirre, A; Hill, A G

    1988-01-01

    2 trials of the previous child or preceding birth technique in Bamako, Mali, and Lima, Peru, gave very promising results for measurement of infant and early child mortality using data on survivorship of the 2 most recent births. In the Peruvian study, another technique was tested in which each woman was asked about her last 3 births. The preceding birth technique described by Brass and Macrae has rapidly been adopted as a simple means of estimating recent trends in early childhood mortality. The questions formulated and the analysis of results are direct when the mothers are visited at the time of birth or soon after. Several technical aspects of the method believed to introduce unforeseen biases have now been studied and found to be relatively unimportant. But the problems arising when the data come from a nonrepresentative fraction of the total fertile-aged population have not been resolved. The analysis based on data from 5 maternity centers including 1 hospital in Bamako, Mali, indicated some practical problems and the information obtained showed the kinds of subtle biases that can result from the effects of selection. The study in Lima tested 2 abbreviated methods for obtaining recent early childhood mortality estimates in countries with deficient vital registration. The basic idea was that a few simple questions added to household surveys on immunization or diarrheal disease control for example could produce improved child mortality estimates. The mortality estimates in Peru were based on 2 distinct sources of information in the questionnaire. All women were asked their total number of live born children and the number still alive at the time of the interview. The proportion of deaths was converted into a measure of child survival using a life table. Then each woman was asked for a brief history of the 3 most recent live births. Dates of birth and death were noted in month and year of occurrence. The interviews took only slightly longer than the basic survey

  10. SEM-based characterization techniques

    International Nuclear Information System (INIS)

    Russell, P.E.

    1986-01-01

The scanning electron microscope is now a common instrument in materials characterization laboratories. The basic role of the SEM as a topographic imaging system has steadily been expanding to include a variety of SEM-based analytical techniques. These techniques cover the range from basic semiconductor materials characterization to live-time device characterization of operating LSI or VLSI devices. This paper introduces many of the more commonly used techniques, describes the modifications or additions to a conventional SEM required to utilize the techniques, and gives examples of their use. First, the types of signals available from a sample being irradiated by an electron beam are reviewed. Then, where applicable, the types of spectroscopy or microscopy that have evolved to utilize the various signal types are described. This is followed by specific examples of the use of such techniques to solve problems related to semiconductor technology. Techniques emphasized include: x-ray fluorescence spectroscopy, electron beam induced current (EBIC), stroboscopic voltage analysis, cathodoluminescence and electron beam IC metrology. Current and future trends of some of these techniques, as related to the semiconductor industry, are discussed.

  11. Attribute and topology based change detection in a constellation of previously detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, Reginald N.

    2016-01-19

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
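
    A minimal sketch of the attribute-comparison step, assuming hypothetical attribute names and thresholds (the abstract does not specify them): each newly detected object is matched to the nearest previously detected object in the constellation database, and a change is reported when no previous object lies within an attribute-distance threshold.

        from dataclasses import dataclass
        from math import hypot

        @dataclass
        class DetectedObject:
            x: float            # location attributes
            y: float
            size: float
            orientation: float  # degrees

        def attribute_distance(a: DetectedObject, b: DetectedObject) -> float:
            """Weighted distance between object attributes (weights are illustrative)."""
            return (hypot(a.x - b.x, a.y - b.y)
                    + 0.5 * abs(a.size - b.size)
                    + 0.1 * abs(a.orientation - b.orientation))

        def detect_changes(constellation, new_scan, threshold=2.0):
            """Return new objects with no sufficiently similar counterpart in the database."""
            changes = []
            for obj in new_scan:
                nearest = min((attribute_distance(obj, prev) for prev in constellation),
                              default=float("inf"))
                if nearest > threshold:
                    changes.append(obj)
            return changes

        previous = [DetectedObject(0, 0, 1.0, 90), DetectedObject(5, 5, 2.0, 45)]
        latest = [DetectedObject(0.2, 0.1, 1.0, 92), DetectedObject(9, 1, 0.5, 10)]
        print(detect_changes(previous, latest))   # only the second object is reported as a change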

  12. Immediate breast reconstruction after skin- or nipple-sparing mastectomy for previously augmented patients: a personal technique.

    Science.gov (United States)

    Salgarello, Marzia; Rochira, Dario; Barone-Adesi, Liliana; Farallo, Eugenio

    2012-04-01

    Breast reconstruction for previously augmented patients differs from breast reconstruction for nonaugmented patients. Many surgeons regard conservation therapy as not feasible for these patients because of implant complications, whether radiotherapy-induced or not. Despite this, most authors agree that mastectomy with immediate breast reconstruction is the most suitable choice, ensuring both a good cosmetic result and a low complication rate. Implant retention or removal remains a controversial topic in addition to the best available surgical technique. This study reviewed the authors' experience with immediate breast reconstruction after skin-sparing mastectomy (SSM) and nipple-sparing mastectomy (NSM) with anatomically definitive implants. The retrospective records of 12 patients were examined (group A). These patients were among 254 patients who underwent SSM or NSM for breast carcinoma. The control group comprised 12 of the 254 patients submitted to SSM or NSM (group B) who best matched the 12 patients in the studied group. All of them underwent immediate breast reconstruction, with an anatomically definitive implant placed in a submuscular-subfascial pocket. The demographic, technical, and oncologic data of the two groups were compared as well as the aesthetic outcomes using the Breast Q score. The proportion of complications, the type of implant, the axillary lymph node procedure, and the histology were compared between the two groups using Fisher's exact test. Student's t test was used to compare the scores for the procedure-specific modules of the breast Q questionnaire in the two groups. A validated patient satisfaction score was obtained using the breast Q questionnaire after breast reconstruction. The demographic, technical, and oncologic characteristics were not significantly different between the two groups. The previously augmented patients reported a significantly higher level of satisfaction with their breast than the control patients. The scores

  13. Bases of technique of sprinting

    Directory of Open Access Journals (Sweden)

    Valeriy Druz

    2015-06-01

Full Text Available Purpose: to determine the biomechanical patterns of body movement that provide the highest sprinting speed. Material and Methods: analysis of the scientific and methodological literature on the problem, the anthropometric characteristics of the surveyed group of athletes, and analysis of high-speed footage of the world's leading runners. Results: the biomechanical basis of sprinting technique is the acceleration and movement of the athlete's general center of body mass along a parabolic curve in the start phase, taking into account its initial height in the low-start position. Its subsequent movement follows a cycloidal trajectory formed by the pendulum-like motion of the limbs, which creates lift and makes the flight phase of the running step longer than the support phase. Conclusions: the biomechanical regularities of sprinting technique obtained here make it possible to increase the efficiency of sprint training.

  14. IDENTIFICATION OF CANINE VISCERAL LEISHMANIASIS IN A PREVIOUSLY UNAFFECTED AREA BY CONVENTIONAL DIAGNOSTIC TECHNIQUES AND CELL-BLOCK FIXATION

    Directory of Open Access Journals (Sweden)

    Tuanne Rotti ABRANTES

    2016-01-01

Full Text Available After the report of a second case of canine visceral leishmaniasis (CVL) in São Bento da Lagoa, Itaipuaçu, in the municipality of Maricá, Rio de Janeiro State, an epidemiological survey was carried out through active search, totaling 145 dogs. Indirect immunofluorescence assay (IFA), enzyme-linked immunosorbent assay (ELISA), and a rapid chromatographic immunoassay based on the dual-path platform (DPP) were used to perform the serological examinations. The parasitological diagnosis of cutaneous fragments was performed by parasitological culture, histopathology, and immunohistochemistry. In the serological assessment, 21 dogs were seropositive by IFA, 17 by ELISA, and 11 by DPP, with sensitivity of 66.7%, 66.7% and 50%, and specificity of 87.2%, 90.2% and 94%, respectively, for each technique. Immunohistochemistry of bone marrow using the cell-block technique presented the best results, with six positive dogs found, three of which tested negative by the other parasitological techniques. Leishmania sp. was isolated by parasitological culture in three dogs. The detection of autochthonous Leishmania infantum in Itaipuaçu and the high prevalence of seropositive dogs confirm the circulation of this parasite in the study area and alert to the risk of expansion in the State of Rio de Janeiro.

  15. Cultivation-based multiplex phenotyping of human gut microbiota allows targeted recovery of previously uncultured bacteria

    DEFF Research Database (Denmark)

    Rettedal, Elizabeth; Gumpert, Heidi; Sommer, Morten

    2014-01-01

    The human gut microbiota is linked to a variety of human health issues and implicated in antibiotic resistance gene dissemination. Most of these associations rely on culture-independent methods, since it is commonly believed that gut microbiota cannot be easily or sufficiently cultured. Here, we...... microbiota. Based on the phenotypic mapping, we tailor antibiotic combinations to specifically select for previously uncultivated bacteria. Utilizing this method we cultivate and sequence the genomes of four isolates, one of which apparently belongs to the genus Oscillibacter; uncultivated Oscillibacter...

  16. In vivo dentate nucleus MRI relaxometry correlates with previous administration of Gadolinium-based contrast agents

    Energy Technology Data Exchange (ETDEWEB)

Tedeschi, Enrico; Canna, Antonietta; Cocozza, Sirio; Russo, Carmela; Angelini, Valentina; Brunetti, Arturo [University “Federico II”, Neuroradiology, Department of Advanced Biomedical Sciences, Naples (Italy); Palma, Giuseppe; Quarantelli, Mario [National Research Council, Institute of Biostructure and Bioimaging, Naples (Italy); Borrelli, Pasquale; Salvatore, Marco [IRCCS SDN, Naples (Italy); Lanzillo, Roberta; Postiglione, Emanuela; Morra, Vincenzo Brescia [University “Federico II”, Department of Neurosciences, Reproductive and Odontostomatological Sciences, Naples (Italy)

    2016-12-15

    To evaluate changes in T1 and T2* relaxometry of dentate nuclei (DN) with respect to the number of previous administrations of Gadolinium-based contrast agents (GBCA). In 74 relapsing-remitting multiple sclerosis (RR-MS) patients with variable disease duration (9.8±6.8 years) and severity (Expanded Disability Status Scale scores:3.1±0.9), the DN R1 (1/T1) and R2* (1/T2*) relaxation rates were measured using two unenhanced 3D Dual-Echo spoiled Gradient-Echo sequences with different flip angles. Correlations of the number of previous GBCA administrations with DN R1 and R2* relaxation rates were tested, including gender and age effect, in a multivariate regression analysis. The DN R1 (normalized by brainstem) significantly correlated with the number of GBCA administrations (p<0.001), maintaining the same significance even when including MS-related factors. Instead, the DN R2* values correlated only with age (p=0.003), and not with GBCA administrations (p=0.67). In a subgroup of 35 patients for whom the administered GBCA subtype was known, the effect of GBCA on DN R1 appeared mainly related to linear GBCA. In RR-MS patients, the number of previous GBCA administrations correlates with R1 relaxation rates of DN, while R2* values remain unaffected, suggesting that T1-shortening in these patients is related to the amount of Gadolinium given. (orig.)

  17. Vaccinia-based influenza vaccine overcomes previously induced immunodominance hierarchy for heterosubtypic protection.

    Science.gov (United States)

    Kwon, Ji-Sun; Yoon, Jungsoon; Kim, Yeon-Jung; Kang, Kyuho; Woo, Sunje; Jung, Dea-Im; Song, Man Ki; Kim, Eun-Ha; Kwon, Hyeok-Il; Choi, Young Ki; Kim, Jihye; Lee, Jeewon; Yoon, Yeup; Shin, Eui-Cheol; Youn, Jin-Won

    2014-08-01

    Growing concerns about unpredictable influenza pandemics require a broadly protective vaccine against diverse influenza strains. One of the promising approaches was a T cell-based vaccine, but the narrow breadth of T-cell immunity due to the immunodominance hierarchy established by previous influenza infection and efficacy against only mild challenge condition are important hurdles to overcome. To model T-cell immunodominance hierarchy in humans in an experimental setting, influenza-primed C57BL/6 mice were chosen and boosted with a mixture of vaccinia recombinants, individually expressing consensus sequences from avian, swine, and human isolates of influenza internal proteins. As determined by IFN-γ ELISPOT and polyfunctional cytokine secretion, the vaccinia recombinants of influenza expanded the breadth of T-cell responses to include subdominant and even minor epitopes. Vaccine groups were successfully protected against 100 LD50 challenges with PR/8/34 and highly pathogenic avian influenza H5N1, which contained the identical dominant NP366 epitope. Interestingly, in challenge with pandemic A/Cal/04/2009 containing mutations in the dominant epitope, only the group vaccinated with rVV-NP + PA showed improved protection. Taken together, a vaccinia-based influenza vaccine expressing conserved internal proteins improved the breadth of influenza-specific T-cell immunity and provided heterosubtypic protection against immunologically close as well as distant influenza strains. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Microprocessor based techniques at CESR

    International Nuclear Information System (INIS)

    Giannini, G.; Cornell Univ., Ithaca, NY

    1981-01-01

Microprocessor-based systems successfully used in connection with the High Energy Physics experimental program at the Cornell Electron Storage Ring are described. The multiprocessor calibration system for the CUSB calorimeter is analyzed in view of present and future applications. (orig.)

  19. The impact of previous knee injury on force plate and field-based measures of balance.

    Science.gov (United States)

    Baltich, Jennifer; Whittaker, Jackie; Von Tscharner, Vinzenz; Nettel-Aguirre, Alberto; Nigg, Benno M; Emery, Carolyn

    2015-10-01

Individuals with post-traumatic osteoarthritis demonstrate increased sway during quiet stance. The prospective association between balance and disease onset is unknown. Improved understanding of balance in the period between joint injury and disease onset could inform secondary prevention strategies to prevent or delay the disease. This study examines the association between youth sport-related knee injury and balance, 3-10 years post-injury. Participants included 50 individuals (ages 15-26 years) with a sport-related intra-articular knee injury sustained 3-10 years previously and 50 uninjured age-, sex- and sport-matched controls. Force-plate measures during single-limb stance (center-of-pressure 95% ellipse-area, path length, excursion, entropic half-life) and field-based balance scores (triple single-leg hop, star-excursion, unipedal dynamic balance) were collected. Descriptive statistics (mean within-pair difference; 95% confidence intervals) were used to compare groups. Linear regression (adjusted for injury history) was used to assess the relationship between ellipse-area and field-based scores. Injured participants on average demonstrated greater medio-lateral excursion [mean within-pair difference (95% confidence interval); 2.8 mm (1.0, 4.5)], more regular medio-lateral position [10 ms (2, 18)], and shorter triple single-leg hop distances [-30.9% (-8.1, -53.7)] than controls, while no between-group differences existed for the remaining outcomes. After taking into consideration injury history, triple single leg hop scores demonstrated a linear association with ellipse area (β=0.52, 95% confidence interval 0.01, 1.01). On average the injured participants adjusted their position less frequently and demonstrated a larger magnitude of movement during single-limb stance compared to controls. These findings support the evaluation of balance outcomes in the period between knee injury and post-traumatic osteoarthritis onset. Copyright © 2015 Elsevier Ltd. All rights

  20. Bases en technique du vide

    CERN Document Server

    Rommel, Guy

    2017-01-01

This second edition, 20 years after the first, should continue to help technicians build their vacuum systems. Vacuum technology is now used in many fields that differ greatly from one another, and with very reliable equipment. Yet it is often given little study, and it is moreover a discipline in which know-how is essential. Unfortunately, its transmission by experienced engineers and technicians no longer takes place, or happens too quickly. Vacuum technology draws on physics, chemistry, mechanics, metallurgy, industrial drawing, electronics, thermal engineering, and so on. The discipline therefore requires mastering techniques from very diverse fields, and that is no easy task. Each installation is a particular case in itself, with its own needs, its own way of treating materials and of using equipment. Vacuum systems are sometimes copied from one laboratory to another and the...

  1. Late preterm birth and previous cesarean section: a population-based cohort study.

    Science.gov (United States)

Yasseen III, Abdool S; Bassil, Kate; Sprague, Ann; Urquia, Marcelo; Maguire, Jonathon L

    2018-02-21

    Late preterm birth (LPB) is increasingly common and associated with higher morbidity and mortality than term birth. Yet, little is known about the influence of previous cesarean section (PCS) and the occurrence of LPB in subsequent pregnancies. We aim to evaluate this association along with the potential mediation by cesarean sections in the current pregnancy. We use population-based birth registry data (2005-2012) to establish a cohort of live born singleton infants born between 34 and 41 gestational weeks to multiparous mothers. PCS was the primary exposure, LPB (34-36 weeks) was the primary outcome, and an unplanned or emergency cesarean section in the current pregnancy was the potential mediator. Associations were quantified using propensity weighted multivariable Poisson regression, and mediating associations were explored using the Baron-Kenny approach. The cohort included 481,531 births, 21,893 (4.5%) were LPB, and 119,983 (24.9%) were predated by at least one PCS. Among mothers with at least one PCS, 6307 (5.26%) were LPB. There was increased risk of LPB among women with at least one PCS (adjusted Relative Risk (aRR): 1.20 (95%CI [1.16, 1.23]). Unplanned or emergency cesarean section in the current pregnancy was identified as a strong mediator to this relationship (mediation ratio = 97%). PCS was associated with higher risk of LPB in subsequent pregnancies. This may be due to an increased risk of subsequent unplanned or emergency preterm cesarean sections. Efforts to minimize index cesarean sections may reduce the risk of LPB in subsequent pregnancies.

  2. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit an amount of required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman Coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman Coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all overhead incurred). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures.
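
    The following Python sketch is not the authors' hardware scheme, but it shows the Huffman step the technique builds on: byte-level patterns of a code image are Huffman-coded, and the resulting compression ratio (compressed size / original size) and decoding-table size can be measured. The pattern width (one byte) and the sample input are illustrative assumptions.

        import heapq
        from collections import Counter

        def huffman_code(symbols):
            """Build a Huffman code (symbol -> bit string) from an iterable of symbols."""
            freq = Counter(symbols)
            heap = [[weight, i, {sym: ""}] for i, (sym, weight) in enumerate(freq.items())]
            heapq.heapify(heap)
            if len(heap) == 1:                       # degenerate case: single distinct symbol
                return {next(iter(freq)): "0"}
            counter = len(heap)
            while len(heap) > 1:
                w1, _, c1 = heapq.heappop(heap)
                w2, _, c2 = heapq.heappop(heap)
                merged = {s: "0" + code for s, code in c1.items()}
                merged.update({s: "1" + code for s, code in c2.items()})
                heapq.heappush(heap, [w1 + w2, counter, merged])
                counter += 1
            return heap[0][2]

        def compression_ratio(data: bytes) -> float:
            """Compressed bits / original bits, ignoring the decoding-table overhead."""
            table = huffman_code(data)
            compressed_bits = sum(len(table[b]) for b in data)
            return compressed_bits / (8 * len(data))

        sample = bytes([0x12, 0x34, 0x12, 0x12, 0x56, 0x34, 0x12, 0x78] * 32)  # toy "instruction" stream
        table = huffman_code(sample)
        print("decoding table entries:", len(table))
        print("compression ratio: %.2f" % compression_ratio(sample))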

  3. Efficacy of peg-interferon based treatment in patients with hepatitis C refractory to previous conventional interferon-based treatment

    International Nuclear Information System (INIS)

    Shaikh, S.; Devrajani, B.R.; Kalhoro, M.

    2012-01-01

Objective: To determine the efficacy of peg-interferon-based therapy in patients refractory to previous conventional interferon-based treatment and the factors predicting sustained viral response (SVR). Study Design: Analytical study. Place and Duration of Study: Medical Unit IV, Liaquat University Hospital, Jamshoro, from July 2009 to June 2011. Methodology: This study included consecutive hepatitis C patients who had previously been treated with conventional interferon-based treatment for 6 months but were either non-responders, relapsed or had virologic breakthrough, and who had stage ≥ 2 fibrosis on liver biopsy. All eligible patients were given peg-interferon at a dosage of 180 μg weekly with ribavirin thrice a day for 6 months. Sustained viral response (SVR) was defined as the absence of HCV RNA at twenty-four weeks after treatment. All data were processed with SPSS version 16. Results: Out of 450 patients enrolled in the study, 192 were excluded on the basis of minimal fibrosis (stage 0 and 1). Two hundred and fifty-eight patients fulfilled the inclusion criteria and 247 completed the course of peg-interferon treatment. One hundred and sixty-one (62.4%) were males and 97 (37.6%) were females. The mean age was 39.9 ± 6.1 years, haemoglobin was 11.49 ± 2.45 g/dl, platelet count was 127.2 ± 50.6 × 10³/mm³, and ALT was 99 ± 65 IU/L. SVR was achieved in 84 (32.6%). A strong association was found between SVR and the pattern of response (p = 0.001), degree of fibrosis and early viral response (p = 0.001). Conclusion: Peg-interferon-based treatment is an effective and safe treatment option for patients refractory to conventional interferon-based treatment. (author)

  4. Analysis of Product Buying Decision on Lazada E-commerce based on Previous Buyers’ Comments

    Directory of Open Access Journals (Sweden)

    Neil Aldrin

    2017-06-01

Full Text Available The aims of the present research are: 1) to establish whether product buying decisions occur, 2) to understand how product buying decisions occur among Lazada e-commerce customers, and 3) to examine how previous buyers' comments can increase product buying decisions on Lazada e-commerce. The research uses a qualitative method, reviewing earlier studies and building assumptions and discussion from them so that further analysis can widen the range of ideas and opinions. The results show that a product with many ratings and reviews encourages other buyers to purchase it. The conclusion is that a buying decision arises from several stages preceding the decision itself: recognizing and searching for problems, identifying needs, collecting information, evaluating alternatives, and post-purchase evaluation. Within these stages, buying decisions on Lazada e-commerce are supported by price, promotion, service, and brand.

  5. Composite Techniques Based Color Image Compression

    Directory of Open Access Journals (Sweden)

    Zainab Ibrahim Abood

    2017-03-01

Full Text Available Compression of color images is now necessary for transmission and storage in databases, since color gives a pleasing and natural appearance to any object. Three composite techniques for color image compression are therefore implemented to achieve images with high compression, no loss with respect to the original image, better performance and good image quality. These techniques are the composite stationary wavelet technique (S), the composite wavelet technique (W) and the composite multi-wavelet technique (M). For the high-energy sub-band of the 3rd level of each composite transform in each composite technique, the compression parameters are calculated. The best composite transform among the 27 types is the three-level multi-wavelet transform (MMM) in the M technique, which has the highest values of energy (En) and compression ratio (CR) and the lowest values of bits per pixel (bpp), time (T) and rate distortion R(D). Also, the values of the compression parameters of the color image are nearly the same as the average values of the compression parameters of the three bands of the same image.
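
    As a rough sketch of how such compression parameters can be computed for a single wavelet variant (the composite stationary- and multi-wavelet transforms of the paper are not reproduced here), the Python snippet below performs a 3-level 2-D wavelet decomposition with PyWavelets, measures the energy concentrated in the approximation sub-band, and estimates CR and bpp after simple coefficient thresholding. The wavelet name, threshold and test image are assumptions for illustration.

        import numpy as np
        import pywt

        def wavelet_compression_parameters(image, wavelet="haar", level=3, threshold=0.05):
            coeffs = pywt.wavedec2(image, wavelet, level=level)          # [cA, (cH, cV, cD), ...]
            flat = np.concatenate([coeffs[0].ravel()] +
                                  [band.ravel() for detail in coeffs[1:] for band in detail])
            energy_total = float(np.sum(flat ** 2))
            energy_approx = float(np.sum(coeffs[0] ** 2))
            kept = np.abs(flat) > threshold * np.abs(flat).max()         # simple hard threshold
            cr = image.size / max(int(kept.sum()), 1)                    # original samples / kept coefficients
            bpp = 8.0 * kept.sum() / image.size                          # assuming 8 bits per kept coefficient
            return {"energy_fraction": energy_approx / energy_total, "CR": cr, "bpp": bpp}

        img = np.random.default_rng(0).random((256, 256))
        print(wavelet_compression_parameters(img))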

  6. A New Zealand based cohort study of anaesthetic trainees' career outcomes compared with previously expressed intentions.

    Science.gov (United States)

    Moran, E M L; French, R A; Kennedy, R R

    2011-09-01

    Predicting workforce requirements is a difficult but necessary part of health resource planning. A 'snapshot' workforce survey undertaken in 2002 examined issues that New Zealand anaesthesia trainees expected would influence their choice of future workplace. We have restudied the same cohort to see if that workforce survey was a good predictor of outcome. Seventy (51%) of 138 surveys were completed in 2009 compared with 100 (80%) of 138 in the 2002 survey. Eighty percent of the 2002 respondents planned consultant positions in New Zealand. We found 64% of respondents were working in New Zealand (P New Zealand based respondents but only 40% of those living outside New Zealand agreed or strongly agreed with this statement (P New Zealand but was important for only 2% of those resident in New Zealand (P New Zealand were predominantly between NZ$150,000 and $200,000 while those overseas received between NZ$300,000 and $400,000. Of those that are resident in New Zealand, 84% had studied in a New Zealand medical school compared with 52% of those currently working overseas (P < 0.01). Our study shows that stated career intentions in a group do not predict the actual group outcomes. We suggest that 'snapshot' studies examining workforce intentions are of little value for workforce planning. However we believe an ongoing program matching career aspirations against career outcomes would be a useful tool in workforce planning.

  7. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

A method that incorporates an edge detection technique, Markov Random Field (MRF) modeling, watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation result is obtained based on the K-means clustering technique and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains regions of different intensity. The gradient values are calculated and the watershed technique is applied. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about the likely region segmentation for the next step (MRF), which gives an image that carries all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of the neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that operate on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
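
    A compressed sketch of the same pipeline idea (initial K-means segmentation, gradient computation, watershed refinement) using scikit-image and scikit-learn is shown below; it omits the MRF modeling and the DIS-based merge step, and the cluster count and confidence margin are illustrative assumptions.

        import numpy as np
        from sklearn.cluster import KMeans
        from skimage import data, filters
        from skimage.segmentation import watershed

        image = data.coins().astype(float) / 255.0          # example grayscale image

        # 1) Initial segmentation: K-means clustering on pixel intensity.
        k = 3
        kmeans = KMeans(n_clusters=k, n_init=5, random_state=0).fit(image.reshape(-1, 1))
        labels = kmeans.labels_.reshape(image.shape)
        centers = kmeans.cluster_centers_.ravel()

        # 2) Gradient map (edge strength), playing the role of the gradient/DIS image.
        gradient = filters.sobel(image)

        # 3) Keep only confident pixels (close to their cluster centre) as watershed markers;
        #    ambiguous pixels are left unlabeled (0) and filled in by the watershed transform.
        confidence = np.abs(image - centers[labels])
        markers = np.where(confidence < 0.05, labels + 1, 0)

        segmentation = watershed(gradient, markers)
        print("regions found:", len(np.unique(segmentation)))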

  8. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR Fourier-transform Infrared (SRFTIR) and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized and have paved the way for new techniques such as microprobe XRF and XAS, FTIR microscopy and HAXPS. The talk will cover mainly two techniques illustrating their capabilities in analytical research, namely XRF and XAS. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (protons or alpha particles), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the area of microprobe XRF imaging and trace-level compositional characterisation of any sample. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third-generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10⁻¹⁷ g to 10⁻¹⁴ g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron

  9. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

A tag cloud is one of the navigation aids for exploring documents; tag clouds also link documents through user-defined terms. We explore various graph-based techniques to improve tag cloud generation. Moreover, we introduce relevance measures based on underlying data such as ratings...... or citation counts for improved measurement of the relevance of tag clouds. We show that, on the given data sets, our approach outperforms the state-of-the-art baseline methods with respect to such relevance by 41% on the Movielens dataset and by 11% on the Bibsonomy data set....

  10. Artificial Intelligence based technique for BTS placement

    Science.gov (United States)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

The increase in base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique takes neighbour and regulatory considerations into account objectively while determining the cell site. Its application leads to a quantitatively unbiased, evaluated decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm; the results obtained show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a GA with a neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.
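
    To make the GA-with-neighbourhood-constraint idea concrete, the Python sketch below evolves candidate sets of BTS sites on a small grid, rewarding demand coverage and penalising sites that violate a minimum-separation (neighbour) constraint. The grid size, coverage radius, penalty weight and GA settings are illustrative assumptions, not the parameters used in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        GRID_W, GRID_H, N_BTS, RADIUS, MIN_SEP = 30, 20, 4, 6.0, 5.0   # territory in arbitrary cells
        demand = rng.random((GRID_H, GRID_W))                          # demand density per cell
        yy, xx = np.mgrid[0:GRID_H, 0:GRID_W]

        def fitness(sites):
            """Covered demand minus a penalty for neighbour-constraint violations."""
            covered = np.zeros((GRID_H, GRID_W), dtype=bool)
            for x, y in sites:
                covered |= (xx - x) ** 2 + (yy - y) ** 2 <= RADIUS ** 2
            penalty = sum(np.hypot(*(a - b)) < MIN_SEP
                          for i, a in enumerate(sites) for b in sites[i + 1:])
            return demand[covered].sum() - 10.0 * penalty

        def random_solution():
            return rng.uniform((0, 0), (GRID_W - 1, GRID_H - 1), (N_BTS, 2))

        def evolve(pop_size=40, generations=100, mutation=1.5):
            population = [random_solution() for _ in range(pop_size)]
            for _ in range(generations):
                scores = np.array([fitness(s) for s in population])
                parents = [population[i] for i in np.argsort(scores)[-pop_size // 2:]]
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = rng.choice(len(parents), 2, replace=False)
                    mask = rng.random((N_BTS, 1)) < 0.5                  # uniform crossover per site
                    child = np.where(mask, parents[a], parents[b])
                    child += rng.normal(0, mutation, child.shape)        # Gaussian mutation
                    child[:, 0] = child[:, 0].clip(0, GRID_W - 1)
                    child[:, 1] = child[:, 1].clip(0, GRID_H - 1)
                    children.append(child)
                population = parents + children
            best = max(population, key=fitness)
            return best, fitness(best)

        sites, score = evolve()
        print("best BTS sites:\n", np.round(sites, 1), "\nfitness:", round(score, 2))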

  11. Artificial Intelligence based technique for BTS placement

    International Nuclear Information System (INIS)

Alenoghena, C O; Emagbetere, J O; Aibinu, A M (Department of Telecommunications Engineering, Federal University of Technology, Minna (Nigeria))

    2013-01-01

The increase in base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique takes neighbour and regulatory considerations into account objectively while determining the cell site. Its application leads to a quantitatively unbiased, evaluated decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm; the results obtained show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a GA with a neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out

  12. Estimating the effect of current, previous and never use of drugs in studies based on prescription registries

    DEFF Research Database (Denmark)

    Nielsen, Lars Hougaard; Løkkegaard, Ellen; Andreasen, Anne Helms

    2009-01-01

PURPOSE: Many studies which investigate the effect of drugs categorize the exposure variable into never, current, and previous use of the study drug. When prescription registries are used to make this categorization, the exposure variable possibly gets misclassified since the registries do not carry any information on the time of discontinuation of treatment. In this study, we investigated the amount of misclassification of exposure (never, current, previous use) to hormone therapy (HT) when the exposure variable was based on prescription data. Furthermore, we evaluated the significance of this misclassification for analysing the risk of breast cancer. MATERIALS AND METHODS: Prescription data were obtained from the Danish Registry of Medicinal Products Statistics and we applied various methods to approximate treatment episodes. We analysed the duration of HT episodes to study the ability to identify......
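
    The categorization problem the abstract describes can be illustrated with a small Python helper that classifies a subject as a never, current or previous user at a given index date, based only on prescription redemption dates and an assumed days-supply per prescription (the registry itself carries no discontinuation date, which is exactly the source of misclassification discussed above). The 90-day supply and 30-day grace period are illustrative assumptions.

        from datetime import date, timedelta

        def classify_exposure(prescription_dates, index_date, days_supply=90, grace_days=30):
            """Classify exposure at index_date as 'never', 'current' or 'previous' use.

            Each redeemed prescription is assumed to cover `days_supply` days; an episode
            is considered ongoing if it ends within `grace_days` of the index date.
            """
            past = [d for d in prescription_dates if d <= index_date]
            if not past:
                return "never"
            last_covered = max(past) + timedelta(days=days_supply + grace_days)
            return "current" if index_date <= last_covered else "previous"

        # Example: HT prescriptions redeemed in 2007-2008, evaluated at two index dates.
        prescriptions = [date(2007, 11, 1), date(2008, 2, 1)]
        print(classify_exposure(prescriptions, date(2008, 4, 1)))   # 'current'
        print(classify_exposure(prescriptions, date(2009, 4, 1)))   # 'previous'
        print(classify_exposure([], date(2009, 4, 1)))              # 'never'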

  13. New continuous air pumping technique to improve clinical outcomes of descemet-stripping automated endothelial keratoplasty in asian patients with previous ahmed glaucoma valve implantation.

    Directory of Open Access Journals (Sweden)

    Chang-Min Liang

Full Text Available BACKGROUND: To evaluate the outcomes of Descemet-stripping automated endothelial keratoplasty (DSAEK) with the use of a continuous air pumping technique in Asian eyes with previous Ahmed glaucoma valve implantation. METHODS: The DSAEK procedure was modified in that complete air retention of the anterior chamber was maintained for 10 min using continuous air pumping at 30 mm Hg. The primary outcome measurement was graft survival; postoperative clinical features including rate of graft detachment, endothelial cell count, intraocular pressure (IOP), surgical time and cup/disc ratio were also recorded. RESULTS: A total of 13 eyes of 13 patients which underwent modified DSAEK and 6 eyes of 6 patients which underwent conventional DSAEK were included. There was a significant difference in graft survival curves between the two groups (P = 0.029); the 1-year graft survival rates were estimated as 100% and 66.7% for patients with modified DSAEK and those with traditional DSAEK, respectively. The rates of graft detachment were 0% and 33.3% for the modified DSAEK and conventional DSAEK groups, respectively (P = 0.088). A significantly shorter surgical time for air tamponade was noted in the modified DSAEK group compared to the conventional DSAEK group [median (IQR): 10.0 (10.0, 10.0) min vs. 24.5 (22.0, 27.0) min; P<0.001]. Postoperatively, patients in the modified DSAEK group had significantly lower IOP as compared to the conventional DSAEK group [12.0 (11.0, 15.0) mm Hg vs. 16.0 (15.0, 18.0) mm Hg; P = 0.047]. Modified DSAEK patients had higher endothelial cell counts as compared to conventional DSAEK patients [2148.0 (1964.0, 2218.0) vs. 1529.0 (713.0, 2014.0)], but the difference did not reach statistical significance (P = 0.072). CONCLUSIONS: The new continuous air pumping technique in DSAEK can be performed safely and effectively in patients with prior GDD placement who have corneal failure.

  14. DCT-based cyber defense techniques

    Science.gov (United States)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content, in order to attack the end user. Most of the attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel, real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain, and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
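
    The smart-threshold idea of zeroing suspicious low-magnitude DCT coefficients can be sketched in a few lines of Python; the snippet below applies a hard threshold to the 8x8 block DCT of a grayscale image and reconstructs it, which is the kind of DCT-domain sanitization the article describes. The block size and threshold value are illustrative assumptions, not the authors' tuned parameters.

        import numpy as np
        from scipy.fft import dctn, idctn

        def dct_threshold_defense(image, block=8, threshold=4.0):
            """Zero out small-magnitude DCT coefficients block-by-block (DC term kept)."""
            h, w = (np.array(image.shape) // block) * block
            cleaned = image[:h, :w].astype(float).copy()
            for y in range(0, h, block):
                for x in range(0, w, block):
                    coeff = dctn(cleaned[y:y + block, x:x + block], norm="ortho")
                    mask = np.abs(coeff) >= threshold
                    mask[0, 0] = True                      # always keep the DC coefficient
                    cleaned[y:y + block, x:x + block] = idctn(coeff * mask, norm="ortho")
            return cleaned

        def psnr(original, processed):
            mse = np.mean((original.astype(float) - processed) ** 2)
            return 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")

        img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
        out = dct_threshold_defense(img)
        print("PSNR after sanitization: %.2f dB" % psnr(img, out))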

  15. NMR-based phytochemical analysis of Vitis vinifera cv Falanghina leaves. Characterization of a previously undescribed biflavonoid with antiproliferative activity.

    Science.gov (United States)

    Tartaglione, Luciana; Gambuti, Angelita; De Cicco, Paola; Ercolano, Giuseppe; Ianaro, Angela; Taglialatela-Scafati, Orazio; Moio, Luigi; Forino, Martino

    2018-03-01

Vitis vinifera cv Falanghina is an ancient grape variety of Southern Italy. A thorough phytochemical analysis of Falanghina leaves was conducted to investigate their specialised metabolite content. Along with already known molecules, such as caftaric acid, quercetin-3-O-β-d-glucopyranoside, quercetin-3-O-β-d-glucuronide, kaempferol-3-O-β-d-glucopyranoside and kaempferol-3-O-β-d-glucuronide, a previously undescribed biflavonoid was identified. For this last compound, a moderate bioactivity against metastatic melanoma cell proliferation was discovered. This finding may be of interest to researchers studying human melanoma. The high content of antioxidant glycosylated flavonoids supports the exploitation of grape vine leaves as an inexpensive source of natural products for the food industry and for both pharmaceutical and nutraceutical companies. Additionally, this study offers important insights into the plant's physiology, thus prompting possible technological research on genetic selection based on vine adaptation to specific pedo-climatic environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Nasal base narrowing: the combined alar base excision technique.

    Science.gov (United States)

    Foda, Hossam M T

    2007-01-01

    To evaluate the role of the combined alar base excision technique in narrowing the nasal base and correcting excessive alar flare. The study included 60 cases presenting with a wide nasal base and excessive alar flaring. The surgical procedure combined an external alar wedge resection with an internal vestibular floor excision. All cases were followed up for a mean of 32 (range, 12-144) months. Nasal tip modification and correction of any preexisting caudal septal deformities were always completed before the nasal base narrowing. The mean width of the external alar wedge excised was 7.2 (range, 4-11) mm, whereas the mean width of the sill excision was 3.1 (range, 2-7) mm. Completing the internal excision first resulted in a more conservative external resection, thus avoiding any blunting of the alar-facial crease. No cases of postoperative bleeding, infection, or keloid formation were encountered, and the external alar wedge excision healed with an inconspicuous scar that was well hidden in the depth of the alar-facial crease. Finally, the risk of notching of the alar rim, which can occur at the junction of the external and internal excisions, was significantly reduced by adopting a 2-layered closure of the vestibular floor (P = .01). The combined alar base excision resulted in effective narrowing of the nasal base with elimination of excessive alar flare. Commonly feared complications, such as blunting of the alar-facial crease or notching of the alar rim, were avoided by using simple modifications in the technique of excision and closure.

  17. Effect of six different cooking techniques in the nutritional composition of two fish species previously selected as optimal for renal patient's diet.

    Science.gov (United States)

    Castro-González, Isabel; Maafs-Rodríguez, Ana Gabriela; Pérez-Gil Romo, Fernando

    2015-07-01

    Benefits of fish consumption are widely known, but there is little information about the nutrient values of raw and cooked fish. The aim was to study the impact that six cooking techniques have on the nutritional composition of two fish species with a low content of adverse nutrients in the renal diet. Raw samples and samples that were steamed, foiled with aluminum, foiled with banana leaf, gas oven-baked, microwave oven-cooked or lightly fried were chemically analyzed to determine their protein, phosphorus and lipid content. Crevalle jack: all methods increased lipid and protein content, and fatty acids (FA) varied with all cooking methods. Phosphorus decreased in the steamed and microwave oven-cooked samples. Red drum: foiling and light frying increased lipid content compared to the raw sample. FA concentration changed with all cooking methods. Protein increased with every technique and phosphorus decreased in the steamed and gas oven-baked samples. Renal patients should preferably consume crevalle jack steamed or microwave oven-cooked and red drum steamed or gas oven-baked.

  18. Sunburn and sun-protective behaviors among adults with and without previous nonmelanoma skin cancer: a population-based study

    Science.gov (United States)

    Fischer, Alexander H.; Wang, Timothy S.; Yenokyan, Gayane; Kang, Sewon; Chien, Anna L.

    2016-01-01

    Background Individuals with previous nonmelanoma skin cancer (NMSC) are at increased risk for subsequent skin cancer, and should therefore limit UV exposure. Objective To determine whether individuals with previous NMSC engage in better sun protection than those with no skin cancer history. Methods We pooled self-reported data (2005 and 2010 National Health Interview Surveys) from US non-Hispanic white adults (758 with and 34,161 without previous NMSC). We calculated adjusted prevalence odds ratios (aPOR) and 95% confidence intervals (95% CI), taking into account the complex survey design. Results Individuals with previous NMSC versus no history of NMSC had higher rates of frequent use of shade (44.3% versus 27.0%; aPOR=1.41; 1.16–1.71), long sleeves (20.5% versus 7.7%; aPOR=1.55; 1.21–1.98), a wide-brimmed hat (26.1% versus 10.5%; aPOR=1.52; 1.24–1.87), and sunscreen (53.7% versus 33.1%; aPOR=2.11; 95% CI=1.73–2.59), but did not have significantly lower odds of recent sunburn (29.7% versus 40.7%; aPOR=0.95; 0.77–1.17). Among subjects with previous NMSC, recent sunburn was inversely associated with age, sun avoidance, and shade but not sunscreen. Limitations Self-reported cross-sectional data and unavailable information quantifying regular sun exposure. Conclusion Physicians should emphasize sunburn prevention when counseling patients with previous NMSC, especially younger adults, focusing on shade and sun avoidance over sunscreen. PMID:27198078

  19. New calibration technique for KCD-based megavoltage imaging

    Science.gov (United States)

    Samant, Sanjiv S.; Zheng, Wei; DiBianca, Frank A.; Zeman, Herbert D.; Laughter, Joseph S.

    1999-05-01

    In megavoltage imaging, current commercial electronic portal imaging devices (EPIDs), despite having the advantage of immediate digital imaging over film, suffer from poor image contrast and spatial resolution. The feasibility of using a kinestatic charge detector (KCD) as an EPID to provide superior image contrast and spatial resolution for portal imaging has already been demonstrated in a previous paper. The KCD system had the additional advantage of requiring an extremely low dose per acquired image, allowing superior images to be reconstructed from a single linac pulse per image pixel. The KCD-based images utilized a dose two orders of magnitude less than that for EPIDs and film. Compared with the current commercial EPIDs and film, the prototype KCD system exhibited promising image qualities, despite being handicapped by the use of a relatively simple image calibration technique and the performance limits of medical linacs on the maximum linac pulse frequency and energy flux per pulse delivered. This image calibration technique fixed relative image pixel values based on a linear interpolation of extrema provided by an air-water calibration, and accounted only for channel-to-channel variations. The counterpart of this for area detectors is the standard flat-fielding method. A comprehensive calibration protocol has been developed. The new technique additionally corrects for geometric distortions due to variations in the scan velocity, and timing artifacts caused by mis-synchronization between the linear accelerator and the data acquisition system (DAS). The role of variations in energy flux (2 - 3%) on imaging is demonstrated to be not significant for the images considered. The methodology is presented, and the results are discussed for simulated images. It also allows for significant improvements in the signal-to-noise ratio (SNR) by increasing the dose using multiple images without having to increase the linac pulse frequency or energy flux per pulse. The
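
    The simple calibration described here fixes pixel values by linear interpolation between air and water extrema, the channel-wise analogue of flat fielding. A minimal sketch of such a two-point calibration (the array names are hypothetical) might look like this:

        import numpy as np

        def two_point_calibration(raw, air, water):
            """Map raw detector readings channel-by-channel onto a [0, 1] scale where
            0 corresponds to the air calibration scan and 1 to the water calibration scan.
            `air` and `water` are per-channel calibration readings (hypothetical names)."""
            raw, air, water = (np.asarray(a, dtype=float) for a in (raw, air, water))
            return (raw - air) / (water - air)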

  20. New population-based exome data question the pathogenicity of some genetic variants previously associated with Marfan syndrome

    DEFF Research Database (Denmark)

    Yang, Ren-Qiang; Jabbari, Javad; Cheng, Xiao-Shu

    2014-01-01

    BACKGROUND: Marfan syndrome (MFS) is a rare autosomal dominantly inherited connective tissue disorder with an estimated prevalence of 1:5,000. More than 1000 variants have been previously reported to be associated with MFS. However, the disease-causing effect of these variants may be questionable...

  1. Synchrotron radiation based analytical techniques (XAS and XRF)

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2014-01-01

    A brief description of the principles of X-ray absorption spectroscopy (XAS) and X-ray fluorescence (XRF) techniques is given in this article with emphasis on the advantages of using synchrotron radiation-based instrumentation/beamline. XAS technique is described in more detail to emphasize the strength of the technique as a local structural probe. (author)

  2. Five criteria for using a surrogate endpoint to predict treatment effect based on data from multiple previous trials.

    Science.gov (United States)

    Baker, Stuart G

    2018-02-20

    A surrogate endpoint in a randomized clinical trial is an endpoint that occurs after randomization and before the true, clinically meaningful, endpoint that yields conclusions about the effect of treatment on the true endpoint. A surrogate endpoint can accelerate the evaluation of new treatments but at the risk of misleading conclusions. Therefore, criteria are needed for deciding whether to use a surrogate endpoint in a new trial. For the meta-analytic setting of multiple previous trials, each with the same pair of surrogate and true endpoints, this article formulates 5 criteria for using a surrogate endpoint in a new trial to predict the effect of treatment on the true endpoint in the new trial. The first 2 criteria, which are easily computed from a zero-intercept linear random effects model, involve statistical considerations: an acceptable sample size multiplier and an acceptable prediction separation score. The remaining 3 criteria involve clinical and biological considerations: similarity of biological mechanisms of treatments between the new trial and previous trials, similarity of secondary treatments following the surrogate endpoint between the new trial and previous trials, and a negligible risk of harmful side effects arising after the observation of the surrogate endpoint in the new trial. These 5 criteria constitute an appropriately high bar for using a surrogate endpoint to make a definitive treatment recommendation. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
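
    The core of the meta-analytic approach is a zero-intercept regression of true-endpoint treatment effects on surrogate-endpoint treatment effects across the previous trials, which is then used to predict the true effect in the new trial. The sketch below is only an illustrative simplification (ordinary least squares through the origin, ignoring the random-effects structure and not computing Baker's actual sample size multiplier or prediction separation score); all names are assumptions.

        import numpy as np

        def predict_true_effect(surrogate_effects, true_effects, new_surrogate_effect):
            """Fit true_effect ~ slope * surrogate_effect (no intercept) over previous trials
            and predict the true-endpoint effect in a new trial from its surrogate effect."""
            x = np.asarray(surrogate_effects, dtype=float)
            y = np.asarray(true_effects, dtype=float)
            slope = np.sum(x * y) / np.sum(x * x)        # zero-intercept OLS estimate
            residual_var = np.mean((y - slope * x) ** 2) # crude spread around the fitted line
            return slope * new_surrogate_effect, residual_var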

  3. Accelerator based techniques for contraband detection

    Science.gov (United States)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials, contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays have the ability to penetrate through various materials at large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and have the ability to image the object with an appreciable degree of reliability. Neutron induced reactions such as (n, γ), (n, n') (n, p) or proton induced γ-resonance absorption are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed and their advantages, characteristics, and current progress are shown. Areas where use of these methods is currently under evaluation are detection of hidden explosives, illicit drug interdiction, chemical war agents identification, nuclear waste assay, nuclear weapons destruction and others.

  4. RapidArc, intensity modulated photon and proton techniques for recurrent prostate cancer in previously irradiated patients: a treatment planning comparison study

    International Nuclear Information System (INIS)

    Weber, Damien C; Miralbell, Raymond; Wang, Hui; Cozzi, Luca; Dipasquale, Giovanna; Khan, Haleem G; Ratib, Osman; Rouzaud, Michel; Vees, Hansjoerg; Zaidi, Habib

    2009-01-01

    A study was performed comparing volumetric modulated arcs (RA) and intensity modulation (with photons, IMRT, or protons, IMPT) radiation therapy (RT) for patients with recurrent prostate cancer after RT. Plans for RA, IMRT and IMPT were optimized for 7 patients. The prescribed dose was 56 Gy in 14 fractions. The recurrent gross tumor volume (GTV) was defined on ¹⁸F-fluorocholine PET/CT scans. Plans aimed to cover at least 95% of the planning target volume with a dose > 50.4 Gy. A maximum dose (Dmax) of 61.6 Gy was allowed to 5% of the GTV. For the urethra, Dmax was constrained to 37 Gy. Rectal Dmedian was < 17 Gy. Results were analyzed using dose-volume histogram and conformity index (CI90) parameters. Tumor coverage (GTV and PTV) was improved with RA (V95% 92.6 ± 7.9 and 83.7 ± 3.3%), when compared to IMRT (V95% 88.6 ± 10.8 and 77.2 ± 2.2%). The corresponding values for IMPT were intermediate for the GTV (V95% 88.9 ± 10.5%) and better for the PTV (V95% 85.6 ± 5.0%). The percentages of rectal and urethral volumes receiving intermediate doses (35 Gy) were significantly decreased with RA (5.1 ± 3.0 and 38.0 ± 25.3%) and IMPT (3.9 ± 2.7 and 25.1 ± 21.1%), when compared to IMRT (9.8 ± 5.3 and 60.7 ± 41.7%). CI90 was 1.3 ± 0.1 for photons and 1.6 ± 0.2 for protons. The integral dose was 1.1 ± 0.5 × 10⁵ Gy·cm³ for IMPT and about a factor of three higher for all photon techniques. RA and IMPT showed improvements in conformal avoidance relative to fixed-beam IMRT for 7 patients with recurrent prostate cancer. IMPT showed further sparing of organs at risk.

  5. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

    We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark is also different from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are considered as classes, and a classification algorithm is used to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.

  6. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and they can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered to be a contraindication for laparoscopy. The aim of this study is to show that insertion of the Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients that previously underwent one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section or abdominal war injuries were the most common causes of previous laparotomy. During those operations, including entry into the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following the diagnostic laparoscopy. In all patients, insertion of the Veres needle and trocar insertion in the umbilical region was performed, namely a technique of closed laparoscopy. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  7. Laser-based techniques for combustion diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, N.

    1997-04-01

    Two-photon-induced Degenerate Four-Wave Mixing, DFWM, was applied for the first time to the detection of CO and NH₃ molecules. Measurements were performed in a cell, and in atmospheric-pressure flames. In the cell measurements, the signal dependence on the pressure and on the laser beam intensity was studied. The possibility of simultaneous detection of NH₃ and OH was investigated. Carbon monoxide and ammonia were also detected employing two-photon-induced Polarization Spectroscopy, PS. In the measurements performed in a cold gas flow, the signal strength dependence on the laser intensity, and on the polarization of the pump beam, was investigated. An approach to improve the spatial resolution of the Amplified Stimulated Emission, ASE, was developed. In this approach, two laser beams at different frequencies were crossed in the sample. If the sum of the frequencies of the two laser beams matches a two-photon resonance of the investigated species, only the molecules in the intersection volume will be excited. NH₃ molecules and C atoms were studied. The potential of using two-photon LIF for two-dimensional imaging of combustion species was investigated. Although LIF is species specific, several species can be detected simultaneously by utilizing spectral coincidences. Combining one- and two-photon processes, OH, NO, and O were detected simultaneously, as well as OH, NO, and NH₃. Collisional quenching is the major source of uncertainty in quantitative applications of LIF. A technique for two-dimensional, absolute species concentration measurements, circumventing the problems associated with collisional quenching, was developed. By applying simple mathematics to the ratio of two LIF signals generated from two counterpropagating laser beams, the absolute species concentration could be obtained. 41 refs

  8. New population-based exome data are questioning the pathogenicity of previously cardiomyopathy-associated genetic variants

    DEFF Research Database (Denmark)

    Andreasen, Charlotte Hartig; Nielsen, Jonas B; Refsgaard, Lena

    2013-01-01

    Cardiomyopathies are a heterogeneous group of diseases with various etiologies. We focused on three genetically determined cardiomyopathies: hypertrophic (HCM), dilated (DCM), and arrhythmogenic right ventricular cardiomyopathy (ARVC). Eighty-four genes have so far been associated with these cardiomyopathies, but the disease-causing effect of reported variants is often dubious. In order to identify possible false-positive variants, we investigated the prevalence of previously reported cardiomyopathy-associated variants in recently published exome data. We searched for reported missense and nonsense variants in the NHLBI-Go Exome Sequencing Project (ESP) containing exome data from 6500 individuals. In ESP, we identified 94 variants out of 687 (14%) variants previously associated with HCM, 58 out of 337 (17%) variants associated with DCM, and 38 variants out of 209 (18%) associated with ARVC...

  9. Reasons for placement of restorations on previously unrestored tooth surfaces by dentists in The Dental Practice-Based Research Network

    DEFF Research Database (Denmark)

    Nascimento, Marcelle M; Gordan, Valeria V; Qvist, Vibeke

    2010-01-01

    The authors conducted a study to identify and quantify the reasons used by dentists in The Dental Practice-Based Research Network (DPBRN) for placing restorations on unrestored permanent tooth surfaces and the dental materials they used in doing so....

  10. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  11. Array-based techniques for fingerprinting medicinal herbs

    Directory of Open Access Journals (Sweden)

    Xue Charlie

    2011-05-01

    Full Text Available Abstract Poor quality control of medicinal herbs has led to instances of toxicity, poisoning and even deaths. The fundamental step in quality control of herbal medicine is accurate identification of herbs. Array-based techniques have recently been adapted to authenticate or identify herbal plants. This article reviews the current array-based techniques, e.g. oligonucleotide microarrays, gene-based probe microarrays, Suppression Subtractive Hybridization (SSH)-based arrays, Diversity Array Technology (DArT) and Subtracted Diversity Array (SDA). We further compare these techniques according to important parameters such as markers, polymorphism rates, restriction enzymes and sample type. The applicability of the array-based methods for fingerprinting depends on the availability of genomics and genetics of the species to be fingerprinted. For species with little genome sequence information but high polymorphism rates, SDA techniques are particularly recommended because they require less labour and lower material cost.

  12. Development of a computerized tomographic system based on the FAN-BEAM technique

    International Nuclear Information System (INIS)

    Junqueira, M.M.; Santos, C.A.C.; Borges, J.C.

    1986-01-01

    The Nuclear Instrumentation Laboratory at COPPE/UFRJ concentrates its research on the development of computerized tomographic systems, looking for applications in industrial and medical non-destructive analysis techniques. In this work we have designed and constructed a tomographic prototype based on the FAN-BEAM technique for irradiating the object under analysis. An algorithm previously developed to analyse parallel beams was modified and adapted to the FAN-BEAM geometry. (Author) [pt
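
    The abstract does not detail how the parallel-beam algorithm was adapted; as an illustration of the kind of geometry change involved, a minimal sketch of mapping equiangular fan-beam rays to parallel-beam coordinates (using the standard relations θ = β + γ and s = R·sin γ; the function and variable names are assumptions, not the authors' code) is:

        import numpy as np

        def fan_ray_to_parallel(beta, gamma, source_radius):
            """Map a fan-beam ray (source angle beta, fan angle gamma, both in radians)
            to parallel-beam coordinates: projection angle theta and detector offset s,
            using the standard equiangular relations theta = beta + gamma, s = R*sin(gamma)."""
            theta = beta + gamma
            s = source_radius * np.sin(gamma)
            return theta, s

        # Example: a ray 2 degrees off the central ray, source at 30 degrees, R = 600 mm
        theta, s = fan_ray_to_parallel(np.deg2rad(30.0), np.deg2rad(2.0), 600.0)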

  13. Biotin IgM Antibodies in Human Blood: A Previously Unknown Factor Eliciting False Results in Biotinylation-Based Immunoassays

    Science.gov (United States)

    Chen, Tingting; Hedman, Lea; Mattila, Petri S.; Jartti, Laura; Jartti, Tuomas; Ruuskanen, Olli; Söderlund-Venermo, Maria; Hedman, Klaus

    2012-01-01

    Biotin is an essential vitamin that binds streptavidin or avidin with high affinity and specificity. As biotin is a small molecule that can be linked to proteins without affecting their biological activity, biotinylation is applied widely in biochemical assays. In our laboratory, IgM enzyme immunoassays (EIAs) of µ-capture format have been set up against many viruses, using as antigen biotinylated virus-like particles (VLPs) detected by horseradish peroxidase-conjugated streptavidin. We recently encountered one serum sample reacting with the biotinylated VLP but not with the unbiotinylated one, suggesting the occurrence of biotin-reactive antibodies in human sera. In the present study, we searched the general population (612 serum samples from adults and 678 from children) for IgM antibodies reactive with biotin and developed an indirect EIA for quantification of their levels and assessment of their seroprevalence. These IgM antibodies were present in 3% of adults regardless of age, but were rarely found in children. The adverse effects of biotin IgM on biotinylation-based immunoassays were assessed, including four in-house and one commercial virus IgM EIAs, showing that biotin IgM does cause false positives. Biotin cannot bind IgM and streptavidin or avidin simultaneously, suggesting that these biotin-interactive compounds compete for the common binding site. In competitive inhibition assays, the affinities of biotin IgM antibodies ranged from 2.1×10−3 to 1.7×10−4 mol/L. This is the first report on biotin antibodies found in humans, providing new information on biotinylation-based immunoassays as well as new insights into the biomedical effects of vitamins. PMID:22879954

  14. Base Oils Biodegradability Prediction with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Malika Trabelsi

    2010-02-01

    Full Text Available In this paper, we apply various data mining techniques, including continuous numeric and discrete classification prediction models of base oil biodegradability, with emphasis on improving prediction accuracy. The results show that highly biodegradable oils can be better predicted through numeric models. In contrast, classification models did not uncover a similar dichotomy. With the exception of Memory Based Reasoning and Decision Trees, the tested classification techniques achieved high classification accuracy. However, the Decision Tree technique helped uncover the most significant predictor. A simple classification rule derived from this predictor resulted in good classification accuracy. The application of this rule enables efficient classification of base oils into either low or high biodegradability classes with high accuracy. For the latter, a higher precision biodegradability prediction can be obtained using continuous modeling techniques.
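
    The paper does not list its attributes or the derived rule; a minimal sketch of how a shallow decision tree can expose a single-predictor classification rule (hypothetical feature data and labels, not the study's dataset) might look like:

        # Hypothetical sketch: a depth-1 tree ("stump") yields the most significant predictor
        # and its threshold for a high/low biodegradability label.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))                    # stand-in physicochemical descriptors
        y = (X[:, 2] > 0.3).astype(int)                  # stand-in high/low biodegradability label

        stump = DecisionTreeClassifier(max_depth=1).fit(X, y)
        feature = stump.tree_.feature[0]                 # index of the most significant predictor
        threshold = stump.tree_.threshold[0]             # split value defining the simple rule
        print(f"rule: classify 'high' if x[{feature}] > {threshold:.2f}")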

  15. Hemoglobin-Based Oxygen Carrier (HBOC) Development in Trauma: Previous Regulatory Challenges, Lessons Learned, and a Path Forward.

    Science.gov (United States)

    Keipert, Peter E

    2017-01-01

    Historically, hemoglobin-based oxygen carriers (HBOCs) were being developed as "blood substitutes," despite their transient circulatory half-life (~ 24 h) vs. transfused red blood cells (RBCs). More recently, HBOC commercial development focused on "oxygen therapeutic" indications to provide a temporary oxygenation bridge until medical or surgical interventions (including RBC transfusion, if required) can be initiated. This included the early trauma trials with HemAssist ® (BAXTER), Hemopure ® (BIOPURE) and PolyHeme ® (NORTHFIELD) for resuscitating hypotensive shock. These trials all failed due to safety concerns (e.g., cardiac events, mortality) and certain protocol design limitations. In 2008 the Food and Drug Administration (FDA) put all HBOC trials in the US on clinical hold due to the unfavorable benefit:risk profile demonstrated by various HBOCs in different clinical studies in a meta-analysis published by Natanson et al. (2008). During standard resuscitation in trauma, organ dysfunction and failure can occur due to ischemia in critical tissues, which can be detected by the degree of lactic acidosis. SANGART'S Phase 2 trauma program with MP4OX therefore added lactate >5 mmol/L as an inclusion criterion to enroll patients who had lost sufficient blood to cause a tissue oxygen debt. This was key to the successful conduct of their Phase 2 program (ex-US, from 2009 to 2012) to evaluate MP4OX as an adjunct to standard fluid resuscitation and transfusion of RBCs. In 2013, SANGART shared their Phase 2b results with the FDA, and succeeded in getting the FDA to agree that a planned Phase 2c higher dose comparison study of MP4OX in trauma could include clinical sites in the US. Unfortunately, SANGART failed to secure new funding and was forced to terminate development and operations in Dec 2013, even though a regulatory path forward with FDA approval to proceed in trauma had been achieved.

  16. improvement of digital image watermarking techniques based on FPGA implementation

    International Nuclear Information System (INIS)

    EL-Hadedy, M.E

    2006-01-01

    Digital watermarking establishes the ownership of a piece of digital data by marking the data invisibly or visibly. It can be used to protect several types of multimedia objects such as audio, text, image and video. This thesis demonstrates different types of watermarking techniques, such as discrete cosine transform (DCT) and discrete wavelet transform (DWT) based methods, and their characteristics. It then classifies these techniques, stating their advantages and disadvantages. An improved technique with distinguished performance in terms of peak signal-to-noise ratio (PSNR) and similarity ratio (SR) is introduced. The modified technique is compared with the other techniques by measuring their robustness against different attacks. Finally, a field programmable gate array (FPGA) based implementation of the proposed watermarking technique, and a comparison, are presented and discussed.
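
    PSNR is the main figure of merit quoted here; a minimal sketch of computing PSNR between an original and a watermarked 8-bit image (the function and argument names are illustrative) could be:

        import numpy as np

        def psnr(original, marked, peak=255.0):
            """Peak signal-to-noise ratio in dB between two equally sized 8-bit images."""
            original = np.asarray(original, dtype=float)
            marked = np.asarray(marked, dtype=float)
            mse = np.mean((original - marked) ** 2)
            if mse == 0:
                return float("inf")                  # identical images
            return 10.0 * np.log10(peak ** 2 / mse)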

  17. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Cremers, D.A.; Archuleta, F.L.; Dilworth, H.C.

    1985-01-01

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  18. Power system stabilizers based on modern control techniques

    Energy Technology Data Exchange (ETDEWEB)

    Malik, O P; Chen, G P; Zhang, Y; El-Metwally, K [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering

    1994-12-31

    Developments in digital technology have made it feasible to develop and implement improved controllers based on sophisticated control techniques. Power system stabilizers based on adaptive control, fuzzy logic and artificial neural networks are being developed. Each of these control techniques possesses unique features and strengths. In this paper, the relative performance of power system stabilizers based on adaptive control, fuzzy logic and neural networks, both in simulation studies and real time tests on a physical model of a power system, is presented and compared to that of a fixed parameter conventional power system stabilizer. (author) 16 refs., 45 figs., 3 tabs.

  19. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  20. An Image Registration Based Technique for Noninvasive Vascular Elastography

    OpenAIRE

    Valizadeh, Sina; Makkiabadi, Bahador; Mirbagheri, Alireza; Soozande, Mehdi; Manwar, Rayyan; Mozaffarzadeh, Moein; Nasiriavanaki, Mohammadreza

    2018-01-01

    Non-invasive vascular elastography is an emerging technique in vascular tissue imaging. During the past decades, several techniques have been suggested to estimate tissue elasticity by measuring the displacement of the carotid vessel wall. Cross-correlation-based methods are the most prevalent approaches to measure the strain exerted on the vessel wall by the blood pressure. In the case of a low pressure, the displacement is too small to be apparent in ultrasound imaging, especially in th...
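
    The cross-correlation approach the abstract refers to estimates wall displacement by finding the lag that best aligns a signal window before and after deformation; a minimal 1-D sketch (window handling and search range are assumptions, not the authors' method) is:

        import numpy as np

        def estimate_displacement(pre, post, search=10):
            """Return the integer lag (in samples) maximizing the normalized cross-correlation
            between a pre-deformation window `pre` and the post-deformation signal `post`.
            `post` must contain at least len(pre) + 2*search samples."""
            pre = (pre - pre.mean()) / (pre.std() + 1e-12)
            best_lag, best_score = 0, -np.inf
            for lag in range(-search, search + 1):
                seg = post[search + lag: search + lag + len(pre)]
                seg = (seg - seg.mean()) / (seg.std() + 1e-12)
                score = float(np.dot(pre, seg)) / len(pre)
                if score > best_score:
                    best_lag, best_score = lag, score
            return best_lag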

  1. Memory Based Machine Intelligence Techniques in VLSI hardware

    OpenAIRE

    James, Alex Pappachen

    2012-01-01

    We briefly introduce the memory based approaches to emulate machine intelligence in VLSI hardware, describing the challenges and advantages. Implementation of artificial intelligence techniques in VLSI hardware is a practical and difficult problem. Deep architectures, hierarchical temporal memories and memory networks are some of the contemporary approaches in this area of research. The techniques attempt to emulate low level intelligence tasks and aim at providing scalable solutions to high ...

  2. The Research of Histogram Enhancement Technique Based on Matlab Software

    Directory of Open Access Journals (Sweden)

    Li Kai

    2014-08-01

    Full Text Available Histogram enhancement techniques have been widely applied as a typical pattern in digital image processing. Based on Matlab software, the paper uses the two approaches of histogram equalization and histogram specification to deal with darker images, applying partial equalization and histogram mapping to transform the original histograms and thereby enhance the image information. The results show that both techniques can significantly improve image quality and enhance image features.
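
    Histogram equalization itself is standard; a minimal NumPy sketch (a Python analogue of the Matlab processing described, for an 8-bit grayscale image) is:

        import numpy as np

        def equalize_histogram(image):
            """Histogram equalization of an 8-bit grayscale image via its cumulative distribution."""
            hist = np.bincount(image.ravel(), minlength=256)
            cdf = np.cumsum(hist).astype(float)
            cdf_min = cdf[np.nonzero(cdf)][0]            # first non-zero cumulative count
            if cdf[-1] == cdf_min:                       # flat image: nothing to equalize
                return image.copy()
            lut = np.clip(np.round(255.0 * (cdf - cdf_min) / (cdf[-1] - cdf_min)), 0, 255)
            return lut.astype(np.uint8)[image]           # apply the gray-level mapping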

  3. Laser-based direct-write techniques for cell printing

    Energy Technology Data Exchange (ETDEWEB)

    Schiele, Nathan R; Corr, David T [Biomedical Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States); Huang Yong [Department of Mechanical Engineering, Clemson University, Clemson, SC (United States); Raof, Nurazhani Abdul; Xie Yubing [College of Nanoscale Science and Engineering, University at Albany, SUNY, Albany, NY (United States); Chrisey, Douglas B, E-mail: schien@rpi.ed, E-mail: chrisd@rpi.ed [Material Science and Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States)

    2010-09-15

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  4. Laser-based direct-write techniques for cell printing

    International Nuclear Information System (INIS)

    Schiele, Nathan R; Corr, David T; Huang Yong; Raof, Nurazhani Abdul; Xie Yubing; Chrisey, Douglas B

    2010-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  5. Leak detection of complex pipelines based on the filter diagonalization method: robust technique for eigenvalue assessment

    International Nuclear Information System (INIS)

    Lay-Ekuakille, Aimé; Pariset, Carlo; Trotta, Amerigo

    2010-01-01

    The FDM (filter diagonalization method), an interesting technique used in nuclear magnetic resonance data processing for tackling FFT (fast Fourier transform) limitations, can be used by considering pipelines, especially complex configurations, as a vascular apparatus with arteries, veins, capillaries, etc. Thrombosis, which might occur in humans, can be considered as a leakage for the complex pipeline, the human vascular apparatus. The choice of eigenvalues in FDM or in spectra-based techniques is a key issue in recovering the solution of the main equation (for FDM) or frequency domain transformation (for FFT) in order to determine the accuracy in detecting leaks in pipelines. This paper deals with the possibility of improving the leak detection accuracy of the FDM technique thanks to a robust algorithm by assessing the problem of eigenvalues, making it less experimental and more analytical using Tikhonov-based regularization techniques. The paper starts from the results of previous experimental procedures carried out by the authors
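
    Tikhonov regularization, which the authors use to make the eigenvalue assessment more analytical, amounts to solving a damped least-squares problem; a generic sketch (not the paper's specific FDM formulation, and with an assumed regularization parameter) is:

        import numpy as np

        def tikhonov_solve(A, b, lam=1e-3):
            """Solve min ||A x - b||^2 + lam * ||x||^2 via the regularized normal equations.
            The regularization parameter `lam` is an assumption; it would be tuned to the data."""
            A = np.asarray(A, dtype=float)
            b = np.asarray(b, dtype=float)
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)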

  6. Effectiveness of Ritonavir-Boosted Protease Inhibitor Monotherapy in Clinical Practice Even with Previous Virological Failures to Protease Inhibitor-Based Regimens.

    Directory of Open Access Journals (Sweden)

    Luis F López-Cortés

    Full Text Available Significant controversy still exists about ritonavir-boosted protease inhibitor monotherapy (mtPI/rtv) as a simplification strategy that is used up to now to treat patients that have not experienced previous virological failure (VF) while on protease inhibitor (PI)-based regimens. We have evaluated the effectiveness of two mtPI/rtv regimens in an actual clinical practice setting, including patients that had experienced previous VF with PI-based regimens. This retrospective study analyzed 1060 HIV-infected patients with undetectable viremia that were switched to lopinavir/ritonavir or darunavir/ritonavir monotherapy. In cases in which the patient had previously experienced VF while on a PI-based regimen, the lack of major HIV protease resistance mutations to lopinavir or darunavir, respectively, was mandatory. The primary endpoint of this study was the percentage of participants with virological suppression after 96 weeks according to intention-to-treat analysis (non-complete/missing = failure). A total of 1060 patients were analyzed, including 205 with previous VF while on PI-based regimens, 90 of whom were on complex therapies due to extensive resistance. The rates of treatment effectiveness (intention-to-treat analysis) and virological efficacy (on-treatment analysis) at week 96 were 79.3% (CI95, 76.8-81.8) and 91.5% (CI95, 89.6-93.4), respectively. No relationships were found between VF and earlier VF while on PI-based regimens, the presence of major or minor protease resistance mutations, the previous time on viral suppression, CD4+ T-cell nadir, and HCV-coinfection. Genotypic resistance tests were available in 49 out of the 74 patients with VFs and only four patients presented new major protease resistance mutations. Switching to mtPI/rtv achieves sustained virological control in most patients, even in those with previous VF on PI-based regimens, as long as no major resistance mutations are present for the administered drug.

  7. An improved visualization-based force-measurement technique for short-duration hypersonic facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurence, Stuart J.; Karl, Sebastian [Institute of Aerodynamics and Flow Technology, Spacecraft Section, German Aerospace Center (DLR), Goettingen (Germany)

    2010-06-15

    This article is concerned with describing and exploring the limitations of an improved version of a recently proposed visualization-based technique for the measurement of forces and moments in short-duration hypersonic wind tunnels. The technique is based on tracking the motion of a free-flying body over a sequence of high-speed visualizations; while this idea is not new in itself, the use of high-speed digital cinematography combined with a highly accurate least-squares tracking algorithm allows improved results over what has previously been possible with such techniques. The technique precision is estimated through the analysis of artificially constructed and experimental test images, and the resulting error in acceleration measurements is characterized. For wind-tunnel scale models, position measurements to within a few microns are shown to be readily attainable. Image data from two previous experimental studies in the T5 hypervelocity shock tunnel are then reanalyzed with the improved technique: the uncertainty in the mean drag acceleration is shown to be reduced to the order of the flow unsteadiness, 2-3%, and time-resolved acceleration measurements are also shown to be possible. The response time of the technique for the configurations studied is estimated to be ∼0.5 ms. Comparisons with computations using the DLR TAU code also yield agreement to within the overall experimental uncertainty. Measurement of the pitching moment for blunt geometries still appears challenging, however. (orig.)

  8. Estimate-Merge-Technique-based algorithms to track an underwater ...

    Indian Academy of Sciences (India)

    D V A N Ravi Kumar

    2017-07-04

    Jul 4, 2017 ... In this paper, two novel methods based on the Estimate Merge Technique ... mentioned advantages of the proposed novel methods is shown by carrying out Monte Carlo simulation in .... equations are converted to sequential equations to make ... estimation error and low convergence time) at feasibly high.

  9. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    24

    This study shows the potency of two GIS-based data driven bivariate techniques namely ... In the view of these weaknesses, there is a strong requirement for reassessment of .... ...... West Bengal (India) using remote sensing, geographical information system and multi-.

  10. Learning Physics through Project-Based Learning Game Techniques

    Science.gov (United States)

    Baran, Medine; Maskan, Abdulkadir; Yasar, Seyma

    2018-01-01

    The aim of the present study, in which Project and game techniques are used together, is to examine the impact of project-based learning games on students' physics achievement. Participants of the study consist of 34 9th grade students (N = 34). The data were collected using achievement tests and a questionnaire. Throughout the applications, the…

  11. Ultrabroadband Phased-Array Receivers Based on Optical Techniques

    Science.gov (United States)

    2016-02-26

    bandwidths, and with it receiver noise floors, are unavoidable. Figure 1. SNR of a thermally limited receiver based on Friis equation showing the...techniques for RF and photonic integration based on liquid crystal polymer substrates were pursued that would aid in the realization of potential imaging...These models assumed that sufficient LNA gain was used on the antenna to set the noise floor of the imaging receiver, which necessitated physical

  12. Optical supervised filtering technique based on Hopfield neural network

    Science.gov (United States)

    Bal, Abdullah

    2004-11-01

    Hopfield neural network is commonly preferred for optimization problems. In image segmentation, conventional Hopfield neural networks (HNN) are formulated as a cost-function-minimization problem to perform gray level thresholding on the image histogram or the pixels' gray levels arranged in a one-dimensional array [R. Sammouda, N. Niki, H. Nishitani, Pattern Rec. 30 (1997) 921-927; K.S. Cheng, J.S. Lin, C.W. Mao, IEEE Trans. Med. Imag. 15 (1996) 560-567; C. Chang, P. Chung, Image and Vision comp. 19 (2001) 669-678]. In this paper, a new high-speed supervised filtering technique is proposed for image feature extraction and enhancement problems by modifying the conventional HNN. The essential improvement in this technique is the use of a 2D convolution operation instead of weight-matrix multiplication. Thereby, a new neural-network-based filtering technique has been obtained that requires just a 3 × 3 filter mask matrix instead of a large weight coefficient matrix. Optical implementation of the proposed filtering technique is executed easily using the joint transform correlator. The requirement of non-negative data for optical implementation is met by a bias technique that converts the bipolar data to non-negative data. Simulation results of the proposed optical supervised filtering technique are reported for various feature extraction problems such as edge detection, corner detection, horizontal and vertical line extraction, and fingerprint enhancement.
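
    The key change, replacing the large weight-matrix multiplication with a 2-D convolution against a 3 × 3 mask, can be illustrated with a generic sketch; the Sobel-style edge mask here is only an example, not the paper's trained mask:

        import numpy as np
        from scipy.signal import convolve2d

        # Example 3x3 mask for horizontal edge extraction (illustrative only)
        mask = np.array([[-1, -2, -1],
                         [ 0,  0,  0],
                         [ 1,  2,  1]], dtype=float)

        def filter_image(image, mask):
            """Filter an image with a small 3x3 mask via 2-D convolution
            instead of a full weight-matrix multiplication."""
            return convolve2d(image.astype(float), mask, mode="same", boundary="symm")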

  13. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    Directory of Open Access Journals (Sweden)

    Mohamed M. Ibrahim

    2014-01-01

    Full Text Available Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation creates an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
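
    The Arnold transform used here for watermark encryption is a simple pixel-coordinate scrambling on an N×N image; a minimal sketch of the standard cat map (the iteration count is an assumption, not a parameter from the paper) is:

        import numpy as np

        def arnold_scramble(image, iterations=1):
            """Scramble an N x N watermark with the Arnold cat map:
            (x, y) -> (x + y mod N, x + 2y mod N), applied `iterations` times."""
            assert image.shape[0] == image.shape[1], "Arnold transform needs a square image"
            n = image.shape[0]
            out = image.copy()
            for _ in range(iterations):
                scrambled = np.empty_like(out)
                for x in range(n):
                    for y in range(n):
                        scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
                out = scrambled
            return out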

  14. Video multiple watermarking technique based on image interlacing using DWT.

    Science.gov (United States)

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation creates an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.

  15. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions...... into the operation of the grid-connected power converters. This paper describes a quasi passive method for estimating the line impedance of the distribution electricity network. The method uses the model based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi...

  16. Biometric image enhancement using decision rule based image fusion techniques

    Science.gov (United States)

    Sagayee, G. Mary Amirtha; Arumugam, S.

    2010-02-01

    Introducing biometrics into information systems may result in considerable benefits. Most researchers confirm that the fingerprint is more widely used than the iris or face and, moreover, is the primary choice for most privacy-concerned applications. For fingerprint applications, choosing the proper sensor is a challenge. The proposed work deals with how image quality can be improved by introducing an image fusion technique at the sensor level. The results of the images after introducing the decision rule based image fusion technique are evaluated and analyzed with their entropy levels and root mean square error.

  17. Electromagnetism based atmospheric ice sensing technique - A conceptual review

    Directory of Open Access Journals (Sweden)

    U Mughal

    2016-09-01

    Full Text Available Electromagnetic and vibrational properties of ice can be used to measure certain parameters such as ice thickness, type and icing rate. In this paper we present a review of the dielectric based measurement techniques for matter and the dielectric/spectroscopic properties of ice. Atmospheric Ice is a complex material with a variable dielectric constant, but precise calculation of this constant may form the basis for measurement of its other properties such as thickness and strength using some electromagnetic methods. Using time domain or frequency domain spectroscopic techniques, by measuring both the reflection and transmission characteristics of atmospheric ice in a particular frequency range, the desired parameters can be determined.

  18. Proposing a Wiki-Based Technique for Collaborative Essay Writing

    Directory of Open Access Journals (Sweden)

    Mabel Ortiz Navarrete

    2014-10-01

    Full Text Available This paper aims at proposing a technique for students learning English as a foreign language when they collaboratively write an argumentative essay in a wiki environment. A wiki environment and collaborative work play an important role within the academic writing task. Nevertheless, an appropriate and systematic work assignment is required in order to make use of both. The technique proposed here for writing a collaborative essay mainly attempts to provide an effective way to enhance equal participation among group members, taking computer-mediated collaboration as its base. Within this context, the students’ role is clearly defined and individual and collaborative tasks are explained.

  19. Knowledge based systems advanced concepts, techniques and applications

    CERN Document Server

    1997-01-01

    The field of knowledge-based systems (KBS) has expanded enormously during the last years, and many important techniques and tools are currently available. Applications of KBS range from medicine to engineering and aerospace.This book provides a selected set of state-of-the-art contributions that present advanced techniques, tools and applications. These contributions have been prepared by a group of eminent researchers and professionals in the field.The theoretical topics covered include: knowledge acquisition, machine learning, genetic algorithms, knowledge management and processing under unc

  20. Case-based reasoning diagnostic technique based on multi-attribute similarity

    Energy Technology Data Exchange (ETDEWEB)

    Makoto, Takahashi [Tohoku University, Miyagi (Japan); Akio, Gofuku [Okayama University, Okayamaa (Japan)

    2014-08-15

    A case-based diagnostic technique has been developed based on multi-attribute similarity. A specific feature of the developed system is the use of multiple attributes of process signals for similarity evaluation to retrieve a similar case stored in a case base. The present technique has been applied to measurement data from Monju with some simulated anomalies. The results of numerical experiments showed that the present technique can be utilized as one of the methods for a hybrid-type diagnosis system.
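
    The retrieval step can be pictured as a weighted nearest-neighbour search over attribute vectors extracted from the process signals; the sketch below is a generic illustration of that idea (the weights and attribute layout are assumptions, not the authors' system):

        import numpy as np

        def retrieve_most_similar(query_attrs, case_base, weights):
            """Return the index of the stored case whose attribute vector is closest to the
            query under a weighted Euclidean distance. `case_base` is an
            (n_cases, n_attributes) array; `weights` has n_attributes entries."""
            query = np.asarray(query_attrs, dtype=float)
            cases = np.asarray(case_base, dtype=float)
            w = np.asarray(weights, dtype=float)
            distances = np.sqrt(((cases - query) ** 2 * w).sum(axis=1))
            return int(np.argmin(distances))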

  1. An Observed Voting System Based On Biometric Technique

    Directory of Open Access Journals (Sweden)

    B. Devikiruba

    2015-08-01

    Full Text Available ABSTRACT This article describes a computational framework, which can run on almost every computer connected to an IP-based network, for studying biometric techniques. This paper discusses how a system protecting confidential information puts strong security demands on identification. Biometry provides a user-friendly method for this identification and is becoming a competitor to current identification mechanisms. The experimentation section focuses on biometric verification, specifically based on fingerprints. This article should be read as a warning to those thinking of using such methods of identification without first examining the technical opportunities for compromising the mechanisms and the associated legal consequences. The development is based on the Java language, which makes it easy to improve the software packages used to test new control techniques.

  2. Current STR-based techniques in forensic science

    Directory of Open Access Journals (Sweden)

    Phuvadol Thanakiatkrai

    2013-01-01

    Full Text Available DNA analysis in forensic science is mainly based on short tandem repeat (STR genotyping. The conventional analysis is a three-step process of DNA extraction, amplification and detection. An overview of various techniques that are currently in use and are being actively researched for STR typing is presented. The techniques are separated into STR amplification and detection. New techniques for forensic STR analysis focus on increasing sensitivity, resolution and discrimination power for suboptimal samples. These are achieved by shifting primer-binding sites, using high-fidelity and tolerant polymerases and applying novel methods to STR detection. Examples in which STRs are used in criminal investigations are provided and future research directions are discussed.

  3. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    Directory of Open Access Journals (Sweden)

    Jonathan Lueke

    2011-01-01

    Full Text Available Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  4. MEMS-based power generation techniques for implantable biosensing applications.

    Science.gov (United States)

    Lueke, Jonathan; Moussa, Walied A

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  5. Power system dynamic state estimation using prediction based evolutionary technique

    International Nuclear Information System (INIS)

    Basetti, Vedik; Chandel, Ashwani K.; Chandel, Rajeevan

    2016-01-01

    In this paper, a new robust LWS (least winsorized square) estimator is proposed for dynamic state estimation of a power system. One of the main advantages of this estimator is that it has an inbuilt bad data rejection property and is less sensitive to bad data measurements. In the proposed approach, Brown's double exponential smoothing technique has been utilised for its reliable performance at the prediction step. The state estimation problem is solved as an optimisation problem using a new jDE-self adaptive differential evolution with prediction based population re-initialisation technique at the filtering step. This new stochastic search technique has been embedded with different state scenarios using the predicted state. The effectiveness of the proposed LWS technique is validated under different conditions, namely normal operation, bad data, sudden load change, and loss of transmission line conditions on three different IEEE test bus systems. The performance of the proposed approach is compared with the conventional extended Kalman filter. On the basis of various performance indices, the results thus obtained show that the proposed technique increases the accuracy and robustness of power system dynamic state estimation performance. - Highlights: • To estimate the states of the power system under dynamic environment. • The performance of the EKF method is degraded during anomaly conditions. • The proposed method remains robust towards anomalies. • The proposed method provides precise state estimates even in the presence of anomalies. • The results show that prediction accuracy is enhanced by using the proposed model.
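
    The abstract relies on Brown's double exponential smoothing at the prediction step. Below is a minimal, self-contained sketch of that smoothing/prediction idea (not the paper's full jDE-based estimator); the smoothing constant `alpha`, the synthetic series and the function name are illustrative assumptions.

```python
import numpy as np

def brown_predict(x, alpha=0.4, horizon=1):
    """One-step (or multi-step) prediction of a 1-D series using Brown's
    double exponential smoothing (linear exponential smoothing)."""
    x = np.asarray(x, dtype=float)
    s1 = s2 = x[0]                       # first and second smoothed statistics
    for value in x:
        s1 = alpha * value + (1.0 - alpha) * s1
        s2 = alpha * s1 + (1.0 - alpha) * s2
    level = 2.0 * s1 - s2                # estimated level at the last sample
    trend = alpha / (1.0 - alpha) * (s1 - s2)
    return level + horizon * trend

# Example: predict the next sample of a noisy ramp (a stand-in for a slowly
# varying state trajectory between measurement updates).
t = np.arange(50)
series = 0.3 * t + np.random.default_rng(0).normal(scale=0.2, size=t.size)
print("one-step-ahead prediction:", round(brown_predict(series), 3))
```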

  6. Fractal Image Compression Based on High Entropy Values Technique

    Directory of Open Access Journals (Sweden)

    Douaa Younis Abbaas

    2018-04-01

    Full Text Available Many attempts have been made to improve the encoding stage of FIC because it is time consuming. These attempts reduce the size of the search pool for range-domain matching, but most of them lead to poor quality or a lower compression ratio in the reconstructed image. This paper presents a method to improve the performance of the full search algorithm by combining FIC (lossy compression) with a lossless technique (in this case, entropy coding). The entropy technique reduces the size of the domain pool (i.e., the number of domain blocks) based on the entropy value of each range block and domain block. The results of the full search algorithm and of the proposed entropy-based algorithm are then compared to see which gives the better results, such as reduced encoding time with acceptable values of both compression quality parameters, C.R (Compression Ratio) and PSNR (Image Quality). The experimental results prove that the proposed entropy technique reduces the encoding time while keeping the compression ratio and the quality of the reconstructed image as good as possible.
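
    A minimal sketch of the domain-pool reduction idea described above: compute the Shannon entropy of each block from its gray-level histogram and keep only the domain blocks whose entropy is close to that of the current range block. The block sizes, the tolerance `tol` and the random test image are illustrative assumptions, not values from the paper.

```python
import numpy as np

def block_entropy(block, bins=256):
    """Shannon entropy (in bits) of a block's gray-level histogram."""
    hist, _ = np.histogram(block, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def split_blocks(img, size):
    """Split an image into non-overlapping size x size blocks."""
    h, w = img.shape
    return [img[r:r + size, c:c + size]
            for r in range(0, h - size + 1, size)
            for c in range(0, w - size + 1, size)]

def reduced_domain_pool(range_block, domain_blocks, tol=0.3):
    """Keep only domains whose entropy is within `tol` bits of the range block."""
    e_r = block_entropy(range_block)
    return [d for d in domain_blocks if abs(block_entropy(d) - e_r) <= tol]

# Example on a random test image (8x8 range blocks, 16x16 domain blocks)
img = np.random.default_rng(1).integers(0, 256, (128, 128)).astype(np.uint8)
ranges, domains = split_blocks(img, 8), split_blocks(img, 16)
pool = reduced_domain_pool(ranges[0], domains)
print(f"domain pool reduced from {len(domains)} to {len(pool)} blocks")
```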

  7. Characterization techniques for graphene-based materials in catalysis

    Directory of Open Access Journals (Sweden)

    Maocong Hu

    2017-06-01

    Full Text Available Graphene-based materials have been studied in a wide range of applications including catalysis due to their outstanding electronic, thermal, and mechanical properties. The unprecedented features of graphene-based catalysts, which are believed to be responsible for their superior performance, have been characterized by many techniques. In this article, we comprehensively summarized the characterization methods covering bulk and surface structure analysis, chemisorption ability determination, and reaction mechanism investigation. We reviewed the advantages/disadvantages of different techniques including Raman spectroscopy, X-ray photoelectron spectroscopy (XPS), Fourier transform infrared spectroscopy (FTIR) and diffuse reflectance Fourier transform infrared spectroscopy (DRIFTS), X-ray diffraction (XRD), X-ray absorption near edge structure (XANES) and X-ray absorption fine structure (XAFS), atomic force microscopy (AFM), scanning electron microscopy (SEM), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), ultraviolet-visible spectroscopy (UV-vis), X-ray fluorescence (XRF), inductively coupled plasma mass spectrometry (ICP), thermogravimetric analysis (TGA), Brunauer–Emmett–Teller (BET) analysis, and scanning tunneling microscopy (STM). The application of temperature-programmed reduction (TPR), CO chemisorption, and NH3/CO2 temperature-programmed desorption (TPD) was also briefly introduced. Finally, we discussed the challenges and provided possible suggestions on choosing characterization techniques. This review provides key information to the catalysis community for adopting suitable characterization techniques in their research.

  8. IoT Security Techniques Based on Machine Learning

    OpenAIRE

    Xiao, Liang; Wan, Xiaoyue; Lu, Xiaozhen; Zhang, Yanyong; Wu, Di

    2018-01-01

    Internet of things (IoT) that integrate a variety of devices into networks to provide advanced and intelligent services have to protect user privacy and address attacks such as spoofing attacks, denial of service attacks, jamming and eavesdropping. In this article, we investigate the attack model for IoT systems, and review the IoT security solutions based on machine learning techniques including supervised learning, unsupervised learning and reinforcement learning. We focus on the machine le...

  9. The development of additive manufacturing technique for nickel-base alloys: A review

    Science.gov (United States)

    Zadi-Maad, Ahmad; Basuki, Arif

    2018-04-01

    Nickel-base alloys are attractive due to their excellent mechanical properties and their high resistance to creep deformation, corrosion, and oxidation. However, it is difficult to control their performance when casting or forging this material. In recent years, the additive manufacturing (AM) process has been implemented to replace the conventional directional solidification process for the production of nickel-base alloys. Due to its potentially lower cost and flexible manufacturing process, AM is considered a substitute for the existing techniques. This paper provides a comprehensive review of previous work related to AM techniques for Ni-base alloys, highlighting current challenges and methods for solving them. The properties of conventionally manufactured Ni-base alloys are also compared with those of the AM-fabricated alloys. The mechanical properties obtained from tension, hardness and fatigue tests are included, along with discussions of the effect of post-treatment processes. Recommendations for further work are also provided.

  10. An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    Science.gov (United States)

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  11. Comparing Four Touch-Based Interaction Techniques for an Image-Based Audience Response System

    NARCIS (Netherlands)

    Jorritsma, Wiard; Prins, Jonatan T.; van Ooijen, Peter M. A.

    2015-01-01

    This study aimed to determine the most appropriate touch-based interaction technique for I2Vote, an image-based audience response system for radiology education in which users need to accurately mark a target on a medical image. Four plausible techniques were identified: land-on, take-off,

  12. A DIFFERENT WEB-BASED GEOCODING SERVICE USING FUZZY TECHNIQUES

    Directory of Open Access Journals (Sweden)

    P. Pahlavani

    2015-12-01

    Full Text Available Geocoding – the process of finding a position based on descriptive data such as an address or postal code – is one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps offer geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when using them. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search only by address matching based on descriptive data. There are also limitations in how search results are displayed. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes integrating fuzzy techniques into the geocoding process to resolve these limitations, and a web-based system is designed to implement the proposed method. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides several capabilities for users, such as searching multi-part addresses, searching for places based on their location, non-point representation of results, and displaying search results ranked by priority.
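
    A minimal sketch of the fuzzy-distance-map idea mentioned above: a linear "nearness" membership function is evaluated over a grid for two hypothetical reference places, and the maps are combined with a fuzzy-overlay intersection (minimum). The place coordinates, grid resolution and membership breakpoints are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def nearness(dist, d_full=100.0, d_zero=1000.0):
    """Fuzzy 'near' membership: 1 within d_full metres, falling linearly to 0 at d_zero."""
    return np.clip((d_zero - dist) / (d_zero - d_full), 0.0, 1.0)

# Hypothetical query: find grid cells that are "near a school AND near a park".
grid_y, grid_x = np.mgrid[0:2000:50, 0:2000:50]            # 50 m cells over a 2 km square
school, park = (400.0, 600.0), (1500.0, 900.0)              # assumed place coordinates (m)

d_school = np.hypot(grid_x - school[0], grid_y - school[1])
d_park = np.hypot(grid_x - park[0], grid_y - park[1])

# Fuzzy overlay: intersection of the two fuzzy distance maps via the minimum operator
score = np.minimum(nearness(d_school), nearness(d_park))

best = np.unravel_index(np.argmax(score), score.shape)
print("best cell (row, col):", best, " membership:", round(float(score[best]), 3))
```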

  13. SKILLS-BASED ECLECTIC TECHNIQUES MATRIX FOR ELT MICROTEACHINGS

    Directory of Open Access Journals (Sweden)

    İskender Hakkı Sarıgöz

    2016-10-01

    Full Text Available Foreign language teaching undergoes constant change due to methodological improvement. This progress may be examined in two parts: the methods era and the post-methods era. It is not pragmatic today to propose a particular language teaching method and its techniques for all purposes; the holistic inflexibility of mid-century methods is long gone. In the present day, constructivist foreign language teaching trends attempt to see the learner as a whole person and an individual who may differ from the other students in many respects. At the same time, individual differences should not keep learners away from group harmony. For this reason, current teacher training programs require eclectic teaching matrices for unit design that take mixed-ability student groups into account. These matrices can be prepared in a multidimensional fashion because there are many functional techniques in different methods, as well as new techniques that instructors may create freely in accordance with the teaching aims. The hypothesis in this argument is that the collection of foreign language teaching techniques compiled in ELT microteachings for a particular group of learners has to be arranged eclectically in order to update the teaching process. Nevertheless, designing a teaching format of this sort is a demanding and highly criticized task. This study briefly discusses eclecticism in the language-skills-based methodological debate from the perspective of ELT teacher education.

  14. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time domain analysis and crosstalk effects in VLSI interconnects have emerged as essential design criteria. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used to estimate crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
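
    As a rough illustration of the FDTD approach described above, the sketch below leapfrogs the telegrapher's equations for a single lossy RLC line (not the coupled SWCNT bundle of the paper), with the time step chosen from the Courant condition. The per-unit-length parameters, line length and source shape are illustrative assumptions.

```python
import numpy as np

# Illustrative per-unit-length line parameters (not the paper's SWCNT model)
R, L, C = 5.0, 250e-9, 100e-12            # ohm/m, H/m, F/m
length, nz = 0.01, 200                    # 1 cm line discretised into 200 cells
dx = length / nz
dt = 0.9 * dx * np.sqrt(L * C)            # Courant condition: dt <= dx*sqrt(L*C)

V = np.zeros(nz + 1)                      # node voltages
I = np.zeros(nz)                          # cell currents (staggered by half a cell)

def source(t, v_dd=1.0, t_rise=20e-12):
    """Ramped step driving the near end of the line."""
    return v_dd * min(t / t_rise, 1.0)

for n in range(1200):
    t = n * dt
    # Voltage update from the spatial difference of the currents
    V[1:nz] -= dt / (C * dx) * (I[1:] - I[:-1])
    V[0] = source(t)                      # ideal source at the near end
    V[nz] += dt / (C * dx) * I[-1]        # open-circuit far end (no outgoing current)
    # Current update from the voltage gradient, including the series loss R
    I -= dt / L * ((V[1:] - V[:-1]) / dx + R * I)

print("far-end voltage after %.0f ps: %.3f V" % (1200 * dt * 1e12, V[nz]))
```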

  15. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that incorporates Markov Random Field (MRF), watershed segmentation and merging techniques to perform image segmentation and edge detection. MRF is used to obtain an initial estimate of the regions in the image under processing, where, in the MRF model, the gray level x at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmented result, which is obtained using the K-means clustering technique and the minimum distance; the region process is then modeled by MRF to obtain an image containing different intensity regions. Starting from this, the gradient values of that image are calculated and a watershed technique is employed. The MRF step produces an image that has different intensity regions and contains all the edge and region information; the watershed algorithm then improves the segmentation result by superimposing a closed and accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merge process based on averaged mean values is employed. The final segmentation and edge detection result is one closed boundary per actual region in the image.
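
    A minimal sketch of the K-means initialisation step mentioned above (gray-level clustering only; the MRF modelling, watershed and merging stages are not included). The synthetic three-region image, number of clusters and iteration count are illustrative assumptions.

```python
import numpy as np

def kmeans_gray(img, k=3, iters=20):
    """1-D K-means on gray levels: returns a label image that could serve as the
    initial segmentation before MRF modelling and watershed refinement."""
    pixels = img.astype(float).ravel()
    centers = np.linspace(pixels.min(), pixels.max(), k)    # spread initial centroids
    for _ in range(iters):
        labels = np.argmin(np.abs(pixels[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean()
    return labels.reshape(img.shape), centers

# Example: synthetic image with three intensity regions plus noise
img = np.full((64, 64), 40.0)
img[:, 20:40] = 120.0
img[:, 40:] = 200.0
img += np.random.default_rng(0).normal(scale=10, size=img.shape)
labels, centers = kmeans_gray(img, k=3)
print("estimated class means:", np.round(np.sort(centers), 1))
```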

  16. Microrheometric upconversion-based techniques for intracellular viscosity measurements

    Science.gov (United States)

    Rodríguez-Sevilla, Paloma; Zhang, Yuhai; de Sousa, Nuno; Marqués, Manuel I.; Sanz-Rodríguez, Francisco; Jaque, Daniel; Liu, Xiaogang; Haro-González, Patricia

    2017-08-01

    Rheological parameters (viscosity, creep compliance and elasticity) play an important role in cell function and viability. For this reason, different strategies have been developed for their study. In this work, two new microrheometric techniques are presented. Both methods take advantage of the analysis of the polarized emission of an upconverting particle to determine its orientation inside the optical trap. Upconverting particles are optical materials that are able to convert infrared radiation into visible light. Their usefulness has been further boosted by the recent demonstration of their three-dimensional control and tracking by single beam infrared optical traps. In this work it is demonstrated that optical torques are responsible for the stable orientation of the upconverting particle inside the trap. Moreover, numerical calculations and experimental data allowed the rotation dynamics of the optically trapped upconverting particle to be used for environmental sensing. In particular, the cytoplasm viscosity could be measured using the rotation time and thermal fluctuations of an intracellular optically trapped upconverting particle, by means of the two previously mentioned microrheometric techniques.

  17. Is previous disaster experience a good predictor for disaster preparedness in extreme poverty households in remote Muslim minority based community in China?

    Science.gov (United States)

    Chan, Emily Y Y; Kim, Jean H; Lin, Cherry; Cheung, Eliza Y L; Lee, Polly P Y

    2014-06-01

    Disaster preparedness is an important preventive strategy for protecting health and mitigating the adverse health effects of unforeseen disasters. A multi-site ethnic minority project (2009-2015) was set up to examine health and disaster preparedness related issues in remote, rural, disaster prone communities in China. The primary objective of this reported study is to examine whether previous disaster experience significantly increases household disaster preparedness levels in remote villages in China. A cross-sectional household survey was conducted in January 2011 in Gansu Province, in a predominantly Hui minority-based village. Factors related to disaster preparedness were explored using quantitative methods. Two focus groups were also conducted to provide additional contextual explanations for the quantitative findings of this study. The village household response rate was 62.4 % (n = 133). Although previous disaster exposure was significantly associated with the perception of living in a high disaster risk area (OR = 6.16), only 10.7 % of households possessed a disaster emergency kit. Of note, for households with members who had non-communicable diseases, only 9.6 % had prepared extra medications to sustain clinical management of their chronic conditions. This is the first study to examine disaster preparedness in an ethnic minority population in remote communities in rural China. Our results indicate the need for disaster mitigation education to promote preparedness in remote, resource-poor communities.

  18. Experimental evaluation of a quasi-modal parameter based rotor foundation identification technique

    Science.gov (United States)

    Yu, Minli; Liu, Jike; Feng, Ningsheng; Hahn, Eric J.

    2017-12-01

    Correct modelling of the foundation of rotating machinery is an invaluable asset in model-based rotor dynamic study. One attractive approach for this purpose is to identify the relevant modal parameters of an equivalent foundation using motion measurements of the rotor and foundation at the bearing supports. Previous research showed that a complex quasi-modal parameter based system identification technique could be feasible for this purpose; however, the technique was only validated by identifying simple structures under harmonic excitation. In this paper, this identification technique is further extended and evaluated by identifying the foundation of a numerical rotor-bearing-foundation system and of an experimental rotor rig, respectively. In the identification of a rotor foundation with multiple bearing supports, all application points of the excitation forces transmitted through the bearings need to be included; however, the assumed vibration modes far outside the rotor operating speed range cannot be, or need not be, identified. The extended identification technique allows one to correctly identify an equivalent foundation with fewer modes than the assumed number of degrees of freedom, essentially by generalising the technique to handle rectangular complex modal matrices. The extended technique is robust in numerical and experimental validation and is therefore likely to be applicable in the field.

  19. Adaptive differential correspondence imaging based on sorting technique

    Directory of Open Access Journals (Sweden)

    Heng Wu

    2017-04-01

    Full Text Available We develop an adaptive differential correspondence imaging (CI) method using a sorting technique. Different from the conventional CI schemes, the bucket detector signals (BDS) are first processed by a differential technique, and then sorted in a descending (or ascending) order. Subsequently, according to the front and last several frames of the sorted BDS, the positive and negative subsets (PNS) are created by selecting the relative frames from the reference detector signals. Finally, the object image is recovered from the PNS. Besides, an adaptive method based on two-step iteration is designed to select the optimum number of frames. To verify the proposed method, a single-detector computational ghost imaging (GI) setup is constructed. We experimentally and numerically compare the performance of the proposed method with different GI algorithms. The results show that our method can improve the reconstruction quality and reduce the computation cost by using fewer measurement data.
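
    A minimal numerical sketch of the pipeline described above on synthetic computational ghost imaging data: differential processing of the bucket signals, sorting, selection of the front/last frames, and reconstruction from the positive and negative subsets. The object, pattern count and the fixed subset size `m` are illustrative assumptions (the paper's adaptive two-step selection of the frame number is not reproduced).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: random speckle patterns illuminate a hidden object and a
# bucket detector records the total transmitted light per frame.
obj = np.zeros((32, 32)); obj[8:24, 12:20] = 1.0            # hidden object
n_frames = 4000
patterns = rng.random((n_frames, 32, 32))                    # reference detector signals
bucket = patterns.reshape(n_frames, -1) @ obj.ravel()        # bucket detector signals (BDS)

# 1) Differential step: remove the mean from the bucket signals
d = bucket - bucket.mean()

# 2) Sort in descending order and keep the front and last m frames
m = n_frames // 10
order = np.argsort(d)[::-1]
positive, negative = order[:m], order[-m:]

# 3) Reconstruct the object from the positive and negative subsets
image = patterns[positive].mean(axis=0) - patterns[negative].mean(axis=0)
contrast = image[obj == 1].mean() - image[obj == 0].mean()
print("object-to-background contrast:", round(float(contrast), 4))
```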

  20. Wear Detection of Drill Bit by Image-based Technique

    Science.gov (United States)

    Sukeri, Maziyah; Zulhilmi Paiz Ismadi, Mohd; Rahim Othman, Abdul; Kamaruddin, Shahrul

    2018-03-01

    Image processing for computer vision plays an essential role in the manufacturing industries for tool condition monitoring. This study proposes a dependable direct measurement method to measure tool wear using image-based analysis. Segmentation and thresholding techniques were used to filter and convert the colour image to binary data. Then, an edge detection method was applied to characterize the edge of the drill bit. Using the cross-correlation method, the edges of the original and worn drill bits were correlated with each other. The cross-correlation graphs were able to detect the worn edge despite the small difference between the graphs. Future development will focus on quantifying the worn profile as well as enhancing the sensitivity of the technique.
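
    A minimal sketch of the thresholding, edge characterisation and correlation steps described above, run on synthetic drill-bit images. A simple column-wise edge signature stands in for a full edge detector, and the threshold, image sizes and edge shapes are illustrative assumptions.

```python
import numpy as np

def edge_profile(img, threshold=128):
    """Binarise the image and return, for every column, the row index of the first
    'tool' pixel - a simple 1-D signature of the drill-bit edge."""
    binary = img < threshold                       # tool assumed darker than background
    profile = np.argmax(binary, axis=0).astype(float)
    profile[~binary.any(axis=0)] = np.nan          # columns with no tool pixel
    return profile

def profile_similarity(p_ref, p_test):
    """Normalised cross-correlation (zero lag) between two edge profiles."""
    mask = ~np.isnan(p_ref) & ~np.isnan(p_test)
    a = p_ref[mask] - p_ref[mask].mean()
    b = p_test[mask] - p_test[mask].mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Synthetic example: the worn bit has a slightly receded, irregular edge
cols = 200
new_bit = np.full((100, cols), 255.0)
worn_bit = np.full((100, cols), 255.0)
edge_new = 30 + 5 * np.sin(np.linspace(0, np.pi, cols))
edge_worn = edge_new + 4 + 2 * np.random.default_rng(1).random(cols)
for c in range(cols):
    new_bit[int(edge_new[c]):, c] = 0
    worn_bit[int(edge_worn[c]):, c] = 0

print("edge similarity, new vs worn:",
      round(profile_similarity(edge_profile(new_bit), edge_profile(worn_bit)), 4))
```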

  1. Underwater Time Service and Synchronization Based on Time Reversal Technique

    Science.gov (United States)

    Lu, Hao; Wang, Hai-bin; Aissa-El-Bey, Abdeldjalil; Pyndiah, Ramesh

    2010-09-01

    Real-time service and synchronization are very important to many underwater systems, but existing time service and synchronization methods cannot work well due to the multi-path propagation and random phase fluctuation of signals in the ocean channel. The time reversal mirror (TRM) technique can realize energy concentration through self-matching of the ocean channel and has very good spatial and temporal focusing properties. Based on the TRM technique, we present the Time Reversal Mirror Real Time service and synchronization (TRMRT) method, which can bypass multi-path processing on the server side and reduce multi-path contamination on the client side, so TRMRT can improve the accuracy of the time service. Furthermore, as an efficient and precise method of time service, TRMRT could be widely used in underwater exploration activities and in underwater navigation and positioning systems.
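
    A toy numerical illustration of the time-reversal focusing idea behind the method above: a pulse is smeared by an idealised multipath channel, the received signal is time-reversed and sent back through the same (reciprocal) channel, and the energy refocuses into a sharp peak. The tap delays, gains and the peak-to-sidelobe metric are illustrative assumptions.

```python
import numpy as np

# Illustrative multipath ocean channel: six delayed, attenuated arrivals
delays = [40, 95, 170, 230, 310, 370]            # in samples
gains  = [1.0, 0.8, 0.6, 0.5, 0.4, 0.3]
h = np.zeros(400)
for d, g in zip(delays, gains):
    h[d] = g

pulse = np.hanning(21)                            # short timing pulse from the server

# Forward propagation: the client receives a smeared multipath signal
received = np.convolve(pulse, h)

# Time-reversal mirror: reverse the received signal and send it back through the
# (reciprocal) channel; the channel self-matches and the energy refocuses.
refocused = np.convolve(received[::-1], h)

def peak_to_sidelobe(sig, guard=15):
    """Ratio of the main peak to the strongest sample outside the peak region."""
    i = int(np.argmax(np.abs(sig)))
    mask = np.ones(sig.size, dtype=bool)
    mask[max(0, i - guard):i + guard + 1] = False
    return float(np.abs(sig[i]) / np.abs(sig[mask]).max())

print("peak-to-sidelobe, direct reception  :", round(peak_to_sidelobe(received), 2))
print("peak-to-sidelobe, after time reversal:", round(peak_to_sidelobe(refocused), 2))
```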

  2. Endoscopic endonasal double flap technique for reconstruction of large anterior skull base defects: technical note.

    Science.gov (United States)

    Dolci, Ricardo Landini Lutaif; Todeschini, Alexandre Bossi; Santos, Américo Rubens Leite Dos; Lazarini, Paulo Roberto

    2018-04-19

    One of the main concerns in endoscopic endonasal approaches to the skull base has been the high incidence and morbidity associated with cerebrospinal fluid leaks. The introduction and routine use of vascularized flaps allowed a marked decrease in this complication followed by a great expansion in the indications and techniques used in endoscopic endonasal approaches, extending to defects from huge tumours and previously inaccessible areas of the skull base. Describe the technique of performing endoscopic double flap multi-layered reconstruction of the anterior skull base without craniotomy. Step by step description of the endoscopic double flap technique (nasoseptal and pericranial vascularized flaps and fascia lata free graft) as used and illustrated in two patients with an olfactory groove meningioma who underwent an endoscopic approach. Both patients achieved a gross total resection: subsequent reconstruction of the anterior skull base was performed with the nasoseptal and pericranial flaps onlay and a fascia lata free graft inlay. Both patients showed an excellent recovery, no signs of cerebrospinal fluid leak, meningitis, flap necrosis, chronic meningeal or sinonasal inflammation or cerebral herniation having developed. This endoscopic double flap technique we have described is a viable, versatile and safe option for anterior skull base reconstructions, decreasing the incidence of complications in endoscopic endonasal approaches. Copyright © 2018 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  3. Vision based techniques for rotorcraft low altitude flight

    Science.gov (United States)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except at regions close to the FOE. Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.

  4. Combination Base64 Algorithm and EOF Technique for Steganography

    Science.gov (United States)

    Rahim, Robbi; Nurdiyanto, Heri; Hidayat, Rahmat; Saleh Ahmar, Ansari; Siregar, Dodi; Putera Utama Siahaan, Andysah; Faisal, Ilham; Rahman, Sayuti; Suita, Diana; Zamsuri, Ahmad; Abdullah, Dahlan; Napitupulu, Darmawan; Ikhsan Setiawan, Muhammad; Sriadhi, S.

    2018-04-01

    The steganography process combines mathematics and computer science. Steganography consists of a set of methods and techniques for embedding data into another medium so that the contents are unreadable to anyone who does not have the authority to read them. The main objective of using the Base64 method is to encode any file in order to achieve privacy. This paper discusses a steganography and encoding method using Base64, a set of encoding schemes that convert binary data into a series of ASCII codes. The EOF (end of file) technique is then used to embed the text encoded with Base64. As an example, a file is used to represent the text to be hidden; using the two methods together increases the level of security for protecting the data. This research aims to secure many types of files within a cover medium with good security, without damaging the stored files or the cover medium used.
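
    A minimal sketch of the Base64-plus-EOF idea described above: the secret text is Base64-encoded and appended after the cover file's data, where typical viewers ignore it. The file names and the `--STEGO--` separator are hypothetical; a real implementation would append directly after the cover format's EOF marker (for example, 0xFFD9 in a JPEG).

```python
import base64

MARKER = b"--STEGO--"          # hypothetical separator between cover data and payload

def embed(cover_path, out_path, secret_text):
    """Base64-encode the secret and append it after the cover file's data (EOF technique)."""
    payload = base64.b64encode(secret_text.encode("utf-8"))
    with open(cover_path, "rb") as f:
        cover = f.read()
    with open(out_path, "wb") as f:
        f.write(cover + MARKER + payload)   # viewers ignore bytes past the logical EOF

def extract(stego_path):
    """Recover and decode the hidden text from a stego file."""
    with open(stego_path, "rb") as f:
        data = f.read()
    _, _, payload = data.partition(MARKER)
    return base64.b64decode(payload).decode("utf-8")

# Example usage with a small dummy cover file
with open("cover.bin", "wb") as f:
    f.write(b"\x89PNG...dummy cover bytes...")
embed("cover.bin", "stego.bin", "meet at 10:00")
print(extract("stego.bin"))                 # -> meet at 10:00
```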

  5. Full-duplex MIMO system based on antenna cancellation technique

    DEFF Research Database (Denmark)

    Foroozanfard, Ehsan; Franek, Ondrej; Tatomirescu, Alexandru

    2014-01-01

    The performance of an antenna cancellation technique for a multiple-input–multiple-output (MIMO) full-duplex system that is based on null-steering beamforming and antenna polarization diversity is investigated. A practical implementation of a symmetric antenna topology comprising three dual-polarized patch antennas operating at 2.4 GHz is described. The measurement results show an average of 60 dB self-interference cancellation over 200 MHz bandwidth. Moreover, a decoupling level of up to 22 dB is achieved for MIMO multiplexing using antenna polarization diversity. The performance evaluation

  6. Cooperative Technique Based on Sensor Selection in Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    ISLAM, M. R.

    2009-02-01

    Full Text Available An energy-efficient cooperative technique is proposed for IEEE 1451 based Wireless Sensor Networks. A selected number of Wireless Transducer Interface Modules (WTIMs) are used to form a Multiple Input Single Output (MISO) structure wirelessly connected with a Network Capable Application Processor (NCAP). The energy efficiency and delay of the proposed architecture are derived for different combinations of cluster size and selected number of WTIMs. Optimized constellation parameters are used for evaluating the derived parameters. The results show that the selected MISO structure outperforms the unselected MISO structure and exhibits more energy-efficient performance than the SISO structure beyond a certain distance.

  7. A technique for measuring oxygen saturation in biological tissues based on diffuse optical spectroscopy

    Science.gov (United States)

    Kleshnin, Mikhail; Orlova, Anna; Kirillin, Mikhail; Golubiatnikov, German; Turchin, Ilya

    2017-07-01

    A new approach to the optical measurement of blood oxygen saturation was developed and implemented. The technique is based on an original three-stage algorithm for reconstructing the relative concentrations of biological chromophores (hemoglobin, water, lipids) from the measured spectra of diffusely scattered light at different distances from the probing radiation source. Numerical experiments and testing of the proposed technique on a biological phantom have shown high reconstruction accuracy and the possibility of correctly calculating hemoglobin oxygenation in the presence of additive noise and calibration errors. The results obtained in animal studies agree with previously published results of other research groups and demonstrate the possibility of applying the developed technique to monitor oxygen saturation in tumor tissue.
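
    A simplified sketch of the final stage implied above: once wavelength-dependent absorption has been recovered, the chromophore concentrations can be unmixed by least squares and oxygen saturation computed as the HbO2 fraction. Only two chromophores are used here, and the extinction-coefficient values are illustrative placeholders rather than tabulated spectra.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (NOT tabulated) extinction coefficients at four wavelengths,
# rows = wavelengths, columns = [HbO2, Hb].
wavelengths = np.array([690.0, 750.0, 800.0, 850.0])        # nm
eps = np.array([[0.28, 2.05],
                [0.52, 1.40],
                [0.80, 0.80],
                [1.06, 0.69]])

# Simulated absorption coefficients "recovered" by the diffuse-optics reconstruction
c_true = np.array([55.0, 25.0])                              # HbO2, Hb (arbitrary units)
mu_a = eps @ c_true + rng.normal(scale=0.5, size=4)          # noisy measurement

# Least-squares unmixing of the two chromophores, then saturation as a fraction
c_est, *_ = np.linalg.lstsq(eps, mu_a, rcond=None)
sto2 = c_est[0] / c_est.sum()
print("estimated StO2: %.1f %%  (true: %.1f %%)"
      % (100 * sto2, 100 * c_true[0] / c_true.sum()))
```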

  8. Nitrous oxide-based techniques versus nitrous oxide-free techniques for general anaesthesia.

    Science.gov (United States)

    Sun, Rao; Jia, Wen Qin; Zhang, Peng; Yang, KeHu; Tian, Jin Hui; Ma, Bin; Liu, Yali; Jia, Run H; Luo, Xiao F; Kuriyama, Akira

    2015-11-06

    Nitrous oxide has been used for over 160 years for the induction and maintenance of general anaesthesia. It has been used as a sole agent but is most often employed as part of a technique using other anaesthetic gases, intravenous agents, or both. Its low tissue solubility (and therefore rapid kinetics), low cost, and low rate of cardiorespiratory complications have made nitrous oxide by far the most commonly used general anaesthetic. The accumulating evidence regarding adverse effects of nitrous oxide administration has led many anaesthetists to question its continued routine use in a variety of operating room settings. Adverse events may result from both the biological actions of nitrous oxide and the fact that to deliver an effective dose, nitrous oxide, which is a relatively weak anaesthetic agent, needs to be given in high concentrations that restrict oxygen delivery (for example, a common mixture is 30% oxygen with 70% nitrous oxide). As well as the risk of low blood oxygen levels, concerns have also been raised regarding the risk of compromising the immune system, impaired cognition, postoperative cardiovascular complications, bowel obstruction from distention, and possible respiratory compromise. To determine if nitrous oxide-based anaesthesia results in similar outcomes to nitrous oxide-free anaesthesia in adults undergoing surgery. We searched the Cochrane Central Register of Controlled Trials (CENTRAL; 2014 Issue 10); MEDLINE (1966 to 17 October 2014); EMBASE (1974 to 17 October 2014); and ISI Web of Science (1974 to 17 October 2014). We also searched the reference lists of relevant articles, conference proceedings, and ongoing trials up to 17 October 2014 on specific websites (http://clinicaltrials.gov/, http://controlled-trials.com/, and http://www.centerwatch.com). We included randomized controlled trials (RCTs) comparing general anaesthesia where nitrous oxide was part of the anaesthetic technique used for the induction or maintenance of general

  9. Failure Mechanism of Rock Bridge Based on Acoustic Emission Technique

    Directory of Open Access Journals (Sweden)

    Guoqing Chen

    2015-01-01

    Full Text Available The acoustic emission (AE) technique is widely used in various fields as a reliable nondestructive examination technology. Two experimental tests were carried out in a rock mechanics laboratory: (1) small-scale direct shear tests of rock bridges with different lengths and (2) a large-scale landslide model with a locked section. The relationship between AE event count and record time was analyzed during the tests. AE source location was performed and compared with the actual failure mode. It was found that, in both the small-scale tests and the large-scale landslide model test, the AE technique accurately located the AE source points, reflecting the generation and expansion of internal cracks in the rock samples. The large-scale landslide model test with a locked section showed that the rock bridge in a rocky slope exhibits typical brittle failure behavior. The two tests based on the AE technique clearly revealed the rock failure mechanism in rocky slopes and clarified the cause of high-speed, long-distance sliding of rocky slopes.

  10. Radiation synthesized protein-based nanoparticles: A technique overview

    International Nuclear Information System (INIS)

    Varca, Gustavo H.C.; Perossi, Gabriela G.; Grasselli, Mariano; Lugão, Ademar B.

    2014-01-01

    Seeking alternative routes for protein engineering, a novel technique – radiation-induced synthesis of protein nanoparticles – that achieves size-controlled particles with preserved bioactivity has recently been reported. This work aimed to evaluate different process conditions to optimize and provide an overview of the technique using γ-irradiation. Papain was used as a model protease and the samples were irradiated in a gamma cell irradiator in phosphate buffer (pH=7.0) containing ethanol (0–35%). The dose effect was evaluated by exposure to distinct γ-irradiation doses (2.5, 5, 7.5 and 10 kGy), and scale-up experiments involving distinct protein concentrations (12.5–50 mg mL−1) were also performed. Characterization involved size monitoring using dynamic light scattering. Bityrosine detection was performed using fluorescence measurements in order to provide experimental evidence of the mechanism involved. The best dose effects with regard to size were achieved at 10 kGy, and no relevant changes were observed as a function of papain concentration, highlighting a very broad operational concentration range. Bityrosine changes were identified in the samples as a function of the process, confirming that such linkages play an important role in nanoparticle formation. - Highlights: • Synthesis of protein-based nanoparticles by γ-irradiation. • Optimization of the technique. • Overview of the mechanism involved in nanoparticle formation. • Engineered papain nanoparticles for biomedical applications

  11. [Selective biopsy of the sentinel lymph node in patients with breast cancer and previous excisional biopsy: is there a change in the reliability of the technique according to time from surgery?].

    Science.gov (United States)

    Sabaté-Llobera, A; Notta, P C; Benítez-Segura, A; López-Ojeda, A; Pernas-Simon, S; Boya-Román, M P; Bajén, M T

    2015-01-01

    To assess the influence of time on the reliability of sentinel lymph node biopsy (SLNB) in breast cancer patients with previous excisional biopsy (EB), analyzing both the sentinel lymph node detection and the lymph node recurrence rate. Thirty-six patients with cT1/T2 N0 breast cancer and previous EB of the lesion underwent lymphoscintigraphy after subdermal periareolar administration of radiocolloid the day before SLNB. Patients were classified into two groups: one including 12 patients with up to 29 days elapsed between EB and SLNB (group A), and another with the remaining 24, in which the time between both procedures was 30 days or more (group B). Scintigraphic and surgical detection of the sentinel lymph node, histological status of the sentinel lymph node and of the axillary lymph node dissection, if performed, and lymphatic recurrences during follow-up were analyzed. Sentinel lymph node visualization at lymphoscintigraphy and surgical detection were 100% in both groups. Histologically, three patients showed macrometastasis in the sentinel lymph node, one from group A and two from group B. None of the patients, not even those with malignancy of the sentinel lymph node, relapsed after a median follow-up of 49.5 months (24-75). The time elapsed between EB and SLNB does not influence the reliability of the latter technique as long as a superficial injection of the radiopharmaceutical is performed, yielding a very high detection rate of the sentinel lymph node without evidence of lymphatic relapse during follow-up. Copyright © 2014 Elsevier España, S.L.U. and SEMNIM. All rights reserved.

  12. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim

    2015-05-21

    Base oils, blended for finished lubricant formulations, are classified by the American Petroleum Institute into five groups, viz., groups I-V. Groups I-III consist of petroleum based hydrocarbons whereas groups IV and V are made of synthetic polymers. In the present study, five base oil samples belonging to groups I and III were extensively characterized using high performance liquid chromatography (HPLC), comprehensive two-dimensional gas chromatography (GC×GC), and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated, and then the information obtained was combined to reveal compositional details of the base oil samples studied. HPLC showed the overwhelming presence of saturated over aromatic compounds in all five base oils. A similar trend was further corroborated using GC×GC, which yielded semiquantitative information on the compound classes present in the samples and provided further details on the carbon number distributions within these classes. In addition to the chromatography methods, FT-ICR MS supplemented the compositional information on the base oil samples by resolving the aromatic compounds into alkyl- and naphtheno-substituted families. APCI proved more effective than APPI for the ionization of the highly saturated base oil components. In addition to detailed information on the hydrocarbon molecules, FT-ICR MS revealed the presence of saturated and aromatic sulfur species in all base oil samples. The results presented herein offer a unique perspective into the detailed molecular structure of base oils typically used to formulate lubricants. © 2015 American Chemical Society.

  13. On HTML and XML based web design and implementation techniques

    International Nuclear Information System (INIS)

    Bezboruah, B.; Kalita, M.

    2006-05-01

    Web implementation is truly a multidisciplinary field with influences from programming, the choice of scripting languages, graphic design, user interface design, and database design. The challenge for a Web designer/implementer is to create an attractive and informative Web site. To work with the universal framework and link diagrams from the design process, as well as the Web specifications and domain information, it is essential to create Hypertext Markup Language (HTML) or other software and multimedia to accomplish the Web's objective. In this article we discuss Web design standards and the techniques involved in Web implementation based on HTML and Extensible Markup Language (XML). We also discuss the advantages and disadvantages of HTML compared with its successor XML for designing and implementing a Web site. To carry out the present investigation, we have developed two Web pages, one utilizing the features of HTML and the other based on the features of XML. (author)

  14. Efficient Identification Using a Prime-Feature-Based Technique

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Haq, Shaiq A.; Valente, Andrea

    2011-01-01

    . Fingerprint identification system, implemented on PC/104 based real-time systems, can accurately identify the operator. Traditionally, the uniqueness of a fingerprint is determined by the overall pattern of ridges and valleys as well as the local ridge anomalies e.g., a ridge bifurcation or a ridge ending......, which are called minutiae points. Designing a reliable automatic fingerprint matching algorithm for minimal platform is quite challenging. In real-time systems, efficiency of the matching algorithm is of utmost importance. To achieve this goal, a prime-feature-based indexing algorithm is proposed......Identification of authorized train drivers through biometrics is a growing area of interest in locomotive radio remote control systems. The existing technique of password authentication is not very reliable and potentially unauthorized personnel may also operate the system on behalf of the operator...

  15. Designing on ICT reconstruction software based on DSP techniques

    International Nuclear Information System (INIS)

    Liu Jinhui; Xiang Xincheng

    2006-01-01

    The convolution back projection (CBP) algorithm is generally used to perform CT image reconstruction in ICT, and this is usually done on a PC or workstation. In order to give CT reconstruction software multi-platform operation capability, a CT reconstruction method based on modern digital signal processor (DSP) techniques is proposed and realized in this paper. A hardware system based on TI's C6701 DSP processor is selected to support the CT software construction. The CT reconstruction software is compiled using only assembly language related to the DSP hardware. The CT software can be run on TI's C6701 EVM board with the CT data as input, and produces CT images that satisfy the practical demands. (authors)

  16. Risk-based maintenance-Techniques and applications

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2007-01-01

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to make use of the knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more rigorously, in order to achieve tolerable risk criteria. The risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce the probability of equipment failure and the consequences of failure. In this paper, the risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions

  17. Generation of Quasi-Gaussian Pulses Based on Correlation Techniques

    Directory of Open Access Journals (Sweden)

    POHOATA, S.

    2012-02-01

    Full Text Available Gaussian pulses have been used mostly in communications, where several applications can be highlighted: mobile telephony (GSM), where GMSK signals are used, and UWB communications, where short-period pulses based on the Gaussian waveform are generated. Since the Gaussian function is a theoretical concept that cannot be realized physically, it has to be approximated by functions that allow a physical implementation. New techniques for generating Gaussian pulse responses with good precision are proposed and investigated in this paper. The second and third order derivatives of the Gaussian pulse response are accurately generated. The third order derivative is composed of four individual rectangular pulses of fixed amplitudes, which are easy to generate by standard techniques. In order to generate pulses able to satisfy the spectral mask requirements, an adequate filter needs to be applied. The paper presents a comparative analysis based on the relative error and the energy spectra of the proposed pulses.
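
    A minimal sketch of the waveforms discussed above: the Gaussian pulse and its second and third order derivatives evaluated analytically. Amplitude normalisation, the four-rectangle approximation and the spectral-mask filtering are not included; the pulse width and grid are illustrative assumptions.

```python
import numpy as np

def gaussian_derivatives(sigma=1.0, n_points=801, span=5.0):
    """Return a Gaussian pulse and its 2nd and 3rd order derivatives on a
    symmetric time grid (analytic expressions for g(t) = exp(-t^2/(2 sigma^2)))."""
    t = np.linspace(-span * sigma, span * sigma, n_points)
    g = np.exp(-t**2 / (2 * sigma**2))
    d2 = (t**2 / sigma**4 - 1 / sigma**2) * g                 # second derivative
    d3 = (3 * t / sigma**4 - t**3 / sigma**6) * g             # third derivative
    return t, g, d2, d3

# For UWB use the time axis would be scaled to sub-nanoseconds; sigma = 1 here
# keeps the printed amplitudes readable.  The third derivative has four lobes
# (three zero crossings), which is why it can be approximated by four
# rectangular pulses of fixed amplitudes.
t, g, d2, d3 = gaussian_derivatives(sigma=1.0)
print("peak |g''| = %.3f, peak |g'''| = %.3f" % (np.max(np.abs(d2)), np.max(np.abs(d3))))
```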

  18. Refractive index sensor based on optical fiber end face using pulse reference-based compensation technique

    Science.gov (United States)

    Bian, Qiang; Song, Zhangqi; Zhang, Xueliang; Yu, Yang; Chen, Yuzhong

    2018-03-01

    We propose a refractive index sensor based on the optical fiber end face using a pulse reference-based compensation technique. With the good compensation effect of this technique, fluctuations in the light source power, changes in the transmission loss of the optical components and changes in the coupler splitting ratio can be compensated, which largely reduces the background noise. The refractive index resolution reaches 3.8 × 10⁻⁶ RIU and 1.6 × 10⁻⁶ RIU in different refractive index regions.

  19. Enhancing the effectiveness of IST through risk-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower cost. That is, some high-safety-impact pumps and valves are currently not tested under the IST program and should be added, while low-safety-impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low-safety-impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  20. Efficient techniques for wave-based sound propagation in interactive applications

    Science.gov (United States)

    Mehra, Ravish

    -driven, rotating or time-varying directivity function at runtime. Unlike previous approaches, the listener directivity approach can be used to compute spatial audio (3D audio) for a moving, rotating listener at interactive rates. Lastly, we propose an efficient GPU-based time-domain solver for the wave equation that enables wave simulation up to the mid-frequency range in tens of minutes on a desktop computer. It is demonstrated that by carefully mapping all the components of the wave simulator to match the parallel processing capabilities of the graphics processors, significant improvement in performance can be achieved compared to the CPU-based simulators, while maintaining numerical accuracy. We validate these techniques with offline numerical simulations and measured data recorded in an outdoor scene. We present results of preliminary user evaluations conducted to study the impact of these techniques on user's immersion in virtual environment. We have integrated these techniques with the Half-Life 2 game engine, Oculus Rift head-mounted display, and Xbox game controller to enable users to experience high-quality acoustics effects and spatial audio in the virtual environment.

  1. A novel pH-responsive hydrogel-based on calcium alginate engineered by the previous formation of polyelectrolyte complexes (PECs) intended to vaginal administration.

    Science.gov (United States)

    Ferreira, Natália Noronha; Perez, Taciane Alvarenga; Pedreiro, Liliane Neves; Prezotti, Fabíola Garavello; Boni, Fernanda Isadora; Cardoso, Valéria Maria de Oliveira; Venâncio, Tiago; Gremião, Maria Palmira Daflon

    2017-10-01

    This work aimed to develop a calcium alginate hydrogel as a pH-responsive system for the sustained release of polymyxin B (PMX) by the vaginal route. Two samples of sodium alginate from different suppliers were characterized. The molecular weight and M/G ratio determined were approximately 107 kDa and 1.93 for alginate_S and 32 kDa and 1.36 for alginate_V. Rheological investigations of the polymers were then performed through the preparation of hydrogels. Alginate_V was selected for the subsequent incorporation of PMX because it yielded a pseudoplastic viscous system able to acquire a differentiated structure in the simulated vaginal microenvironment (pH 4.5). The PMX-loaded hydrogel (hydrogel_PMX) was engineered based on the formation of polyelectrolyte complexes (PECs) between alginate and PMX, followed by crosslinking with calcium chloride. This system exhibited a morphology with variable pore sizes, ranging from 100 to 200 μm, and adequate syringeability. The liquid uptake ability of the hydrogel in an acid environment was minimized by the previous formation of PECs. In vitro tests evidenced the mucoadhesiveness of the hydrogels. PMX release was pH-dependent and the system was able to sustain the release for up to 6 days. A burst release was observed at pH 7.4, where drug release was driven by anomalous transport, as determined by the Korsmeyer-Peppas model. At pH 4.5, drug release correlated with the Weibull model and drug transport was driven by Fickian diffusion. Calcium alginate hydrogels engineered by the previous formation of PECs therefore appear to be a promising platform for the sustained release of cationic drugs by vaginal administration.

  2. Trial of labour and vaginal birth after previous caesarean section: A population based study of Eastern African immigrants in Victoria, Australia.

    Science.gov (United States)

    Belihu, Fetene B; Small, Rhonda; Davey, Mary-Ann

    2017-03-01

    Variations in caesarean section (CS) between some immigrant groups and receiving country populations have been widely reported. Often, African immigrant women are at higher risk of CS than the receiving population in developed countries. However, evidence about subsequent mode of birth following CS for African women post-migration is lacking. The objective of this study was to examine differences in attempted and successful vaginal birth after previous caesarean (VBAC) for Eastern African immigrants (Eritrea, Ethiopia, Somalia and Sudan) compared with Australian-born women. A population-based observational study was conducted using the Victorian Perinatal Data Collection. Pearson's chi-square test and logistic regression analysis were performed to generate adjusted odds ratios for attempted and successful VBAC. Victoria, Australia. 554 Eastern African immigrants and 24,587 Australian-born eligible women with previous CS having singleton births in public care. 41.5% of Eastern African immigrant women and 26.1% Australian-born women attempted a VBAC with 50.9% of Eastern African immigrants and 60.5% of Australian-born women being successful. After adjusting for maternal demographic characteristics and available clinical confounding factors, Eastern African immigrants were more likely to attempt (OR adj 1.94, 95% CI 1.57-2.47) but less likely to succeed (OR adj 0.54 95% CI 0.41-0.71) in having a VBAC. There are disparities in attempted and successful VBAC between Eastern African origin and Australian-born women. Unsuccessful VBAC attempt is more common among Eastern African immigrants, suggesting the need for improved strategies to select and support potential candidates for vaginal birth among these immigrants to enhance success and reduce potential complications associated with failed VBAC attempt. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  3. Detecting Molecular Properties by Various Laser-Based Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hsin, Tse-Ming [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of the dimer geometry. Direct observation of vibrational peaks and evolution of coumarin 153 in the electronic excited state was demonstrated by using fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibration peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with biomimetic containers (liposomes), allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interference from glass surfaces. The result showed a wide distribution of the enzyme reactivity. Protein structural variation is one of the major reasons responsible for this highly heterogeneous behavior.

  4. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
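
    A simplified numerical sketch of the cumulative-residual idea above: residuals from a (deliberately misspecified) linear fit are cumulated over a covariate, and the observed supremum is compared with realizations obtained by perturbing the residuals with standard normal multipliers. This omits the correction for parameter estimation used in the full method, so it is an illustration of the principle rather than the published procedure; all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated data: the true relation is quadratic, but a linear model is fitted,
# so the cumulative-residual process should show a systematic bow.
n = 300
x = rng.uniform(-2, 2, n)
y = 1.0 + 0.5 * x + 0.8 * x**2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x])                   # misspecified linear fit
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

order = np.argsort(x)
observed = np.cumsum(resid[order]) / np.sqrt(n)        # cumulative residual process over x

# Null realizations: perturb the residuals with independent N(0,1) multipliers
# (a stripped-down version of the Gaussian-multiplier resampling).
sup_null = np.array([
    np.abs(np.cumsum(resid[order] * rng.normal(size=n)) / np.sqrt(n)).max()
    for _ in range(1000)
])
p_value = float(np.mean(sup_null >= np.abs(observed).max()))
print("sup|W| = %.3f, approximate p-value = %.3f" % (np.abs(observed).max(), p_value))
```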

  5. Determination of rock fragmentation based on a photographic technique

    International Nuclear Information System (INIS)

    Dehgan Banadaki, M.M.; Majdi, A.; Raessi Gahrooei, D.

    2002-01-01

    The paper presents a physical blasting model at laboratory scale along with a photographic approach to describe the size distribution of blasted rock materials. For this purpose, eight samples, each weighing 100 kg, were obtained based on the Weibull probability distribution function. Four pictures of four different sections of each sample were taken. The pictures were then converted into graphic files in which the boundary of each rock piece in the samples was characterized. Errors caused by perspective were eliminated. The volume of each piece of blasted rock material, and hence the sieve size each piece would be required to pass through, was calculated. Finally, the original blasted rock size distribution was compared with that obtained from the photographic method. The paper concludes by presenting an approach to convert the results of the photographic technique into the size distribution obtained by sieve analysis, with sufficient verification

  6. Whitelists Based Multiple Filtering Techniques in SCADA Sensor Networks

    Directory of Open Access Journals (Sweden)

    DongHo Kang

    2014-01-01

    Full Text Available The Internet of Things (IoT) consists of many tiny devices connected together to form a collaborative computing environment. Recently, IoT technologies have begun to merge with supervisory control and data acquisition (SCADA) sensor networks to gather and analyze real-time data from sensors in industrial environments more efficiently. However, SCADA sensor networks are becoming increasingly vulnerable to cyber-attacks due to increased connectivity. To safely adopt IoT technologies in SCADA environments, it is important to improve the security of SCADA sensor networks. In this paper we propose a multiple filtering technique based on whitelists to detect illegitimate packets. Our proposed system detects network and application-protocol attack traffic with a set of whitelists collected from normal traffic.
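    As a rough illustration of whitelist-based multiple filtering (not the paper's implementation), the sketch below checks a packet first against a network-level whitelist and then against an application-level whitelist; the packet fields, addresses and function codes are hypothetical.

```python
# Illustrative whitelist-based packet filter (not the paper's implementation).
# A packet passes only if its network-level tuple AND its application-level
# command both appear in whitelists learned from normal traffic.
from dataclasses import dataclass

@dataclass(frozen=True)
class Packet:
    src: str
    dst: str
    port: int
    func_code: int  # hypothetical application-protocol command field

NETWORK_WHITELIST = {
    ("10.0.0.5", "10.0.0.10", 502),   # e.g. an HMI talking Modbus/TCP to a PLC
    ("10.0.0.6", "10.0.0.10", 502),
}
APP_WHITELIST = {3, 4, 6}             # command codes observed in normal traffic

def is_legitimate(pkt: Packet) -> bool:
    """Multiple filtering: network whitelist first, then application whitelist."""
    if (pkt.src, pkt.dst, pkt.port) not in NETWORK_WHITELIST:
        return False
    return pkt.func_code in APP_WHITELIST

print(is_legitimate(Packet("10.0.0.5", "10.0.0.10", 502, 3)))   # True
print(is_legitimate(Packet("10.0.0.99", "10.0.0.10", 502, 3)))  # False: unknown source
print(is_legitimate(Packet("10.0.0.5", "10.0.0.10", 502, 16)))  # False: unlisted command
```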

  7. Demand Management Based on Model Predictive Control Techniques

    Directory of Open Access Journals (Sweden)

    Yasser A. Davizón

    2014-01-01

    Full Text Available Demand management (DM) is the process that helps companies to sell the right product to the right customer, at the right time, and for the right price. Therefore, the challenge for any company is to determine how much to sell, at what price, and to which market segment, while maximizing its profits. DM also helps managers efficiently allocate undifferentiated units of capacity to the available demand with the goal of maximizing revenue. This paper introduces a control system approach to demand management with dynamic pricing (DP) using the model predictive control (MPC) technique. In addition, we present a proper dynamical-system analogy based on active suspension, and a stability analysis is provided via the Lyapunov direct method.
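    To make the receding-horizon idea concrete, here is a toy MPC-style pricing loop in Python. The linear demand dynamics, cost weights and bounds are invented for illustration and bear no relation to the paper's active-suspension analogy or stability analysis.

```python
# Toy receding-horizon (MPC-style) dynamic pricing sketch; the demand model,
# parameters and cost weights below are hypothetical, not those of the paper.
import numpy as np
from scipy.optimize import minimize

a, b = 0.6, 0.8          # demand inertia and price sensitivity (assumed)
d_ref, N = 50.0, 5       # demand target and prediction horizon

def predict(d0, prices):
    """Linear demand dynamics: d[k+1] = a*d[k] - b*p[k] + 30 (illustrative)."""
    d, path = d0, []
    for p in prices:
        d = a * d - b * p + 30.0
        path.append(d)
    return np.array(path)

def cost(prices, d0):
    d = predict(d0, prices)
    return np.sum((d - d_ref) ** 2) + 0.01 * np.sum(np.diff(prices, prepend=prices[0]) ** 2)

d_now = 70.0
for step in range(3):                      # receding-horizon loop
    res = minimize(cost, x0=np.full(N, 10.0), args=(d_now,),
                   bounds=[(0.0, 40.0)] * N)
    p_apply = res.x[0]                     # apply only the first move
    d_now = a * d_now - b * p_apply + 30.0
    print(f"step {step}: price = {p_apply:.2f}, demand = {d_now:.2f}")
```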

  8. Clustering economies based on multiple criteria decision making techniques

    Directory of Open Access Journals (Sweden)

    Mansour Momeni

    2011-10-01

    Full Text Available One of the primary concerns in many countries is to determine the important factors affecting economic growth. In this paper, we study factors such as unemployment rate, inflation rate, population growth and average annual income to cluster different countries. The proposed model uses the analytical hierarchy process (AHP) to prioritize the criteria and then uses the K-means technique to cluster 59 countries into four groups based on the ranked criteria. The first group includes countries with high standards such as Germany and Japan. In the second cluster, there are some developing countries with relatively good economic growth such as Saudi Arabia and Iran. The third cluster belongs to countries with faster rates of growth compared with the countries located in the second group, such as China, India and Mexico. Finally, the fourth cluster includes countries with relatively very low rates of growth such as Jordan, Mali and Niger.
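    A minimal sketch of the clustering step is given below: criteria are standardized, stretched by assumed AHP-derived weights, and clustered into four groups with K-means. The indicator values and weights are placeholders, not the study's data.

```python
# Sketch of criteria-weighted K-means clustering of economies; the indicator
# values and AHP-derived weights below are made-up placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

countries = ["A", "B", "C", "D", "E", "F", "G", "H"]
# Columns: unemployment rate, inflation rate, population growth, avg. annual income (k$)
X = np.array([
    [4.0, 1.5, 0.2, 45.0],
    [5.5, 2.0, 0.1, 42.0],
    [11.0, 15.0, 2.1, 8.0],
    [9.0, 12.0, 1.8, 9.5],
    [6.0, 4.0, 0.9, 18.0],
    [7.0, 5.0, 1.1, 16.0],
    [14.0, 20.0, 3.0, 2.5],
    [13.0, 18.0, 2.8, 3.0],
])
ahp_weights = np.array([0.30, 0.25, 0.15, 0.30])   # assumed AHP priorities, sum to 1

# Standardize, then stretch each criterion by its AHP weight before clustering.
Xw = StandardScaler().fit_transform(X) * np.sqrt(ahp_weights)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Xw)

for country, label in zip(countries, labels):
    print(country, "-> cluster", label)
```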

  9. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    Science.gov (United States)

    Parkash, Om; Hanim Shueb, Rafidah

    2015-01-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  10. An RSS based location estimation technique for cognitive relay networks

    KAUST Repository

    Qaraqe, Khalid A.

    2010-11-01

    In this paper, a received signal strength (RSS) based location estimation method is proposed for a cooperative wireless relay network where the relay is a cognitive radio. We propose a method for the considered cognitive relay network to determine the location of the source using the direct and the relayed signal at the destination. We derive the Cramer-Rao lower bound (CRLB) expressions separately for x and y coordinates of the location estimate. We analyze the effects of cognitive behaviour of the relay on the performance of the proposed method. We also discuss and quantify the reliability of the location estimate using the proposed technique if the source is not stationary. The overall performance of the proposed method is presented through simulations. ©2010 IEEE.
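    For orientation only, the following sketch estimates a transmitter position from RSS measurements with a log-distance path-loss model and nonlinear least squares. The path-loss parameters, anchor positions and noise level are invented, and the snippet does not reproduce the paper's cognitive-relay setup or its CRLB analysis.

```python
# Illustrative RSS-based source localization using a log-distance path-loss
# model and nonlinear least squares; generic, not the paper's estimator.
import numpy as np
from scipy.optimize import least_squares

P0, n_exp = -40.0, 2.7          # RSS at 1 m and path-loss exponent (assumed)
anchors = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0]])  # receiver positions (assumed)
true_src = np.array([22.0, 31.0])

rng = np.random.default_rng(1)
d_true = np.linalg.norm(anchors - true_src, axis=1)
rss = P0 - 10.0 * n_exp * np.log10(d_true) + rng.normal(0, 1.0, len(anchors))  # shadowing noise

def residuals(xy):
    d = np.linalg.norm(anchors - xy, axis=1)
    return rss - (P0 - 10.0 * n_exp * np.log10(d))

est = least_squares(residuals, x0=np.array([25.0, 25.0])).x
print("true source:", true_src, " estimate:", np.round(est, 2))
```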

  11. Fludarabine-based versus CHOP-like regimens with or without rituximab in patients with previously untreated indolent lymphoma: a retrospective analysis of safety and efficacy

    Directory of Open Access Journals (Sweden)

    Xu XX

    2013-10-01

    Full Text Available Xiao-xiao Xu,1 Bei Yan,2 Zhen-xing Wang,3 Yong Yu,1 Xiao-xiong Wu,2 Yi-zhuo Zhang11Department of Hematology, Tianjin Medical University Cancer Institute and Hospital, Tianjin Key Laboratory of Cancer Prevention and Therapy, Tianjin, 2Department of Hematology, First Affiliated Hospital of Chinese People's Liberation Army General Hospital, Beijing, 3Department of Stomach Oncology, TianJin Medical University Cancer Institute and Hospital, Key Laboratory of Cancer Prevention and Therapy, Tianjin, People's Republic of ChinaAbstract: Fludarabine-based regimens and CHOP (doxorubicin, cyclophosphamide, vincristine, prednisone-like regimens with or without rituximab are the most common treatment modalities for indolent lymphoma. However, there is no clear evidence to date about which chemotherapy regimen should be the proper initial treatment of indolent lymphoma. More recently, the use of fludarabine has raised concerns due to its high number of toxicities, especially hematological toxicity and infectious complications. The present study aimed to retrospectively evaluate both the efficacy and the potential toxicities of the two main regimens (fludarabine-based and CHOP-like regimens in patients with previously untreated indolent lymphoma. Among a total of 107 patients assessed, 54 patients received fludarabine-based regimens (FLU arm and 53 received CHOP or CHOPE (doxorubicin, cyclophosphamide, vincristine, prednisone, or plus etoposide regimens (CHOP arm. The results demonstrated that fludarabine-based regimens could induce significantly improved progression-free survival (PFS compared with CHOP-like regimens. However, the FLU arm showed overall survival, complete response, and overall response rates similar to those of the CHOP arm. Grade 3–4 neutropenia occurred in 42.6% of the FLU arm and 7.5% of the CHOP arm (P 60 years and presentation of grade 3–4 myelosuppression were the independent factors to infection, and the FLU arm had significantly

  12. Astronomical Image Compression Techniques Based on ACC and KLT Coder

    Directory of Open Access Journals (Sweden)

    J. Schindler

    2011-01-01

    Full Text Available This paper deals with the compression of image data in astronomy applications. Astronomical images have typical specific properties: high grayscale bit depth, large size, noise occurrence and special processing algorithms. They belong to the class of scientific images. Their processing and compression is quite different from the classical approach of multimedia image processing. The database of images from BOOTES (Burst Observer and Optical Transient Exploring System) has been chosen as a source of the test signal. BOOTES is a Czech-Spanish robotic telescope for observing AGN (active galactic nuclei) and for searching for the optical transients of GRBs (gamma-ray bursts). This paper discusses an approach based on an analysis of the statistical properties of image data. A comparison of two irrelevancy reduction methods is presented from a scientific (astrometric and photometric) point of view. The first method is based on a statistical approach, using the Karhunen-Loeve transform (KLT) with uniform quantization in the spectral domain. The second technique is derived from wavelet decomposition with adaptive selection of the prediction coefficients used. Finally, a comparison of three redundancy reduction methods is discussed. The multimedia format JPEG2000 and HCOMPRESS, which was designed especially for astronomical images, are compared with the new Astronomical Context Coder (ACC), which is based on adaptive median regression.
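    To illustrate the KLT-based irrelevancy reduction in isolation, the sketch below applies a block-wise Karhunen-Loeve transform to a synthetic image and keeps only the leading coefficients. Real coders, including the one described here, add quantization and entropy coding on top; the image, block size and number of retained components are arbitrary.

```python
# Minimal block-wise KLT (PCA) compression sketch on a synthetic image;
# real astronomical pipelines add quantization and entropy coding on top.
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(100, 5, (128, 128)) + np.outer(np.hanning(128), np.hanning(128)) * 400
B, K = 8, 12                                   # block size, retained components

# Collect 8x8 blocks as row vectors.
blocks = (img.reshape(128 // B, B, 128 // B, B)
             .swapaxes(1, 2).reshape(-1, B * B))
mean = blocks.mean(axis=0)
cov = np.cov(blocks - mean, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
basis = eigvec[:, ::-1][:, :K]                 # top-K eigenvectors (the KLT basis)

coeffs = (blocks - mean) @ basis               # forward transform (compressed data)
recon_blocks = coeffs @ basis.T + mean         # inverse transform
recon = (recon_blocks.reshape(128 // B, 128 // B, B, B)
                     .swapaxes(1, 2).reshape(128, 128))

rmse = np.sqrt(np.mean((img - recon) ** 2))
print(f"kept {K}/{B*B} coefficients per block, RMSE = {rmse:.3f}")
```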

  13. An investigation of a video-based patient repositioning technique

    International Nuclear Information System (INIS)

    Yan Yulong; Song Yulin; Boyer, Arthur L.

    2002-01-01

    Purpose: We have investigated a video-based patient repositioning technique designed to use skin features for radiotherapy repositioning. We investigated the feasibility of the clinical application of this system by quantitative evaluation of performance characteristics of the methodology. Methods and Materials: Multiple regions of interest (ROI) were specified in the field of view of video cameras. We used a normalized correlation pattern-matching algorithm to compute the translations of each ROI pattern in a target image. These translations were compared against trial translations using a quadratic cost function for an optimization process in which the patient rotation and translational parameters were calculated. Results: A hierarchical search technique achieved high-speed (compute correlation for 128x128 ROI in 512x512 target image within 0.005 s) and subpixel spatial accuracy (as high as 0.2 pixel). By treating the observed translations as movements of points on the surfaces of a hypothetical cube, we were able to estimate accurately the actual translations and rotations of the test phantoms used in our experiments to less than 1 mm and 0.2 deg. with a standard deviation of 0.3 mm and 0.5 deg. respectively. For human volunteer cases, we estimated the translations and rotations to have an accuracy of 2 mm and 1.2 deg. Conclusion: A personal computer-based video system is suitable for routine patient setup of fractionated conformal radiotherapy. It is expected to achieve high-precision repositioning of the skin surface with high efficiency
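    The core operation described above, normalized correlation pattern matching of an ROI against a target image, can be sketched as follows. This brute-force version omits the paper's hierarchical search and subpixel refinement, and the synthetic images are placeholders.

```python
# Sketch of normalized cross-correlation template matching for an ROI,
# brute force over integer shifts (hierarchical search and subpixel
# refinement from the paper are omitted).
import numpy as np

def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
target = rng.normal(size=(128, 128))
roi = target[40:72, 50:82].copy()             # 32x32 ROI cut from a known position

best_score, best_shift = -1.0, None
for dy in range(0, 128 - 32):
    for dx in range(0, 128 - 32):
        score = ncc(roi, target[dy:dy + 32, dx:dx + 32])
        if score > best_score:
            best_score, best_shift = score, (dy, dx)

print("best match at", best_shift, "with correlation", round(best_score, 3))  # expect (40, 50)
```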

  14. Evaluation of a titanium dioxide-based DGT technique for measuring inorganic uranium species in fresh and marine waters

    DEFF Research Database (Denmark)

    Hutchins, Colin M.; Panther, Jared G.; Teasdale, Peter R.

    2012-01-01

    A new diffusive gradients in a thin film (DGT) technique for measuring dissolved uranium (U) in freshwater is reported. The new method utilises a previously described binding phase, Metsorb (a titanium dioxide based adsorbent). This binding phase was evaluated and compared to the well-established...

  15. Generalized hardware post-processing technique for chaos-based pseudorandom number generators

    KAUST Repository

    Barakat, Mohamed L.

    2013-06-01

    This paper presents a generalized post-processing technique for enhancing the pseudorandomness of digital chaotic oscillators through a nonlinear XOR-based operation with rotation and feedback. The technique allows full utilization of the chaotic output as pseudorandom number generators and improves throughput without a significant area penalty. Digital design of a third-order chaotic system with maximum function nonlinearity is presented with verified chaotic dynamics. The proposed post-processing technique eliminates statistical degradation in all output bits, thus maximizing throughput compared to other processing techniques. Furthermore, the technique is applied to several fully digital chaotic oscillators with performance surpassing previously reported systems in the literature. The enhancement in the randomness is further examined in a simple image encryption application resulting in a better security performance. The system is verified through experiment on a Xilinx Virtex 4 FPGA with throughput up to 15.44 Gbit/s and logic utilization less than 0.84% for 32-bit implementations. © 2013 ETRI.
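    As a loose illustration of XOR-based post-processing with rotation and feedback, the sketch below whitens words drawn from a logistic map. The chaotic source, word width, rotation amount and feedback rule are arbitrary stand-ins, not the paper's third-order oscillator or its hardware design.

```python
# Illustrative XOR/rotate/feedback post-processing of a chaotic bit stream.
# A logistic map stands in for the paper's third-order digital oscillator,
# and the rotation amount and feedback rule are arbitrary choices.
WIDTH, MASK = 32, (1 << 32) - 1

def rotl(x, r):
    return ((x << r) | (x >> (WIDTH - r))) & MASK

def chaotic_words(n, x=0.123456789):
    """Raw 32-bit words from a logistic map (illustrative entropy source)."""
    for _ in range(n):
        x = 3.99 * x * (1.0 - x)
        yield int(x * (1 << 32)) & MASK

def post_process(words):
    """XOR each raw word with a rotated copy of the previous output (feedback)."""
    fb = 0xA5A5A5A5
    out = []
    for w in words:
        y = w ^ rotl(fb, 7)
        fb = y
        out.append(y)
    return out

processed = post_process(chaotic_words(5))
print([hex(v) for v in processed])
```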

  16. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal; Henkel, Jörg

    2010-01-01

    % for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures, namely ARM and MIPS. © 2010 ACM.
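    The abstract above is truncated, but the building block named in the title is Huffman coding. The generic construction below, using Python's heapq, is shown only for orientation and is not the authors' instruction-compression scheme.

```python
# Generic Huffman code construction with heapq; shown only to illustrate the
# building block behind Huffman-based code compression, not the authors' scheme.
import heapq
from collections import Counter

def huffman_code(data):
    freq = Counter(data)
    if len(freq) == 1:                       # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    # Heap entries: (weight, tie-breaker, {symbol: code}) so tuples stay comparable.
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

text = "add r1, r2, r3; sub r1, r1, r4"      # stand-in for an instruction stream
code = huffman_code(text)
encoded = "".join(code[ch] for ch in text)
print(code)
print("encoded length:", len(encoded), "bits")
```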

  17. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    International Nuclear Information System (INIS)

    Borges, Alexandra

    2008-01-01

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have contributed largely to improving the prognosis and lessening the morbidity and mortality of patients with skull base tumours, and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging has become indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review will focus on advances in imaging technique, their contribution to patient management, and the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division that takes into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed

  18. Comparative assessment of PIV-based pressure evaluation techniques applied to a transonic base flow

    NARCIS (Netherlands)

    Blinde, P; Michaelis, D; van Oudheusden, B.W.; Weiss, P.E.; de Kat, R.; Laskari, A.; Jeon, Y.J.; David, L; Schanz, D; Huhn, F.; Gesemann, S; Novara, M.; McPhaden, C.; Neeteson, N.; Rival, D.; Schneiders, J.F.G.; Schrijer, F.F.J.

    2016-01-01

    A test case for PIV-based pressure evaluation techniques has been developed by constructing a simulated experiment from a ZDES simulation for an axisymmetric base flow at Mach 0.7. The test case comprises sequences of four subsequent particle images (representing multi-pulse data) as well as

  19. A novel technique for active vibration control, based on optimal

    Indian Academy of Sciences (India)

    In the last few decades, researchers have proposed many control techniques to suppress unwanted vibrations in a structure. In this work, a novel and simple technique is proposed for the active vibration control. In this technique, an optimal tracking control is employed to suppress vibrations in a structure by simultaneously ...

  20. Acellular dermal matrix based nipple reconstruction: A modified technique

    Directory of Open Access Journals (Sweden)

    Raghavan Vidya

    2017-09-01

    Full Text Available Nipple areolar reconstruction (NAR) has evolved with the advancement of breast reconstruction and can improve self-esteem and, consequently, patient satisfaction. Although a variety of reconstruction techniques have been described in the literature, varying from nipple sharing and local flaps to alloplastic and allograft augmentation, loss of nipple projection over time remains a major problem. Acellular dermal matrices (ADM) have revolutionised breast reconstruction more recently. We discuss the use of ADM to act as a base plate and strut, giving support to the base and offering nipple bulk and projection, in a primary NAR procedure with a local clover-shaped dermal flap in 5 breasts (4 patients). We used 5-point Likert scales (1 = highly unsatisfied, 5 = highly satisfied) to assess patient satisfaction. Median age was 46 years (range: 38–55 years). Nipple projections of 8 mm, 7 mm, and 7 mm were achieved in the unilateral cases and 6 mm in the bilateral case over a median 18-month period. All patients reported at least a 4 on the Likert scale. We had no post-operative complications. It seems that nipple areolar reconstruction (NAR) using ADM can achieve nipple projection that is considered aesthetically pleasing for patients.

  1. Crack identification based on synthetic artificial intelligent technique

    International Nuclear Information System (INIS)

    Shim, Mun Bo; Suh, Myung Won

    2001-01-01

    It has been established that a crack has an important effect on the dynamic behavior of a structure. This effect depends mainly on the location and depth of the crack. To identify the location and depth of a crack in a structure, this paper presents a method based on a synthetic artificial intelligence technique: an Adaptive-Network-based Fuzzy Inference System (ANFIS), trained via a hybrid learning algorithm (back-propagation gradient descent combined with the least-squares method), is used to learn the relation between the input (the location and depth of a crack) and the output (the structural eigenfrequencies) of the structural system. With this ANFIS and a Continuous Evolutionary Algorithm (CEA), it is possible to formulate the inverse problem. CEAs based on genetic algorithms work efficiently for continuous search-space optimization problems such as parameter identification. With the ANFIS, CEAs are used to identify the crack location and depth by minimizing the difference from the measured frequencies. We have tried this new idea on a simple beam structure and the results are promising

  2. Structural design systems using knowledge-based techniques

    International Nuclear Information System (INIS)

    Orsborn, K.

    1993-01-01

    Engineering information management and the corresponding information systems are of strategic importance for industrial enterprises. This thesis treats the interdisciplinary field of designing computing systems for structural design and analysis using knowledge-based techniques. Specific conceptual models have been designed for representing the structure and the process of objects and activities in a structural design and analysis domain. In this thesis, it is shown how domain knowledge can be structured along several classification principles in order to reduce complexity and increase flexibility. By raising the conceptual level of the problem description and representing the domain knowledge in a declarative form, it is possible to enhance the development, maintenance and use of software for mechanical engineering. This will result in a corresponding increase in the efficiency of the mechanical engineering design process. These ideas, together with the rule-based control, point out the leverage of declarative knowledge representation within this domain. Used appropriately, a declarative knowledge representation preserves information better, and is more problem-oriented and change-tolerant, than procedural representations. 74 refs

  3. Positron emission tomography, physical bases and comparison with other techniques

    International Nuclear Information System (INIS)

    Guermazi, Fadhel; Hamza, F; Amouri, W.; Charfeddine, S.; Kallel, S.; Jardak, I.

    2013-01-01

    Positron emission tomography (PET) is a medical imaging technique that measures the three-dimensional distribution of molecules labelled with a positron emitter. PET has grown significantly in clinical fields, particularly in oncology for diagnostic and therapeutic follow-up purposes. This technique is evolving rapidly. Among the technical improvements is the coupling of the PET scanner with computed tomography (CT). A PET scan is obtained by intravenous injection of a radioactive tracer. The marker is usually fluorine-18 (18F) incorporated in a glucose molecule, forming 18-fluorodeoxyglucose (18F-FDG). This tracer, similar to glucose, binds to tissues that consume large quantities of the sugar, such as cancerous tissue, cardiac muscle or the brain. Detection using scintillation crystals (BGO, LSO, LYSO) suited to high energy (511 keV) identifies the lines of response of the gamma photons originating from the annihilation of a positron with an electron. The detection electronics, or coincidence circuit, is based on two criteria: a time window, of about 6 to 15 ns, and an energy window. This system measures the true coincidences that correspond to the detection of two 511 keV photons from the same annihilation. Most PET devices consist of a series of elementary detectors distributed annularly around the patient. Each detector comprises a scintillation crystal matrix coupled to a finite number (4 or 6) of photomultipliers. The electronic circuit, or coincidence circuit, determines the projection of the annihilation point by means of two elementary detectors. The processing of such information must be extremely fast, considering the count rates encountered in practice. The information measured by the coincidence circuit is then positioned in a matrix, or sinogram, which contains a set of elements of a projection section of the object. Images are obtained by tomographic reconstruction on powerful computer workstations equipped with software tools allowing the analysis and

  4. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. However, a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  5. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. However, a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  6. Ground-based intercomparison of two isoprene measurement techniques

    Directory of Open Access Journals (Sweden)

    E. Leibrock

    2003-01-01

    Full Text Available An informal intercomparison of two isoprene (C5H8 measurement techniques was carried out during Fall of 1998 at a field site located approximately 3 km west of Boulder, Colorado, USA. A new chemical ionization mass spectrometric technique (CIMS was compared to a well-established gas chromatographic technique (GC. The CIMS technique utilized benzene cation chemistry to ionize isoprene. The isoprene levels measured by the CIMS were often larger than those obtained with the GC. The results indicate that the CIMS technique suffered from an anthropogenic interference associated with air masses from the Denver, CO metropolitan area as well as an additional interference occurring in clean conditions. However, the CIMS technique is also demonstrated to be sensitive and fast. Especially after introduction of a tandem mass spectrometric technique, it is therefore a candidate for isoprene measurements in remote environments near isoprene sources.

  7. A linac-based stereotactic irradiation technique of uveal melanoma

    International Nuclear Information System (INIS)

    Dieckmann, Karin; Bogner, Joachim; Georg, Dietmar; Zehetmayer, Martin; Kren, Gerhard; Poetter, Richard

    2001-01-01

    Purpose: To describe a stereotactic irradiation technique for uveal melanomas performed at a linac, based on a non-invasive eye fixation and eye monitoring system. Methods: For eye immobilization a light source system is integrated in a standard stereotactic mask system in front of the healthy eye: During treatment preparation (computed tomography/magnetic resonance imaging) as well as for treatment delivery, patients are instructed to gaze at the fixation light source. A mini-video camera monitors the pupil center position of the diseased eye. For treatment planning and beam delivery standard stereotactic radiotherapy equipment is used. If the pupil center deviation from a predefined 'zero-position' exceeds 1 mm (for more than 2 s), treatment delivery is interrupted. Between 1996 and 1999 60 patients with uveal melanomas, where (i) tumor height exceeded 7 mm, or (ii) tumor height was more than 3 mm, and the central tumor distance to the optic disc and/or the macula was less than 3 mm, have been treated. A total dose of 60 or 70 Gy has been given in 5 fractions within 10 days. Results: The repositioning accuracy in the mask system is 0.47±0.36 mm in rostral-occipital direction, 0.75±0.52 mm laterally, and 1.12±0.96 mm in vertical direction. An eye movement analysis performed for 23 patients shows a pupil center deviation from the 'zero' position<1 mm in 91% of all cases investigated. In a theoretical analysis, pupil center deviations are correlated with GTV 'movements'. For a pupil center deviation of 1 mm (rotation of the globe of 5 degree sign ) the GTV is still encompassed by the 80% isodose in 94%. Conclusion: For treatments of uveal melanomas, linac-based stereotactic radiotherapy combined with a non-invasive eye immobilization and monitoring system represents a feasible, accurate and reproducible method. Besides considerable technical requirements, the complexity of the treatment technique demands an interdisciplinary team continuously dedicated to this

  8. Orientation of student entrepreneurial practices based on administrative techniques

    Directory of Open Access Journals (Sweden)

    Héctor Horacio Murcia Cabra

    2005-07-01

    Full Text Available As part of the second phase of the research project «Application of a creativity model to update the teaching of the administration in Colombian agricultural entrepreneurial systems» it was decided to re-enforce student planning and execution of the students of the Agricultural business Administration Faculty of La Salle University. Those finishing their studies were given special attention. The plan of action was initiated in the second semester of 2003. It was initially defined as a model of entrepreneurial strengthening based on a coherent methodology that included the most recent administration and management techniques. Later, the applicability of this model was tested in some organizations of the agricultural sector that had asked for support in their planning processes. Through an investigation-action process the methodology was redefined in order to arrive at a final model that could be used by faculty students and graduates. The results obtained were applied to the teaching of Entrepreneurial Laboratory of ninth semester students with the hope of improving administrative support to agricultural enterprises. Following this procedure more than 100 students and 200 agricultural producers have applied this procedure between June 2003 and July 2005. The methodology used and the results obtained are presented in this article.

  9. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, which is a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points should be generated to fill the design space, and this can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, a new method, the cokriging method (an extension of kriging), is proposed to calculate the structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve the accuracy and efficiency of structural reliability analysis and is a viable alternative to kriging.
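    The sketch below shows the flavour of a kriging surrogate for reliability analysis: a Gaussian-process model is fitted to a handful of limit-state evaluations and the failure probability is then estimated by Monte Carlo on the surrogate. It uses plain kriging on a toy limit state; the paper's cokriging would additionally incorporate gradient information.

```python
# Kriging-style surrogate for a reliability estimate: fit a Gaussian-process
# model to a few limit-state evaluations, then do Monte Carlo on the surrogate.
# Plain kriging only; the paper's cokriging would also feed in gradient values.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def limit_state(x):
    """Toy limit state g(x); failure when g < 0."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, size=(30, 2))          # design-of-experiments samples
y_train = limit_state(X_train)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(X_train, y_train)

X_mc = rng.normal(0.0, 1.0, size=(100_000, 2))        # standard-normal random variables
pf_surrogate = np.mean(gp.predict(X_mc) < 0.0)
pf_direct = np.mean(limit_state(X_mc) < 0.0)          # reference (cheap toy model only)
print(f"failure probability: surrogate {pf_surrogate:.4f} vs direct {pf_direct:.4f}")
```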

  10. Microgrids Real-Time Pricing Based on Clustering Techniques

    Directory of Open Access Journals (Sweden)

    Hao Liu

    2018-05-01

    Full Text Available Microgrids are widely spreading in electricity markets worldwide. Besides the security and reliability concerns for these microgrids, their operators need to address consumers’ pricing. Considering the growth of smart grids and smart meter facilities, it is expected that microgrids will have some level of flexibility to determine real-time pricing for at least some consumers. As such, the key challenge is finding an optimal pricing model for consumers. This paper, accordingly, proposes a new pricing scheme in which microgrids are able to deploy clustering techniques in order to understand their consumers’ load profiles and then assign real-time prices based on their load profile patterns. An improved weighted fuzzy average k-means is proposed to cluster load curve of consumers in an optimal number of clusters, through which the load profile of each cluster is determined. Having obtained the load profile of each cluster, real-time prices are given to each cluster, which is the best price given to all consumers in that cluster.

  11. Using Neutron-based techniques to investigate battery behaviour

    International Nuclear Information System (INIS)

    Pramudita, James C.; Goonetilleke, Damien; Sharma, Neeraj; Peterson, Vanessa K.

    2016-01-01

    The extensive use of portable electronic devices has given rise to increasing demand for reliable high energy density storage in the form of batteries. Today, lithium-ion batteries (LIBs) are the leading technology as they offer high energy density and relatively long lifetimes. Despite their widespread adoption, Li-ion batteries still suffer from significant degradation in their performance over time. The most obvious degradation in lithium-ion battery performance is capacity fade – where the capacity of the battery reduces after extended cycling. This talk will focus on how in situ time-resolved neutron powder diffraction (NPD) can be used to gain a better understanding of the structural changes which contribute to the observed capacity fade. The commercial batteries studied each feature different electrochemical and storage histories that are precisely known, allowing us to elucidate the tell-tale signs of battery degradation using NPD and relate these to battery history. Moreover, this talk will also showcase the diverse use of other neutron-based techniques such as neutron imaging to study electrolyte concentrations in lead-acid batteries, and the use of quasi-elastic neutron scattering to study Na-ion dynamics in sodium-ion batteries.

  12. Light based techniques for improving health care: studies at RRCAT

    International Nuclear Information System (INIS)

    Gupta, P.K.; Patel, H.S.; Ahlawat, S.

    2015-01-01

    The invention of Lasers in 1960, the phenomenal advances in photonics as well as the information processing capability of the computers has given a major boost to the R and D activity on the use of light for high resolution biomedical imaging, sensitive, non-invasive diagnosis and precision therapy. The effort has resulted in remarkable progress and it is widely believed that light based techniques hold great potential to offer simpler, portable systems which can help provide diagnostics and therapy in a low resource setting. At Raja Ramanna Centre for Advanced Technology (RRCAT) extensive studies have been carried out on fluorescence spectroscopy of native tissue. This work led to two important outcomes. First, a better understanding of tissue fluorescence and insights on the possible use of fluorescence spectroscopy for screening of cancer and second development of diagnostic systems that can serve as standalone tool for non-invasive screening of the cancer of oral cavity. The optical coherence tomography setups and their functional extensions (polarization sensitive, Doppler) have also been developed and used for high resolution (∼10 µm) biomedical imaging applications, in particular for non-invasive monitoring of the healing of wounds. Chlorophyll based photo-sensitisers and their derivatives have been synthesized in house and used for photodynamic therapy of tumors in animal models and for antimicrobial applications. Various variants of optical tweezers (holographic, Raman etc.) have also been developed and utilised for different applications notably Raman spectroscopy of optically trapped red blood cells. An overview of these activities carried out at RRCAT is presented in this article. (author)

  13. Weighted graph based ordering techniques for preconditioned conjugate gradient methods

    Science.gov (United States)

    Clift, Simon S.; Tang, Wei-Pai

    1994-01-01

    We describe the basis of a matrix ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDE's. Several new matrix ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDE's, and compared with other matrix ordering techniques. A variation of RCM is shown to generally improve the quality of incomplete factorization preconditioners.
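    A generic example of a graph-based ordering of this kind is reverse Cuthill-McKee, available in SciPy; the sketch below scrambles a 2-D Laplacian and shows RCM recovering a small bandwidth. It illustrates the family of orderings discussed, not the new heuristics proposed in the paper.

```python
# Reverse Cuthill-McKee reordering of a sparse SPD matrix to shrink its
# bandwidth before incomplete factorization; a generic illustration of
# graph-based ordering, not the authors' new heuristics.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

# 2-D 5-point Laplacian on a 20x20 grid (a typical PDE test matrix).
n = 20
I = sp.identity(n)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(T, I)).tocsr()

def bandwidth(M):
    coo = M.tocoo()
    return int(np.max(np.abs(coo.row - coo.col)))

rng = np.random.default_rng(0)
p = rng.permutation(A.shape[0])
A_shuffled = A[p, :][:, p]                     # scrambled ordering, large bandwidth
perm = reverse_cuthill_mckee(A_shuffled, symmetric_mode=True)
A_rcm = A_shuffled[perm, :][:, perm]

print("bandwidth shuffled:", bandwidth(A_shuffled), " after RCM:", bandwidth(A_rcm))
```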

  14. Optimal technique of linear accelerator–based stereotactic radiosurgery for tumors adjacent to brainstem

    International Nuclear Information System (INIS)

    Chang, Chiou-Shiung; Hwang, Jing-Min; Tai, Po-An; Chang, You-Kang; Wang, Yu-Nong; Shih, Rompin; Chuang, Keh-Shih

    2016-01-01

    Stereotactic radiosurgery (SRS) is a well-established technique that is replacing whole-brain irradiation in the treatment of intracranial lesions, which leads to better preservation of brain functions, and therefore a better quality of life for the patient. There are several available forms of linear accelerator (LINAC)-based SRS, and the goal of the present study is to identify which of these techniques is best (as evaluated statistically by dosimetric outcomes) when the target is located adjacent to the brainstem. We collected the records of 17 patients with lesions close to the brainstem who had previously been treated with single-fraction radiosurgery. In all, 5 different lesion catalogs were collected, and the patients were divided into 2 distance groups: one consisting of 7 patients with a target-to-brainstem distance of less than 0.5 cm, and the other of 10 patients with a target-to-brainstem distance of ≥ 0.5 and < 1 cm. Comparison was then made among the following 3 types of LINAC-based radiosurgery: dynamic conformal arcs (DCA), intensity-modulated radiosurgery (IMRS), and volumetric modulated arc radiotherapy (VMAT). All techniques included multiple noncoplanar beams or arcs with or without intensity-modulated delivery. The gross tumor volume (GTV) ranged from 0.2 cm³ to 21.9 cm³. The dose homogeneity index (HI, ICRU definition) and conformity index (CI, ICRU definition) showed no statistically significant difference between techniques. However, the average CI of 1.09 ± 0.56 achieved by VMAT was the best of the 3 techniques. Moreover, a notable improvement in gradient index (GI) was observed when VMAT was used (0.74 ± 0.13), and this result was significantly better than those achieved by the 2 other techniques (p < 0.05). For the brainstem V4Gy, both VMAT (2.5%) and IMRS (2.7%) were significantly lower than DCA (4.9%), both at the p < 0.05 level. Regarding the V2Gy of normal brain, VMAT plans attained 6.4 ± 5%; this was significantly better

  15. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, ''A Technique for Human Error Analysis'' (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst

  16. Sunburn and sun-protective behaviors among adults with and without previous nonmelanoma skin cancer (NMSC): A population-based study.

    Science.gov (United States)

    Fischer, Alexander H; Wang, Timothy S; Yenokyan, Gayane; Kang, Sewon; Chien, Anna L

    2016-08-01

    Individuals with previous nonmelanoma skin cancer (NMSC) are at increased risk for subsequent skin cancer, and should therefore limit ultraviolet exposure. We sought to determine whether individuals with previous NMSC engage in better sun protection than those with no skin cancer history. We pooled self-reported data (2005 and 2010 National Health Interview Surveys) from US non-Hispanic white adults (758 with and 34,161 without previous NMSC). We calculated adjusted prevalence odds ratios (aPOR) and 95% confidence intervals (CI), taking into account the complex survey design. Individuals with previous NMSC versus no history of NMSC had higher rates of frequent use of shade (44.3% vs 27.0%; aPOR 1.41; 95% CI 1.16-1.71), long sleeves (20.5% vs 7.7%; aPOR 1.55; 95% CI 1.21-1.98), a wide-brimmed hat (26.1% vs 10.5%; aPOR 1.52; 95% CI 1.24-1.87), and sunscreen (53.7% vs 33.1%; aPOR 2.11; 95% CI 1.73-2.59), but did not have significantly lower odds of recent sunburn (29.7% vs 40.7%; aPOR 0.95; 95% CI 0.77-1.17). Among those with previous NMSC, recent sunburn was inversely associated with age, sun avoidance, and shade but not sunscreen. Self-reported cross-sectional data and unavailable information quantifying regular sun exposure are limitations. Physicians should emphasize sunburn prevention when counseling patients with previous NMSC, especially younger adults, focusing on shade and sun avoidance over sunscreen. Copyright © 2016 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  17. Optimal technique of linear accelerator-based stereotactic radiosurgery for tumors adjacent to brainstem.

    Science.gov (United States)

    Chang, Chiou-Shiung; Hwang, Jing-Min; Tai, Po-An; Chang, You-Kang; Wang, Yu-Nong; Shih, Rompin; Chuang, Keh-Shih

    2016-01-01

    Stereotactic radiosurgery (SRS) is a well-established technique that is replacing whole-brain irradiation in the treatment of intracranial lesions, which leads to better preservation of brain functions, and therefore a better quality of life for the patient. There are several available forms of linear accelerator (LINAC)-based SRS, and the goal of the present study is to identify which of these techniques is best (as evaluated statistically by dosimetric outcomes) when the target is located adjacent to the brainstem. We collected the records of 17 patients with lesions close to the brainstem who had previously been treated with single-fraction radiosurgery. In all, 5 different lesion catalogs were collected, and the patients were divided into 2 distance groups: one consisting of 7 patients with a target-to-brainstem distance of less than 0.5 cm, and the other of 10 patients with a target-to-brainstem distance of ≥ 0.5 and < 1 cm. A linear accelerator is only one of the modalities that can be used for SRS treatment. Based on retrospective statistical evidence, we recommend VMAT as the optimal technique for delivering treatment to tumors adjacent to the brainstem. Copyright © 2016 American Association of Medical Dosimetrists. All rights reserved.

  18. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Hamsapriya

    2011-12-01

    Full Text Available E-mail is one of the most popular and frequently used ways of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today's Internet, bringing financial damage to companies and annoying individual users. Spam emails invade users' mailboxes without their consent, and they consume network capacity as well as the time spent checking and deleting spam mails. The vast majority of Internet users are outspoken in their disdain for spam, although enough of them respond to commercial offers that spam remains a viable source of income to spammers. While most users want to do the right thing to avoid and get rid of spam, they need clear and simple guidelines on how to behave. In spite of all the measures taken to eliminate spam, it has not yet been eradicated. Also, when the countermeasures are over-sensitive, even legitimate emails will be eliminated. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has centered on the more sophisticated classifier-related issues. Recently, machine learning for spam classification has become an important research issue. The proposed work explores and identifies the use of different learning algorithms for classifying spam messages from e-mail. A comparative analysis among the algorithms is also presented.
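    A minimal supervised spam-classification sketch in the spirit of the survey (bag-of-words features with a naive Bayes classifier) is shown below; the example messages are fabricated and far too few for any meaningful comparison of algorithms.

```python
# Tiny supervised spam-classification sketch (bag-of-words + naive Bayes);
# the messages below are made up and far too few for a real evaluation.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_msgs = [
    "Win a free prize now, click here",
    "Cheap loans approved instantly, limited offer",
    "Meeting moved to 3pm, see agenda attached",
    "Can you review the draft report before Friday?",
    "Congratulations, you have been selected for a cash reward",
    "Lunch tomorrow with the project team?",
]
train_labels = ["spam", "spam", "ham", "ham", "spam", "ham"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_msgs, train_labels)

tests = ["Claim your free cash reward now", "Agenda for Friday's project meeting"]
print(list(zip(tests, model.predict(tests))))
```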

  19. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log n) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
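    The alias method itself is a standard algorithm; a generic Python version (Vose's construction) is sketched below for orientation. It is not the MCNP subroutine described in the paper, and the voxel weights are placeholders.

```python
# Vose's alias method: O(n) table construction, O(1) sampling per draw.
# Generic implementation for illustration; the paper embeds this idea inside
# an MCNP source subroutine for voxel and photon-energy sampling.
import random

def build_alias(probs):
    n = len(probs)
    prob = [0.0] * n
    alias = [0] * n
    scaled = [p * n for p in probs]
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s] = scaled[s]
        alias[s] = l
        scaled[l] = scaled[l] - (1.0 - scaled[s])
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:              # leftovers equal 1 up to rounding
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng=random):
    i = rng.randrange(len(prob))         # pick a column uniformly
    return i if rng.random() < prob[i] else alias[i]

voxel_weights = [0.1, 0.4, 0.2, 0.25, 0.05]   # e.g. normalized voxel source strengths
prob, alias = build_alias(voxel_weights)
draws = [sample(prob, alias) for _ in range(100_000)]
print([round(draws.count(i) / len(draws), 3) for i in range(len(voxel_weights))])
```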

  20. Two new prediction rules for spontaneous pregnancy leading to live birth among subfertile couples, based on the synthesis of three previous models.

    NARCIS (Netherlands)

    C.C. Hunault; J.D.F. Habbema (Dik); M.J.C. Eijkemans (René); J.A. Collins (John); J.L.H. Evers (Johannes); E.R. te Velde (Egbert)

    2004-01-01

    textabstractBACKGROUND: Several models have been published for the prediction of spontaneous pregnancy among subfertile patients. The aim of this study was to broaden the empirical basis for these predictions by making a synthesis of three previously published models. METHODS:

  1. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim; Muller, Hendrik; Adam, Frederick M.; Panda, Saroj K.; Witt, Matthias; Al-Hajji, Adnan A.; Sarathy, Mani

    2015-01-01

    cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated

  2. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques.

    Science.gov (United States)

    Bergquist, Magnus; Nilsson, Andreas; Hansla, André

    2017-01-01

    Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies participants ( n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that contest-based intervention technique frames a gain goal, while norm-based intervention frames a normative goal.

  3. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques

    Directory of Open Access Journals (Sweden)

    Magnus Bergquist

    2017-11-01

    Full Text Available Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies participants (n = 347 were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1 and higher personal norms for non-targeted pro-environmental behaviors (Study 2. These findings suggest that contest-based intervention technique frames a gain goal, while norm-based intervention frames a normative goal.

  4. FPGA based mixed-signal circuit novel testing techniques

    International Nuclear Information System (INIS)

    Pouros, Sotirios; Vassios, Vassilios; Papakostas, Dimitrios; Hristov, Valentin

    2013-01-01

    Fault detection techniques for electronic circuits, especially modern mixed-signal circuits, are being evolved and customized around the world to meet industry needs. The paper presents techniques used for fault detection in mixed-signal circuits. Moreover, the paper covers standardized methods, along with current innovations for external testing such as Design for Testability (DfT) and Built-In Self-Test (BIST) systems. Finally, the research team introduces a circuit implementation scheme using an FPGA
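    BIST schemes of the kind mentioned above commonly generate test patterns with a linear feedback shift register (LFSR). The sketch below shows a standard maximal-length 16-bit Fibonacci LFSR purely as an illustration; it is not the circuit or FPGA implementation described in the paper.

```python
# Minimal LFSR sketch of the kind used in BIST pattern generators; the 16-bit
# Fibonacci LFSR with taps at bits 16, 14, 13 and 11 is a standard maximal-
# length choice and is shown only as an illustration, not the paper's design.
def lfsr16(seed=0xACE1, n=8):
    state = seed & 0xFFFF
    patterns = []
    for _ in range(n):
        # XOR of tap bits 16, 14, 13, 11 (counted from the output end).
        bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (bit << 15)
        patterns.append(state)
    return patterns

for p in lfsr16():
    print(f"{p:016b}")   # pseudo-random test patterns applied to the circuit under test
```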

  5. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    Science.gov (United States)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy is to be changed because the current imitational gravitational frontal isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water - global deficit - up to 4-15 times, adverse effects on soils and landscapes. Current methods of irrigation does not control the water spread throughout the soil continuum. The preferable downward fluxes of irrigation water are forming, up to 70% and more of water supply loses into vadose zone. The moisture of irrigated soil is high, soil loses structure in the process of granulometric fractions flotation decomposition, the stomatal apparatus of plant leaf is fully open, transpiration rate is maximal. We propose the Biogeosystem technique - the transcendental, uncommon and non-imitating methods for Sustainable Natural Resources Management. New paradigm of irrigation is based on the intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. Individual volume of water is supplied as a vertical cylinder of soil preliminary watering. The cylinder position in soil is at depth form 10 to 30 cm. Diameter of cylinder is 1-2 cm. Within 5-10 min after injection the water spreads from the cylinder of preliminary watering into surrounding soil by capillary, film and vapor transfer. Small amount of water is transferred gravitationally to the depth of 35-40 cm. The soil watering cylinder position in soil profile is at depth of 5-50 cm, diameter of the cylinder is 2-4 cm. Lateral distance between next cylinders along the plant raw is 10-15 cm. The soil carcass which is surrounding the cylinder of non-watered soil remains relatively dry and mechanically stable. After water injection the structure of soil in cylinder restores quickly because of no compression from the stable adjoining volume of soil and soil structure memory. The mean soil thermodynamic water potential of watered zone is -0.2 MPa. At this potential

  6. Generalized image contrast enhancement technique based on Heinemann contrast discrimination model

    Science.gov (United States)

    Liu, Hong; Nodine, Calvin F.

    1994-03-01

    This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.
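    Since the abstract notes that histogram equalization is a special case of the proposed mapping, a textbook histogram-equalization sketch is included below for orientation; it is not the Heinemann-model algorithm itself, and the synthetic low-contrast image is made up.

```python
# Classic histogram equalization on an 8-bit image, shown because the text
# notes it is a special case of the perception-based mapping; this is the
# textbook version, not the Heinemann-model algorithm.
import numpy as np

def histogram_equalize(img):
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf_min = cdf[cdf > 0].min()
    lut = np.clip(np.round(255 * (cdf - cdf_min) / (cdf[-1] - cdf_min + 1e-12)), 0, 255)
    return lut.astype(np.uint8)[img]          # apply the gray-scale mapping function

rng = np.random.default_rng(0)
low_contrast = rng.normal(120, 10, (64, 64)).clip(0, 255).astype(np.uint8)
equalized = histogram_equalize(low_contrast)
print("input range:", low_contrast.min(), low_contrast.max(),
      " output range:", equalized.min(), equalized.max())
```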

  7. Acceleration and Orientation Jumping Performance Differences Among Elite Professional Male Handball Players With or Without Previous ACL Reconstruction: An Inertial Sensor Unit-Based Study.

    Science.gov (United States)

    Setuain, Igor; González-Izal, Miriam; Alfaro, Jesús; Gorostiaga, Esteban; Izquierdo, Mikel

    2015-12-01

    Handball is one of the most challenging sports for the knee joint. Persistent biomechanical and jumping capacity alterations can be observed in athletes with an anterior cruciate ligament (ACL) injury. Commonly identified jumping biomechanical alterations have been described by the use of laboratory technologies. However, portable and easy-to-handle technologies that enable an evaluation of jumping biomechanics at the training field are lacking. To analyze unilateral/bilateral acceleration and orientation jumping performance differences among elite male handball athletes with or without previous ACL reconstruction via a single inertial sensor unit device. Case control descriptive study. At the athletes' usual training court. Twenty-two elite male (6 ACL-reconstructed and 16 uninjured control players) handball players were evaluated. The participants performed a vertical jump test battery that included a 50-cm vertical bilateral drop jump, a 20-cm vertical unilateral drop jump, and vertical unilateral countermovement jump maneuvers. Peak 3-dimensional (X, Y, Z) acceleration (m·s(-2)), jump phase duration and 3-dimensional orientation values (°) were obtained from the inertial sensor unit device. Two-tailed t-tests and a one-way analysis of variance were performed to compare means. The P value cut-off for significance was set at P handball athletes with previous ACL reconstruction demonstrated a jumping biomechanical profile similar to control players, including similar jumping performance values in both bilateral and unilateral jumping maneuvers, several years after ACL reconstruction. These findings are in agreement with previous research showing full functional restoration of abilities in top-level male athletes after ACL reconstruction, rehabilitation and subsequent return to sports at the previous level. Copyright © 2015 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  8. Effect of feeding different cereal-based diets on the performance and gut health of weaned piglets with or without previous access to creep feed during lactation.

    Science.gov (United States)

    Torrallardona, D; Andrés-Elias, N; López-Soria, S; Badiola, I; Cerdà-Cuéllar, M

    2012-12-01

    A trial was conducted to evaluate the effect of different cereals on the performance, gut mucosa, and microbiota of weanling pigs with or without previous access to creep feed during lactation. A total of 108 newly weaned pigs (7.4 kg BW; 26 d of age; half with and half without creep feed) were used. Piglets were distributed by BW into 36 pens according to a 2 × 6 factorial arrangement of treatments with previous access to creep feed (with or without) and cereal source in the experimental diet [barley (Hordeum vulgare), rice (Oryza sativa)-wheat (Triticum aestivum) bran, corn (Zea mays), naked oats (Avena sativa), oats, or rice] as main factors. Pigs were offered the experimental diets for 21 d and performance was monitored. At day 21, 4 piglets from each treatment were killed and sampled for the histological evaluation of jejunal mucosa and the study of ileal and cecal microbiota by RFLP. The Manhattan distances between RFLP profiles were calculated and intragroup similarities (IGS) were estimated for each treatment. A significant interaction between cereal source and previous creep feeding was observed for ADFI: whereas creep feeding increased ADFI for the rice-wheat bran diet, it reduced it for naked oats. No differences in mucosal morphology were observed except for significantly deeper crypts in pigs that did not have previous access to creep feed. A significant interaction between creep feeding and cereal was also observed for the IGS of the cecal microbiota at day 21: creep feed reduced IGS in the piglets fed oats or barley, but no differences were observed for the other cereal sources. It is concluded that the effect of creep feeding during lactation on the performance and the microbiota of piglets after weaning is dependent on the nature of the cereal in the postweaning diet.

  9. Generalized image contrast enhancement technique based on the Heinemann contrast discrimination model

    Science.gov (United States)

    Liu, Hong; Nodine, Calvin F.

    1996-07-01

    This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
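
    Since this abstract positions histogram equalization as a special case of the proposed gray-scale mapping, a minimal sketch of that baseline may help make the idea of a mapping function concrete. The sketch below is plain histogram equalization in Python/NumPy, not the Heinemann-based mapping itself, whose seven parameters are not reproduced in the record.

      # Minimal histogram-equalization baseline for 8-bit grayscale images.
      # This is NOT the Heinemann-based mapping from the paper -- just the
      # special case the abstract compares against. Requires only NumPy.
      import numpy as np

      def histogram_equalize(img):
          """Map gray levels so the cumulative distribution becomes ~uniform."""
          img = np.asarray(img, dtype=np.uint8)
          hist = np.bincount(img.ravel(), minlength=256)
          cdf = hist.cumsum().astype(np.float64)
          cdf_min = cdf[cdf > 0].min()
          # gray-scale mapping function (look-up table)
          lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0), 0, 255).astype(np.uint8)
          return lut[img]

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          low_contrast = rng.integers(100, 156, size=(64, 64))   # narrow gray-level range
          enhanced = histogram_equalize(low_contrast)
          print(low_contrast.min(), low_contrast.max(), "->", enhanced.min(), enhanced.max())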

  10. A compressed sensing based approach on Discrete Algebraic Reconstruction Technique.

    Science.gov (United States)

    Demircan-Tureyen, Ezgi; Kamasak, Mustafa E

    2015-01-01

    Discrete tomography (DT) techniques are capable of computing better results, even when using a smaller number of projections than continuous tomography techniques. The Discrete Algebraic Reconstruction Technique (DART) is an iterative reconstruction method proposed to achieve this goal by exploiting prior knowledge of the gray levels and assuming that the scanned object is composed of only a few different densities. In this paper, the DART method is combined with an initial total variation minimization (TvMin) phase to ensure a better initial guess, and extended with a segmentation procedure in which the threshold values are estimated from a finite set of candidates to minimize both the projection error and the total variation (TV) simultaneously. The accuracy and the robustness of the algorithm are compared with the original DART in simulation experiments carried out under (1) a limited number of projections, (2) limited-view, and (3) noisy-projection conditions.
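
    The following toy sketch, written under stated assumptions rather than taken from the paper, only illustrates the general DART-style alternation between algebraic updates and discretization onto a known gray-level set; the projection geometry is reduced to row and column sums, and the boundary-pixel handling and TV-based initialization described above are omitted.

      # A toy sketch in the spirit of DART: SIRT-like algebraic updates alternated
      # with segmentation onto a known discrete gray-level set {0, 1}. The
      # "projections" here are just row and column sums of a small binary image,
      # which generally do not determine the image uniquely; the real DART works
      # on full projection matrices and keeps non-boundary pixels fixed.
      import numpy as np

      def project(x):
          return np.concatenate([x.sum(axis=1), x.sum(axis=0)])   # row sums, column sums

      def toy_dart(p, shape, levels=(0.0, 1.0), n_outer=20, n_inner=30, step=0.1):
          x = np.full(shape, np.mean(levels))
          lv = np.array(levels)
          for _ in range(n_outer):
              for _ in range(n_inner):                             # continuous SIRT-like sweep
                  r = p - project(x)                               # projection residual
                  x += step * (r[:shape[0], None] / shape[1] + r[None, shape[0]:] / shape[0])
              x = lv[np.abs(x[..., None] - lv).argmin(-1)]         # segmentation step
          return x

      if __name__ == "__main__":
          truth = (np.random.default_rng(1).random((8, 8)) > 0.6).astype(float)
          recon = toy_dart(project(truth), truth.shape)
          print("pixel mismatch:", int(np.abs(recon - truth).sum()))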

  11. Non-Destructive Techniques Based on Eddy Current Testing

    Science.gov (United States)

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future. PMID:22163754

  12. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.

  13. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care. Copyright © 2016 European Society of Clinical Microbiology and

  14. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Alexandra [Instituto Portugues de Oncologia Francisco Gentil, Centro de Lisboa, Servico de Radiologia, Rua Professor Lima Basto, 1093 Lisboa Codex (Portugal)], E-mail: borgesalexandra@clix.pt

    2008-06-15

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have largely contributed to improving the prognosis and lessening the morbidity and mortality of patients with skull base tumours, and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging became indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review will focus on the advances in imaging technique, their contribution to patient management, and the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division taking into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed.

  15. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground and a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (which is a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, thus upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
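
    One of the reference methods mentioned above, the lifted condensation level (LCL), has a common back-of-the-envelope approximation of roughly 125 m of height per degree of dew-point depression. The snippet below only illustrates that approximation; the constant is approximate and surface values are assumed to be representative of the lifted parcel.

      # Rough LCL estimate (Espy/Lawrence approximation): ~125 m per degree C of
      # dew-point depression, assuming surface T and Td represent the parcel.
      def lcl_height_m(temperature_c, dewpoint_c):
          """Approximate lifted condensation level above the surface, in metres."""
          return 125.0 * (temperature_c - dewpoint_c)

      if __name__ == "__main__":
          print(f"T=25 C, Td=15 C -> LCL ~ {lcl_height_m(25.0, 15.0):.0f} m")   # ~1250 m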

  16. A Dynamic Operation Permission Technique Based on an MFM Model and Numerical Simulation

    International Nuclear Information System (INIS)

    Akio, Gofuku; Masahiro, Yonemura

    2011-01-01

    It is important to support operator activities in an abnormal plant situation where many counter actions are taken in a relatively short time. The authors proposed a technique called dynamic operation permission to decrease human errors without eliminating the creative ideas of operators to cope with an abnormal plant situation, by checking if the counter action taken is consistent with the emergency operation procedure. If the counter action is inconsistent, a dynamic operation permission system warns the operators. It also explains how and why the counter action is inconsistent and what influence will appear in the future plant behavior, using a qualitative influence inference technique based on an MFM (Multilevel Flow Modeling) model. However, the previous dynamic operation permission is not able to explain quantitative effects on future plant behavior. Moreover, many possible influence paths are derived because qualitative reasoning does not give a solution when positive and negative influences are propagated to the same node. This study extends dynamic operation permission by combining qualitative reasoning with a numerical simulation technique. The qualitative reasoning based on an MFM model of the plant derives all possible influence propagation paths. Then, a numerical simulation gives a prediction of future plant behavior in the case of taking a counter action. Influence propagations that do not coincide with the simulation results are excluded from the possible influence paths. The extended technique is implemented in a dynamic operation permission system for an oil refinery plant. An MFM model and a static numerical simulator are developed. The results of dynamic operation permission for some abnormal plant situations show the improvement in the accuracy of dynamic operation permission and in the quality of the explanation of the effects of the counter action taken.
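
    A minimal sketch of the kind of qualitative, signed influence propagation described above is given below. The node names, edges and conflict handling are hypothetical illustrations only; the actual system derives influence paths from an MFM model and resolves conflicting positive/negative influences with the numerical simulator.

      # A minimal sketch of qualitative influence propagation over a signed,
      # directed graph, in the spirit of the MFM-based reasoning described above.
      # Node names and edges are hypothetical; a real system derives them from an
      # MFM model and resolves conflicting (+/-) influences with a simulator.
      from collections import deque

      # edge (u, v, sign): a change in u propagates to v with the given sign (+1/-1)
      EDGES = [("feed_valve_opening", "feed_flow", +1),
               ("feed_flow", "drum_level", +1),
               ("drum_level", "steam_quality", -1),
               ("feed_flow", "furnace_temperature", -1)]

      def propagate(start, direction):
          """Return the qualitative effect (+1/-1) of changing `start` on every reachable node."""
          effects = {start: direction}
          queue = deque([start])
          while queue:
              node = queue.popleft()
              for u, v, sign in EDGES:
                  if u == node and v not in effects:      # first derivation wins in this toy
                      effects[v] = effects[u] * sign
                      queue.append(v)
          return effects

      if __name__ == "__main__":
          for node, eff in propagate("feed_valve_opening", +1).items():
              print(f"{node}: {'increase' if eff > 0 else 'decrease'}")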

  17. Radiation hardening techniques for rare-earth based optical fibers and amplifiers

    International Nuclear Information System (INIS)

    Girard, Sylvain; Marcandella, Claude; Vivona, Marilena; Prudenzano, Luciano Mescia F.; Laurent, Arnaud; Robin, Thierry; Cadier, Benoit; Pinsard, Emmanuel; Ouerdane, Youcef; Boukenter, Aziz; Cannas, Marco; Boscaino, Roberto

    2012-01-01

    Er/Yb doped fibers and amplifiers have been shown to be very radiation sensitive, limiting their integration in space. We present an approach including successive hardening techniques to enhance their radiation tolerance. The efficiency of our approach is demonstrated by comparing the radiation responses of optical amplifiers made with the same lengths of different rare-earth doped fibers and exposed to gamma-rays. Previous studies indicated that such amplifiers suffered significant degradation for doses exceeding 10 krad. Applying our techniques significantly enhances the amplifier radiation resistance, resulting in very limited degradation up to 50 krad. Our optimization techniques concern the fiber composition, some possible pre-treatments, and the use of simulation tools to harden the amplifiers by design. We showed that adding cerium inside the fiber phospho-silicate-based core strongly decreases the fiber radiation sensitivity compared to the standard fiber. For both fibers, a pre-treatment with hydrogen further enhances the fiber resistance. Furthermore, simulation tools can also be used to improve the tolerance of the fiber amplifier by helping to identify the best amplifier configuration for operation in the radiative environment. (authors)

  18. MySQL based selection of appropriate indexing technique in ...

    African Journals Online (AJOL)

    This paper deals with the selection of an appropriate indexing technique applied on a MySQL database for a health care system, and related performance issues, using a multiclass support vector machine (SVM). The patient database is generally huge and contains a lot of variations. For the quick search or fast retrieval of the desired ...

  19. an architecture-based technique to mobile contact recommendation

    African Journals Online (AJOL)

    user

    Aside being able to store the name of contacts and their phone numbers, there are ... the artificial neural network technique [21], along with ... Recommendation is part of everyday life. This concept ... However, to use RSs some level of intelligence must be ...... [3] Min J.-K. & Cho S.-B.Mobile Human Network Management.

  20. MRA Based Efficient Database Storing and Fast Querying Technique

    Directory of Open Access Journals (Sweden)

    Mitko Kostov

    2017-02-01

    Full Text Available In this paper we consider a specific way of organizing 1D signal or 2D image databases such that more efficient storage and faster querying are achieved. A multiresolution technique of data processing is used in order to save the most significant processed data.
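
    A minimal sketch of the underlying multiresolution idea, assuming a simple Haar transform: decompose a 1D signal, keep only the most significant coefficients, and reconstruct on demand. The paper's actual database organization and querying scheme is not reproduced here.

      # Multiresolution sketch: Haar-transform a 1-D signal, keep only the largest
      # coefficients for storage, and invert on demand. Only illustrates "store the
      # most significant processed data"; not the paper's storage/query scheme.
      import numpy as np

      def haar_forward(x):
          x = np.asarray(x, dtype=float)
          coeffs = []
          while len(x) > 1:
              avg = (x[0::2] + x[1::2]) / 2.0
              diff = (x[0::2] - x[1::2]) / 2.0
              coeffs.append(diff)
              x = avg
          coeffs.append(x)                # final approximation coefficient
          return coeffs

      def haar_inverse(coeffs):
          x = coeffs[-1]
          for diff in reversed(coeffs[:-1]):
              out = np.empty(2 * len(diff))
              out[0::2] = x + diff
              out[1::2] = x - diff
              x = out
          return x

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          signal = np.sin(np.linspace(0, 4 * np.pi, 64)) + 0.05 * rng.normal(size=64)
          coeffs = haar_forward(signal)
          flat = np.concatenate(coeffs)
          flat[np.abs(flat) < np.quantile(np.abs(flat), 0.75)] = 0.0   # keep top 25 % only
          kept, pos = [], 0
          for c in coeffs:                # split back into per-level arrays
              kept.append(flat[pos:pos + len(c)]); pos += len(c)
          print("reconstruction RMSE:", np.sqrt(np.mean((haar_inverse(kept) - signal) ** 2)))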

  1. Proportion of U.S. Civilian Population Ineligible for U.S. Air Force Enlistment Based on Current and Previous Weight Standards

    National Research Council Canada - National Science Library

    D'Mello, Tiffany A; Yamane, Grover K

    2007-01-01

    .... Until recently, gender-specific weight standards based on height were in place. However, in June 2006 the USAF implemented a new set of height-weight limits utilizing body mass index (BMI) criteria...

  2. A New Three Dimensional Based Key Generation Technique in AVK

    Science.gov (United States)

    Banerjee, Subhasish; Dutta, Manash Pratim; Bhunia, Chandan Tilak

    2017-08-01

    In the modern era, ensuring high-order security has become the one and only objective of computer networks. Over the last few decades, many researchers have contributed to achieving secrecy over the communication channel. In achieving perfect security, Shannon did the pioneering work on the perfect secrecy theorem and illustrated that the secrecy of shared information can be maintained if the key is variable in nature instead of static. In this regard, a key generation technique has been proposed where the key can be changed every time a new block of data needs to be exchanged. In our scheme, the keys vary not only in bit sequence but also in size. An experimental study is also included in this article to prove the correctness and effectiveness of the proposed technique.
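
    A toy illustration of the automatic variable key (AVK) idea is sketched below: the key used for each block is derived from the previous key and the previously exchanged block, so it changes with every exchange. The specific evolution rule (XOR of the previous key with the previous plaintext block) and the use of XOR in place of a real cipher are assumptions for illustration; the proposed scheme additionally varies the key size, which is not modelled here.

      # Toy AVK illustration: the key evolves after every block. The evolution rule
      # and XOR "cipher" are assumptions for illustration only; the paper's scheme
      # also varies the key size, which this sketch does not model.
      def xor_bytes(a: bytes, b: bytes) -> bytes:
          return bytes(x ^ y for x, y in zip(a, b))

      def avk_encrypt(blocks, initial_key):
          key, out = initial_key, []
          for block in blocks:
              out.append(xor_bytes(block, key))
              key = xor_bytes(key, block)       # key evolves after every block
          return out

      def avk_decrypt(cipher_blocks, initial_key):
          key, out = initial_key, []
          for cblock in cipher_blocks:
              block = xor_bytes(cblock, key)
              out.append(block)
              key = xor_bytes(key, block)       # receiver evolves the key identically
          return out

      if __name__ == "__main__":
          k0 = b"\x13\x37\xc0\xde"
          data = [b"blk1", b"blk2", b"blk3"]    # equal-sized 4-byte blocks
          assert avk_decrypt(avk_encrypt(data, k0), k0) == data
          print("round trip OK, key changed every block")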

  3. A Review On Segmentation Based Image Compression Techniques

    Directory of Open Access Journals (Sweden)

    S.Thayammal

    2013-11-01

    Full Text Available Abstract - The storage and transmission of imagery have become a more challenging task in the current scenario of multimedia applications. Hence, an efficient compression scheme is highly essential for imagery, as it reduces the requirements for storage media and transmission bandwidth. Besides improved performance, compression techniques must also converge quickly in order to apply them to real time applications. Various algorithms have been developed for image compression, but each has its own pros and cons. Here, an extensive analysis of the existing methods is performed. The use of existing work for developing novel techniques, which face the challenging task of image storage and transmission in multimedia applications, is also highlighted.

  4. Brain tumor segmentation based on a hybrid clustering technique

    Directory of Open Access Journals (Sweden)

    Eman Abdel-Maksoud

    2015-03-01

    This paper presents an efficient image segmentation approach using the K-means clustering technique integrated with the Fuzzy C-means algorithm. It is followed by thresholding and level set segmentation stages to provide accurate brain tumor detection. The proposed technique benefits from the minimal computation time of K-means clustering and from the accuracy of Fuzzy C-means. The performance of the proposed image segmentation approach was evaluated by comparing it with some state-of-the-art segmentation algorithms in terms of accuracy, processing time, and performance. The accuracy was evaluated by comparing the results with the ground truth of each processed image. The experimental results clarify the effectiveness of our proposed approach in dealing with a larger number of segmentation problems via improving the segmentation quality and accuracy in minimal execution time.
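
    A minimal sketch of the hybrid idea on 1D pixel intensities, under simplifying assumptions: K-means provides fast initial centroids, then fuzzy C-means refines them with soft memberships. The paper's full pipeline (thresholding, level sets, handling of real MRI volumes) is not reproduced.

      # Hybrid clustering sketch on 1-D intensities: K-means gives fast initial
      # centroids, then fuzzy C-means (FCM) refines them with soft memberships.
      import numpy as np

      def kmeans_1d(x, k, n_iter=20, seed=0):
          rng = np.random.default_rng(seed)
          centers = rng.choice(x, size=k, replace=False).astype(float)
          for _ in range(n_iter):
              labels = np.abs(x[:, None] - centers[None, :]).argmin(axis=1)
              for j in range(k):
                  if np.any(labels == j):
                      centers[j] = x[labels == j].mean()
          return centers

      def fuzzy_cmeans_1d(x, centers, m=2.0, n_iter=30, eps=1e-9):
          for _ in range(n_iter):
              d = np.abs(x[:, None] - centers[None, :]) + eps      # distances to centres
              u = 1.0 / (d ** (2.0 / (m - 1.0)))                   # unnormalised memberships
              u /= u.sum(axis=1, keepdims=True)
              centers = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
          return centers, u

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          intensities = np.concatenate([rng.normal(40, 5, 500),    # background
                                        rng.normal(120, 8, 400),   # healthy tissue
                                        rng.normal(200, 6, 100)])  # bright "lesion"
          init = kmeans_1d(intensities, k=3)
          centers, memberships = fuzzy_cmeans_1d(intensities, init)
          print("refined cluster centres:", np.sort(np.round(centers, 1)))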

  5. LFC based adaptive PID controller using ANN and ANFIS techniques

    Directory of Open Access Journals (Sweden)

    Mohamed I. Mosaad

    2014-12-01

    Full Text Available This paper presents an adaptive PID Load Frequency Control (LFC) for power systems using Neuro-Fuzzy Inference Systems (ANFIS) and Artificial Neural Networks (ANN) oriented by a Genetic Algorithm (GA). The PID controller parameters are tuned off-line by using the GA to minimize the integral of the squared error over a wide range of load variations. The values of the PID controller parameters obtained from the GA are used to train both the ANFIS and the ANN. Therefore, the two proposed techniques can tune the PID controller parameters online for optimal response at any other load point within the operating range. Testing of the developed techniques shows that the adaptive PID-LFC can preserve optimal performance over the whole loading range. The results signify the superiority of ANFIS over ANN in terms of performance measures.
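
    For reference, the discrete PID law whose gains (Kp, Ki, Kd) such a GA/ANN/ANFIS layer would tune is sketched below on a toy first-order plant. The plant model and the tuning layers themselves are hypothetical and not part of the paper.

      # Minimal discrete PID control law; the GA would tune (kp, ki, kd) off-line
      # and the ANN/ANFIS would adapt them on-line. The toy plant below is only a
      # stand-in for the frequency-deviation dynamics.
      class PID:
          def __init__(self, kp, ki, kd, dt):
              self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
              self.integral = 0.0
              self.prev_error = 0.0

          def update(self, setpoint, measurement):
              error = setpoint - measurement
              self.integral += error * self.dt
              derivative = (error - self.prev_error) / self.dt
              self.prev_error = error
              return self.kp * error + self.ki * self.integral + self.kd * derivative

      if __name__ == "__main__":
          # Toy first-order plant: dx/dt = (-x + u) / tau
          pid, x, tau, dt = PID(kp=2.0, ki=1.0, kd=0.05, dt=0.01), 0.0, 0.5, 0.01
          for _ in range(1000):
              u = pid.update(setpoint=1.0, measurement=x)
              x += dt * (-x + u) / tau
          print(f"output after 10 s: {x:.3f} (setpoint 1.0)")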

  6. New technique for producing the alloys based on transition metals

    International Nuclear Information System (INIS)

    Dolukhanyan, S.K.; Aleksanyan, A.G.; Shekhtman, V.Sh.; Mantashyan, A.A.; Mayilyan, D.G.; Ter-Galstyan, O.P.

    2007-01-01

    A principally new technique was elaborated for obtaining alloys of refractory metals by compacting their hydrides and subsequent dehydrogenation. The elaborated technique is described. The conditions of alloy formation from different hydrides of the appropriate metals were investigated in detail. The influence of process parameters such as chemical peculiarities, composition of the source hydrides, and phase transformation during dehydrogenation on alloy formation was established. Binary and ternary alloys of the α and ω phases (Ti0.8Zr0.8; Ti0.66Zr0.33; Ti0.3Zr0.8; Ti0.2Zr0.8; Ti0.8Hf0.2; Ti0.6Hf0.4; Ti0.66Zr0.23Hf0.11; etc.) were obtained. Using the elaborated special hydride cycle, a previously unknown effective process for the formation of alloys of transition metals was realized. The dependence of the final alloy structure on the composition of the initial mixture and the hydrogen content in the source hydrides was established.

  7. Sample preparation for large-scale bioanalytical studies based on liquid chromatographic techniques.

    Science.gov (United States)

    Medvedovici, Andrei; Bacalum, Elena; David, Victor

    2018-01-01

    The quality of the analytical data obtained for large-scale and long-term bioanalytical studies based on liquid chromatography depends on a number of experimental factors, including the choice of sample preparation method. This review discusses this tedious part of bioanalytical studies, applied to large-scale samples and using liquid chromatography coupled with different detector types as the core analytical technique. The main sample preparation methods included in this paper are protein precipitation, liquid-liquid extraction, solid-phase extraction, derivatization, and their variants. They are discussed in terms of analytical performance, fields of application, advantages and disadvantages. The cited literature covers mainly the analytical achievements of the last decade, although several earlier papers have become more valuable with time and are included in this review. Copyright © 2017 John Wiley & Sons, Ltd.

  8. EVE: Explainable Vector Based Embedding Technique Using Wikipedia

    OpenAIRE

    Qureshi, M. Atif; Greene, Derek

    2017-01-01

    We present an unsupervised explainable word embedding technique, called EVE, which is built upon the structure of Wikipedia. The proposed model defines the dimensions of a semantic vector representing a word using human-readable labels, thereby making it readily interpretable. Specifically, each vector is constructed using the Wikipedia category graph structure together with the Wikipedia article link structure. To test the effectiveness of the proposed word embedding model, we consider its usefulne...

  9. Voltage Stabilizer Based on SPWM technique Using Microcontroller

    OpenAIRE

    K. N. Tarchanidis; J. N. Lygouras; P. Botsaris

    2013-01-01

    This paper presents an application of the well known SPWM technique on a voltage stabilizer, using a microcontroller. The stabilizer is AC/DC/AC type. So, the system rectifies the input AC voltage to a suitable DC level and the intelligent control of an embedded microcontroller regulates the pulse width of the output voltage in order to produce through a filter a perfect sinusoidal AC voltage. The control program on the microcontroller has the ability to change the FET transistor ...

  10. Optimization of Stereotactic Radiotherapy Treatment Delivery Technique for Base-Of-Skull Meningiomas

    International Nuclear Information System (INIS)

    Clark, Brenda G.; Candish, Charles; Vollans, Emily; Gete, Ermias; Lee, Richard; Martin, Monty; Ma, Roy; McKenzie, Michael

    2008-01-01

    This study compares static conformal field (CF), intensity modulated radiotherapy (IMRT), and dynamic arcs (DA) for the stereotactic radiotherapy of base-of-skull meningiomas. Twenty-one cases of base-of-skull meningioma (median planning target volume [PTV] = 21.3 cm³) previously treated with stereotactic radiotherapy were replanned with each technique. The plans were compared for Radiation Therapy Oncology Group conformity index (CI) and homogeneity index (HI), and doses to normal structures at 6 dose values from 50.4 Gy to 5.6 Gy. The mean CI was 1.75 (CF), 1.75 (DA), and 1.66 (IMRT). At PTV sizes above 25 cm³, the CI for IMRT was always superior to the CI for DA and CF. At PTV sizes below 25 cm³, there was no significant difference in CI between the techniques. There was no significant difference in HI between plans. The total volume of normal tissue receiving 50.4, 44.8, and 5.6 Gy was significantly lower for IMRT than for the CF and DA plans. IMRT was therefore preferable for PTVs larger than 25 cm³, due to improved conformity and normal tissue sparing, in particular for the brain stem and ipsilateral temporal lobe.
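
    The indices quoted above are commonly defined as CI = prescription isodose volume / target volume and HI = maximum dose / prescription dose; assuming those standard definitions (they are not restated in the record), the helper functions below show how they are computed. The numbers in the example are synthetic.

      # Assumed standard RTOG definitions (not restated in the record):
      #   CI = V_RI / TV   (prescription-isodose volume over target volume)
      #   HI = D_max / RI  (maximum dose over prescription dose)
      # A perfectly conformal plan has CI = 1. Example values are synthetic.
      def conformity_index(prescription_isodose_volume_cc, target_volume_cc):
          return prescription_isodose_volume_cc / target_volume_cc

      def homogeneity_index(max_dose_gy, prescription_dose_gy):
          return max_dose_gy / prescription_dose_gy

      if __name__ == "__main__":
          print(f"CI = {conformity_index(30.0, 20.0):.2f}")   # isodose volume larger than PTV
          print(f"HI = {homogeneity_index(55.4, 50.4):.2f}")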

  11. Research and development of LANDSAT-based crop inventory techniques

    Science.gov (United States)

    Horvath, R.; Cicone, R. C.; Malila, W. A. (Principal Investigator)

    1982-01-01

    A wide spectrum of technology pertaining to the inventory of crops using LANDSAT without in situ training data is addressed. Methods considered include Bayesian based through-the-season methods, estimation technology based on analytical profile fitting methods, and expert-based computer aided methods. Although the research was conducted using U.S. data, the adaptation of the technology to the Southern Hemisphere, especially Argentina was considered.

  12. Hiding Techniques for Dynamic Encryption Text based on Corner Point

    Science.gov (United States)

    Abdullatif, Firas A.; Abdullatif, Alaa A.; al-Saffar, Amna

    2018-05-01

    A hiding technique for dynamic encrypted text using an encoding table and a symmetric encryption method (the AES algorithm) is presented in this paper. The encoding table is generated dynamically from the MSBs of the cover image points and is used as the first phase of encryption. The Harris corner point algorithm is applied to the cover image to generate the corner points, which are used to generate a dynamic AES key for the second phase of text encryption. The embedding process places the data in the LSBs of the image pixels, excluding the Harris corner points, for more robustness. Experimental results have demonstrated that the proposed scheme achieves good embedding quality, error-free text recovery, and high PSNR values.
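
    A minimal sketch of the LSB embedding step alone is given below: an already encrypted byte payload is hidden in the least significant bits of a grayscale cover image and read back. The AES encryption, the dynamic encoding table and the Harris corner-point exclusion described above are assumed to happen upstream and are not reproduced.

      # LSB embedding/extraction only; the AES step, encoding table and Harris
      # corner-point exclusion are assumed to happen upstream of this sketch.
      import numpy as np

      def embed_lsb(cover, payload: bytes):
          bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
          flat = cover.flatten()
          if len(bits) > flat.size:
              raise ValueError("payload too large for cover image")
          flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits      # overwrite LSBs
          return flat.reshape(cover.shape)

      def extract_lsb(stego, n_bytes: int):
          bits = stego.flatten()[:n_bytes * 8] & 1
          return np.packbits(bits).tobytes()

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
          secret = b"ciphertext-from-AES-step"                     # placeholder payload
          stego = embed_lsb(cover.copy(), secret)
          assert extract_lsb(stego, len(secret)) == secret
          print("recovered payload, max pixel change:", int(np.abs(stego.astype(int) - cover).max()))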

  13. Skin fluorescence model based on the Monte Carlo technique

    Science.gov (United States)

    Churmakov, Dmitry Y.; Meglinski, Igor V.; Piletsky, Sergey A.; Greenhalgh, Douglas A.

    2003-10-01

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores, which follows the packing of the collagen fibers, whereas in the epidermis and stratum corneum the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the NIR spectral region, while the fluorescence of a sensor layer embedded in the epidermis is localized at the adjusted depth. The model is also able to simulate the skin fluorescence spectra.

  14. A Computer Based Moire Technique To Measure Very Small Displacements

    Science.gov (United States)

    Sciammarella, Cesar A.; Amadshahi, Mansour A.; Subbaraman, B.

    1987-02-01

    The accuracy that can be achieved in the measurement of very small displacements with techniques such as moire, holography and speckle is limited by the noise inherent to the optical devices used. To reduce the noise-to-signal ratio, the moire method can be utilized. Two systems of carrier fringes are introduced: an initial system before the load is applied and a final system when the load is applied. The moire pattern of these two systems contains the sought displacement information, and the noise common to the two patterns is eliminated. The whole process is performed by a computer on digitized versions of the patterns. Examples of application are given.

  15. Indirect Fluorescent Antibody Technique based Prevalence of Surra in Equines

    Directory of Open Access Journals (Sweden)

    Ahsan Nadeem, Asim Aslam, Zafar Iqbal Chaudhary, Kamran Ashraf, Khalid Saeed, Nisar Ahmad, Ishtiaq Ahmed and Habib ur Rehman

    2011-04-01

    Full Text Available This project was carried out to find the prevalence of trypanosomiasis in equines in District Gujranwala by using the indirect fluorescent antibody technique and the thin smear method. Blood samples were collected from a total of 200 horses and donkeys of different ages and either sex. Duplicate thin blood smears were prepared from each sample and the remaining blood samples were centrifuged to separate the serum. Smears from each animal were processed for Giemsa staining and the indirect fluorescent antibody test (IFAT). Giemsa stained smears revealed Trypanosome infection in 4/200 (2.0%) samples and IFAT in 12/200 (6.0%) animals.

  16. Multimedia-Based Integration of Cross-Layer Techniques

    Science.gov (United States)

    2014-06-01


  17. GPU-Based Techniques for Global Illumination Effects

    CERN Document Server

    Szirmay-Kalos, László; Sbert, Mateu

    2008-01-01

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. This book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make this book self-contained, the most important c

  18. A fast image reconstruction technique based on ART

    International Nuclear Information System (INIS)

    Zhang Shunli; Zhang Dinghua; Wang Kai; Huang Kuidong; Li Weibin

    2007-01-01

    The Algebraic Reconstruction Technique (ART) is an iterative method for image reconstruction. Improving its reconstruction speed has been one of the important research aspects of ART. For the simplified weight-coefficient reconstruction model of ART, a fast grid traverse algorithm is proposed, which can determine the grid index by simple operations such as addition, subtraction and comparison. Since the weight coefficients are calculated in real time during iteration, a large amount of storage is saved and the reconstruction speed is greatly increased. Experimental results show that the new algorithm is very effective and the reconstruction speed is improved by about 10 times compared with the traditional algorithm. (authors)

  19. Satellite-based technique for nowcasting of thunderstorms over ...

    Indian Academy of Sciences (India)

    Suman Goyal

    2017-08-31

    Aug 31, 2017 ... Due to inadequate radar network, satellite plays the dominant role for nowcast of these thunderstorms. In this study, a nowcast based algorithm ForTracc developed by Vila ... of actual development of cumulonimbus clouds, ... MCS over Indian region using Infrared Channel ... (2016) based on case study of.

  20. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    Science.gov (United States)

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI is a function of both intervention content and the interpersonal style

  1. Five-year efficacy and safety of tenofovir-based salvage therapy for patients with chronic hepatitis B who previously failed LAM/ADV therapy.

    Science.gov (United States)

    Lim, Lucy; Thompson, Alexander; Patterson, Scott; George, Jacob; Strasser, Simone; Lee, Alice; Sievert, William; Nicoll, Amanda; Desmond, Paul; Roberts, Stuart; Marion, Kaye; Bowden, Scott; Locarnini, Stephen; Angus, Peter

    2017-06-01

    Multidrug-resistant HBV continues to be an important clinical problem. The TDF-109 study demonstrated that TDF±LAM is an effective salvage therapy through 96 weeks for LAM-resistant patients who previously failed ADV add-on or switch therapy. We evaluated the 5-year efficacy and safety outcomes in patients receiving long-term TDF±LAM in the TDF-109 study. A total of 59 patients completed the first phase of the TDF-109 study and 54/59 were rolled over into a long-term prospective open-label study of TDF±LAM 300 mg daily. Results are reported at the end of year 5 of treatment. At year 5, 75% (45/59) had achieved viral suppression by intent-to-treat analysis. Per-protocol assessment revealed 83% (45/54) were HBV DNA undetectable. Nine patients remained HBV DNA detectable; however, 8/9 had very low HBV DNA levels (<264 IU/mL) and did not meet virological criteria for virological breakthrough (VBT). One patient experienced VBT, but this was in the setting of documented non-compliance. The response was independent of baseline LAM therapy or mutations conferring ADV resistance. Four patients discontinued TDF, one patient was lost to follow-up and one died from hepatocellular carcinoma. Long-term TDF treatment appears to be safe and effective in patients with prior failure of LAM and a suboptimal response to ADV therapy. These findings confirm that TDF has a high genetic barrier to resistance, is active against multidrug-resistant HBV, and should be the preferred oral anti-HBV agent in CHB patients who fail treatment with LAM and ADV. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Population-based Neisseria gonorrhoeae, Chlamydia trachomatis and Trichomonas vaginalis Prevalence Using Discarded, Deidentified Urine Specimens Previously Collected for Drug Testing (Open Access Publisher’s Version)

    Science.gov (United States)

    2017-10-24


  3. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi; Sun, Ying

    2016-01-01

    Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widen their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.
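
    A simplified numerical sketch of the PCA-MEWMA idea described above: fit PCA on in-control data, then monitor the retained principal-component scores with a multivariate EWMA statistic. The data are synthetic and the control limit is a crude empirical placeholder rather than the chart's theoretical limit.

      # PCA + MEWMA monitoring sketch on synthetic data. The control limit below is
      # an empirical placeholder, not the chart's theoretical limit.
      import numpy as np

      rng = np.random.default_rng(0)

      # In-control training data: 3 correlated process variables.
      A = np.array([[1.0, 0.0], [0.8, 0.6], [0.5, -0.5]])
      train = rng.normal(size=(500, 2)) @ A.T + rng.normal(scale=0.1, size=(500, 3))

      mean = train.mean(axis=0)
      U, S, Vt = np.linalg.svd(train - mean, full_matrices=False)
      P = Vt[:2].T                                   # retain 2 principal components
      scores = (train - mean) @ P
      score_cov = np.cov(scores, rowvar=False)

      lam = 0.2                                      # EWMA smoothing constant
      z_cov = (lam / (2.0 - lam)) * score_cov        # asymptotic covariance of the EWMA vector
      z_cov_inv = np.linalg.inv(z_cov)

      def mewma_stats(x_new):
          """MEWMA T^2 statistic for each new observation, in time order."""
          z = np.zeros(P.shape[1])
          out = []
          for x in x_new:
              t = (x - mean) @ P                     # project onto the PCA subspace
              z = lam * t + (1.0 - lam) * z
              out.append(float(z @ z_cov_inv @ z))
          return out

      # Test data: in control for 50 samples, then a modest mean shift.
      test = rng.normal(size=(100, 2)) @ A.T + rng.normal(scale=0.1, size=(100, 3))
      test[50:] += 0.8 * A[:, 0]                     # shift inside the PCA subspace
      t2 = mewma_stats(test)
      limit = np.quantile(mewma_stats(train), 0.99)  # empirical placeholder limit
      print("first alarm at sample:", next((i for i, v in enumerate(t2) if v > limit), None))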

  4. Wavelet packet transform-based robust video watermarking technique

    Indian Academy of Sciences (India)

    If any conflict happens to the copyright identification and authentication, ... the present work is concentrated on the robust digital video watermarking. .... the wavelet decomposition, resulting in a new family of orthonormal bases for function ...

  5. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi

    2016-06-13

    Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widen their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.

  6. An RSS based location estimation technique for cognitive relay networks

    KAUST Repository

    Qaraqe, Khalid A.; Hussain, Syed Imtiaz; Ç elebi, Hasari Burak; Abdallah, Mohamed M.; Alouini, Mohamed-Slim

    2010-01-01

    In this paper, a received signal strength (RSS) based location estimation method is proposed for a cooperative wireless relay network where the relay is a cognitive radio. We propose a method for the considered cognitive relay network to determine
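
    The record above is truncated, so the following is only a generic sketch of RSS-based ranging: invert a log-distance path-loss model to obtain distance estimates from anchor nodes, then solve a linearized least-squares trilateration. The path-loss exponent, reference power and anchor layout are assumptions, not the paper's scheme.

      # Generic RSS ranging + trilateration sketch; model parameters are assumed.
      import numpy as np

      P0, N_EXP = -40.0, 2.7          # RSS at 1 m (dBm) and path-loss exponent (assumed)

      def rss_to_distance(rss_dbm):
          return 10.0 ** ((P0 - rss_dbm) / (10.0 * N_EXP))

      def trilaterate(anchors, distances):
          """Linear least squares: subtract the first anchor's circle equation from the rest."""
          x0, d0 = anchors[0], distances[0]
          A = 2.0 * (anchors[1:] - x0)
          b = d0**2 - distances[1:]**2 + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2)
          return np.linalg.lstsq(A, b, rcond=None)[0]

      if __name__ == "__main__":
          anchors = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
          true_pos = np.array([18.0, 31.0])
          d_true = np.linalg.norm(anchors - true_pos, axis=1)
          rss = P0 - 10.0 * N_EXP * np.log10(d_true) + np.random.default_rng(0).normal(0, 1.0, 4)
          est = trilaterate(anchors, rss_to_distance(rss))
          print("estimated position:", np.round(est, 1), "true:", true_pos)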

  7. An Automated Sorting System Based on Virtual Instrumentation Techniques

    Directory of Open Access Journals (Sweden)

    Rodica Holonec

    2008-07-01

    Full Text Available The application presented in this paper is an experimental model of an automated sorting system for pieces of the same shape but different sizes and/or colors. The classification is made according to two features: the color and the weight of the pieces. The system is a complex combination of NI Vision hardware and software tools, strain gauge transducers, signal conditioning connected to data acquisition boards, and motion and control elements. The system is very useful for students to learn and experiment with different virtual instrumentation techniques in order to be able to develop a large field of applications, from inspection and process control to sorting and assembly.

  8. Innovative instrumentation for VVERs based in non-invasive techniques

    International Nuclear Information System (INIS)

    Jeanneau, H.; Favennec, J.M.; Tournu, E.; Germain, J.L.

    2000-01-01

    Nuclear power plants such as VVERs can greatly benefit from innovative instrumentation to improve plant safety and efficiency. In recent years innovative instrumentation has been developed for PWRs with the aim of providing additional measurements of physical parameters on the primary and secondary circuits: the addition of new instrumentation is made possible by using non-invasive techniques such as ultrasonics and radiation detection. These innovations can be adapted for upgrading VVERs presently in operation and also in future VVERs. The following innovative instrumentation for the control, monitoring or testing at VVERs is described: 1. instrumentation for more accurate primary side direct measurements (for a better monitoring of the primary circuit); 2. instrumentation to monitor radioactivity leaks (for a safer plant); 3. instrumentation-related systems to improve the plant efficiency (for a cheaper kWh)

  9. Extending Driving Vision Based on Image Mosaic Technique

    Directory of Open Access Journals (Sweden)

    Chen Deng

    2017-01-01

    Full Text Available Car cameras have been used extensively to assist driving by making the surroundings of the vehicle visible. However, due to the limitation of the Angle of View (AoV), dead zones still exist, which are a primary cause of car accidents. In this paper, we introduce a system to extend the vision of drivers to 360 degrees. Our system consists of four wide-angle cameras, which are mounted on different sides of a car. Although the AoV of each camera is within 180 degrees, relying on the image mosaic technique, our system can seamlessly integrate the 4-channel videos into a panorama video. The panorama video enables drivers to observe everything around the car as far as three meters away from a top view. We performed experiments in a laboratory environment. Preliminary results show that our system can eliminate the vision dead zone completely. Additionally, the real-time performance of our system can satisfy the requirements for practical use.

  10. Diagnostic Accuracy of Robot-Guided, Software Based Transperineal MRI/TRUS Fusion Biopsy of the Prostate in a High Risk Population of Previously Biopsy Negative Men

    Directory of Open Access Journals (Sweden)

    Malte Kroenig

    2016-01-01

    Full Text Available Objective. In this study, we compared prostate cancer detection rates between MRI-TRUS fusion targeted and systematic biopsies using a robot-guided, software based transperineal approach. Methods and Patients. 52 patients received an MRI/TRUS fusion biopsy followed by a systematic volume adapted biopsy using the same robot-guided transperineal approach. The primary outcome was the detection rate of clinically significant disease (Gleason grade ≥ 4). Secondary outcomes were the detection rate of all cancers, sampling efficiency and utility, and the serious adverse event rate. Patients received no antibiotic prophylaxis. Results. From 52 patients, 519 targeted biopsies from 135 lesions and 1561 random biopsies were generated (total n=2080). The overall detection rate of clinically significant PCa was 44.2% (23/52) and 50.0% (26/52) for target and random biopsy, respectively. Sampling efficiency, as the median number of cores needed to detect clinically significant prostate cancer, was 9 for target (IQR: 6–14.0) and 32 (IQR: 24–32) for random biopsy. The utility, as the number of additionally detected clinically significant PCa cases by either strategy, was 0% (0/52) for target and 3.9% (2/52) for random biopsy. Conclusions. MRI/TRUS fusion based target biopsy did not show an advantage in the overall detection rate of clinically significant prostate cancer.

  11. Satisfactory patient-based outcomes after surgical treatment for idiopathic clubfoot: includes surgeon's individualized technique.

    Science.gov (United States)

    Mahan, Susan T; Spencer, Samantha A; Kasser, James R

    2014-09-01

    Treatment of idiopathic clubfoot has shifted towards Ponseti technique, but previously surgical management was standard. Outcomes of surgery have varied, with many authors reporting discouraging results. Our purpose was to evaluate a single surgeon's series of children with idiopathic clubfoot treated with a la carte posteromedial and lateral releases using the Pediatric Outcomes Data Collection Instrument (PODCI) with a minimum of 2-year follow-up. A total of 148 patients with idiopathic clubfoot treated surgically by a single surgeon over 15 years were identified, and mailed PODCI questionnaires. Fifty percent of the patients were located and responded, resulting in 74 complete questionnaires. Median age at surgery was 10 months (range, 5.3 to 84.7 mo), male sex 53/74 (71.6%), bilateral surgery 31/74 (41.9%), and average follow-up of 9.7 years. PODCI responses were compared with previously published normal healthy controls using t test for each separate category. Included in the methods is the individual surgeon's operative technique. In PODCIs where a parent reports for their child or adolescent, there was no difference between our data and the healthy controls in any of the 5 categories. In PODCI where an adolescent self-reports, there was no difference in 4 of 5 categories; significant difference was only found between our data (mean = 95.2; SD = 7.427) and normal controls (mean = 86.3; SD = 12.5) in Happiness Scale (P = 0.0031). In this group of idiopathic clubfoot patients, treated with judicious posteromedial release by a single surgeon, primarily when surgery was treatment of choice for clubfoot, patient-based outcomes are not different from their normal healthy peers through childhood and adolescence. While Ponseti treatment has since become the treatment of choice for clubfoot, surgical treatment, in some hands, has led to satisfactory results. Level III.

  12. A framework for laboratory pre-work based on the concepts, tools and techniques questioning method

    International Nuclear Information System (INIS)

    Huntula, J; Sharma, M D; Johnston, I; Chitaree, R

    2011-01-01

    Learning in the laboratory is different from learning in other contexts because students have to engage with various aspects of the practice of science. They have to use many skills and knowledge in parallel-not only to understand the concepts of physics but also to use the tools and analyse the data. The question arises, how to best guide students' learning in the laboratory. This study is about creating and using questions with a specifically designed framework to aid learning in the laboratory. The concepts, tools and techniques questioning (CTTQ) method was initially designed and used at Mahidol University, Thailand, and was subsequently extended to laboratory pre-work at the University of Sydney. The CTTQ method was implemented in Sydney with 190 first-year students. Three pre-work exercises on a series of electrical experiments were created based on the CTTQ method. The pre-works were completed individually and submitted before the experiment started. Analysed pre-work, surveys and interviews were used to evaluate the pre-work questions in this study. The results indicated that the CTTQ method was successful and the flow in the experiments was better than that in the previous year. At the same time students had difficulty with the last experiment in the sequence and with techniques.

  13. Adaptive Landmark-Based Navigation System Using Learning Techniques

    DEFF Research Database (Denmark)

    Zeidan, Bassel; Dasgupta, Sakyasingha; Wörgötter, Florentin

    2014-01-01

    The goal-directed navigational ability of animals is an essential prerequisite for them to survive. They can learn to navigate to a distal goal in a complex environment. During this long-distance navigation, they exploit environmental features, like landmarks, to guide them towards their goal. Inspired by this, we develop an adaptive landmark-based navigation system based on sequential reinforcement learning. In addition, correlation-based learning is also integrated into the system to improve learning performance. The proposed system has been applied to simulated simple wheeled and more complex hexapod robots. As a result, it allows the robots to successfully learn to navigate to distal goals in complex environments.

  14. Estimating monthly temperature using point based interpolation techniques

    Science.gov (United States)

    Saaban, Azizan; Mah Hashim, Noridayu; Murat, Rusdi Indra Zuhdi

    2013-04-01

    This paper discusses the use of point based interpolation to estimate the value of temperature at unallocated meteorology stations in Peninsular Malaysia using data for the year 2010 collected from the Malaysian Meteorology Department. Two point based interpolation methods, Inverse Distance Weighted (IDW) and Radial Basis Function (RBF), are considered. The accuracy of the methods is evaluated using the Root Mean Square Error (RMSE). The results show that RBF with the thin plate spline model is suitable to be used as a temperature estimator for the months of January and December, while RBF with the multiquadric model is suitable to estimate the temperature for the rest of the months.
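
    A minimal sketch of the inverse distance weighted (IDW) estimator with a leave-one-out RMSE check on synthetic station data is shown below; the RBF variants (thin plate spline, multiquadric) and the 2010 Malaysian data themselves are not reproduced.

      # IDW estimator and leave-one-out RMSE check on synthetic "station" data.
      import numpy as np

      def idw(stations_xy, values, query_xy, power=2.0, eps=1e-12):
          d = np.linalg.norm(stations_xy - query_xy, axis=1)
          if np.any(d < eps):                       # query coincides with a station
              return float(values[np.argmin(d)])
          w = 1.0 / d ** power
          return float(np.sum(w * values) / np.sum(w))

      def loo_rmse(stations_xy, values, power=2.0):
          errs = []
          for i in range(len(values)):
              mask = np.arange(len(values)) != i
              pred = idw(stations_xy[mask], values[mask], stations_xy[i], power)
              errs.append(pred - values[i])
          return float(np.sqrt(np.mean(np.square(errs))))

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          xy = rng.uniform(0, 100, size=(30, 2))                      # synthetic station coordinates
          temp = 27.0 + 0.03 * xy[:, 0] + rng.normal(0, 0.3, 30)      # synthetic monthly mean temps
          print(f"IDW estimate at (50, 50): {idw(xy, temp, np.array([50.0, 50.0])):.2f} C")
          print(f"leave-one-out RMSE: {loo_rmse(xy, temp):.2f} C")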

  15. Product and process effectiveness using performance-based auditing techniques

    International Nuclear Information System (INIS)

    Horseman, M.L.

    1995-01-01

    Focus is the backbone of genius. Focus is the lifeblood of adequate products and effective processes. Focus is the theme of Performance-Based Audits (PBA). The Civilian Radioactive Waste Management (CRWM) Program is using the PBA tool extensively to focus on the evaluation of product adequacy and process effectiveness. The term Performance-Based Audit has been around for several years. However, the approach presented here for the systematic end-product selection, planning, and measurement of adequacy and effectiveness is new and innovative.

  16. New Diagnosis of AIDS Based on Salmonella enterica subsp. I (enterica) Enteritidis (A) Meningitis in a Previously Immunocompetent Adult in the United States

    Directory of Open Access Journals (Sweden)

    Andrew C. Elton

    2017-01-01

    Full Text Available Salmonella meningitis is a rare manifestation of meningitis typically presenting in neonates and the elderly. This infection typically associates with foodborne outbreaks in developing nations and AIDS-endemic regions. We report a case of a 19-year-old male presenting with altered mental status after a 3-day absence from work at a Wisconsin tourist area. He was febrile, tachycardic, and tachypneic with a GCS of 8. The patient was intubated and a presumptive diagnosis of meningitis was made. Treatment was initiated with ceftriaxone, vancomycin, acyclovir, dexamethasone, and fluid resuscitation. A lumbar puncture showed cloudy CSF with Gram negative rods. He was admitted to the ICU. CSF culture confirmed Salmonella enterica subsp. I (enterica) Enteritidis (A). Based on this finding, a 4th-generation HIV antibody/p24 antigen test was sent. When this returned positive, a CD4 count was obtained and showed 3 cells/mm3, confirming AIDS. The patient ultimately received 38 days of ceftriaxone, was placed on elvitegravir, cobicistat, emtricitabine, and tenofovir alafenamide (Genvoya) for HIV/AIDS, and was discharged neurologically intact after a 44-day admission.

  17. Detection and sizing of cracks using potential drop techniques based on electromagnetic induction

    International Nuclear Information System (INIS)

    Sato, Yasumoto; Kim, Hoon

    2011-01-01

    The potential drop techniques based on electromagnetic induction are classified into induced current focused potential drop (ICFPD) technique and remotely induced current potential drop (RICPD) technique. The possibility of numerical simulation of the techniques is investigated and the applicability of these techniques to the measurement of defects in conductive materials is presented. Finite element analysis (FEA) for the RICPD measurements on the plate specimen containing back wall slits is performed and calculated results by FEA show good agreement with experimental results. Detection limit of the RICPD technique in depth of back wall slits can also be estimated by FEA. Detection and sizing of artificial defects in parent and welded materials are successfully performed by the ICFPD technique. Applicability of these techniques to detection of cracks in field components is investigated, and most of the cracks in the components investigated are successfully detected by the ICFPD and RICPD techniques. (author)

  18. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Ali Haghizadeh

    2017-11-23

    Nov 23, 2017 ... regions. This study shows the potency of two GIS-based data driven ... growth of these tools has also prepared another ..... Urban. 30467. 3. 0.06. 0.20. 0.74. 0.80. −0.64. Distance from road ..... and artificial neural networks for potential groundwater .... ping: A case study at Mehran region, Iran; Catena 137.

  19. Customer requirements based ERP customization using AHP technique

    NARCIS (Netherlands)

    Parthasarathy, S.; Daneva, Maia

    2014-01-01

    Purpose– Customization is a difficult task for many organizations implementing enterprise resource planning (ERP) systems. The purpose of this paper is to develop a new framework based on customers’ requirements to examine the ERP customization choices for the enterprise. The analytical hierarchy
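
    The core AHP computation behind such a framework, sketched under the usual eigenvector formulation: derive a priority vector from a pairwise comparison matrix via its principal eigenvector and check the consistency ratio. The criteria and judgements below are hypothetical, not taken from the paper.

      # AHP priority vector and consistency ratio from a pairwise comparison
      # matrix; the criteria and judgements are hypothetical examples.
      import numpy as np

      RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}     # Saaty's random consistency index

      def ahp_priorities(M):
          vals, vecs = np.linalg.eig(M)
          k = np.argmax(vals.real)
          w = np.abs(vecs[:, k].real)
          w /= w.sum()                                  # priority vector
          n = M.shape[0]
          ci = (vals[k].real - n) / (n - 1)             # consistency index
          cr = ci / RI[n] if RI[n] else 0.0             # consistency ratio
          return w, cr

      if __name__ == "__main__":
          criteria = ["fit to business process", "upgrade effort", "cost", "time"]
          M = np.array([[1.0, 3.0, 5.0, 5.0],
                        [1/3, 1.0, 3.0, 3.0],
                        [1/5, 1/3, 1.0, 1.0],
                        [1/5, 1/3, 1.0, 1.0]])
          w, cr = ahp_priorities(M)
          for c, p in zip(criteria, w):
              print(f"{c}: {p:.2f}")
          print(f"consistency ratio: {cr:.3f} (should be < 0.10)")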

  20. The Visual Memory-Based Memorization Techniques in Piano Education

    Science.gov (United States)

    Yucetoker, Izzet

    2016-01-01

    Problem Statement: Johann Sebastian Bach is one of the leading composers of the baroque period. In addition to his huge contributions in the artistic dimension, he also served greatly in the field of education. This study has been done for determining the impact of visual memory-based memorization practices in the piano education on the visual…

  1. Photon attenuation correction technique in SPECT based on nonlinear optimization

    International Nuclear Information System (INIS)

    Suzuki, Shigehito; Wakabayashi, Misato; Okuyama, Keiichi; Kuwamura, Susumu

    1998-01-01

    Photon attenuation correction in SPECT was made using a nonlinear optimization theory, in which an optimum image is searched so that the sum of square errors between observed and reprojected projection data is minimized. This correction technique consists of optimization and step-width algorithms, which determine at each iteration a pixel-by-pixel directional value of search and its step-width, respectively. We used the conjugate gradient and quasi-Newton methods as the optimization algorithm, and Curry rule and the quadratic function method as the step-width algorithm. Statistical fluctuations in the corrected image due to statistical noise in the emission projection data grew as the iteration increased, depending on the combination of optimization and step-width algorithms. To suppress them, smoothing for directional values was introduced. Computer experiments and clinical applications showed a pronounced reduction in statistical fluctuations of the corrected image for all combinations. Combinations using the conjugate gradient method were superior in noise characteristic and computation time. The use of that method with the quadratic function method was optimum if noise property was regarded as important. (author)
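    To make the optimization step concrete, the sketch below (Python, with hypothetical matrices and sizes) minimizes the sum of squared errors between observed and reprojected projection data using a conjugate-gradient search; the attenuated system matrix A and the step-width handling are stand-ins, not the paper's exact implementation.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical sizes: x is the flattened image, A stands in for the attenuated projector.
        rng = np.random.default_rng(0)
        n_pix, n_proj = 64, 96
        A = rng.random((n_proj, n_pix))          # stand-in attenuated system matrix
        x_true = rng.random(n_pix)
        p_obs = A @ x_true                        # observed emission projection data

        def cost(x):
            r = A @ x - p_obs                     # reprojection error
            return 0.5 * np.dot(r, r)             # sum of squared errors

        def grad(x):
            return A.T @ (A @ x - p_obs)          # gradient of the quadratic cost

        # Conjugate-gradient search, one of the optimization algorithms in the paper;
        # the step width is left to SciPy's line search rather than the Curry rule.
        res = minimize(cost, x0=np.zeros(n_pix), jac=grad, method='CG')
        x_corrected = np.clip(res.x, 0, None)     # non-negative activity estimate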

  2. Key techniques for space-based solar pumped semiconductor lasers

    Science.gov (United States)

    He, Yang; Xiong, Sheng-jun; Liu, Xiao-long; Han, Wei-hua

    2014-12-01

    In space, laser transmission is free of atmospheric turbulence, absorption, dispersion and aerosol effects. Space-based lasers therefore have important applications in satellite communication, satellite attitude control, space debris clearing, and long-distance energy transmission. On the other hand, solar energy is a clean and renewable resource; the average intensity of solar irradiation at the Earth is 1353 W/m2, and it is even higher in space. Space-based solar pumped lasers have therefore attracted much research in recent years, most of it focused on solar pumped solid state lasers and solar pumped fiber lasers. Both lasing principles are based on stimulated emission from active ions such as Nd, Yb and Cr, which absorb light only in narrow bands. This leads to inefficient absorption of the broad-band solar spectrum and increases the system heating load, making the solar-to-laser power conversion efficiency very low. A solar pumped semiconductor laser, in contrast, can absorb all photons with energy greater than the bandgap, so it can have a considerably higher efficiency than other solar pumped lasers. Besides, solar pumped semiconductor lasers have a smaller chip volume, simpler structure and better heat dissipation; they can be mounted on a small satellite platform and composed into satellite arrays, which can greatly improve the output power of the system and add flexibility. This paper summarizes the research progress of space-based solar pumped semiconductor lasers and analyses the key technologies for several application areas, including the processing of the semiconductor chip, the design of small and efficient solar condensers, and the cooling system of the lasers. We conclude that solar pumped vertical cavity surface-emitting semiconductor lasers will have wide application prospects in space.

  3. A hybrid technique for private location-based queries with database protection

    KAUST Repository

    Ghinita, Gabriel

    2009-01-01

    Mobile devices with global positioning capabilities allow users to retrieve points of interest (POI) in their proximity. To protect user privacy, it is important not to disclose exact user coordinates to un-trusted entities that provide location-based services. Currently, there are two main approaches to protect the location privacy of users: (i) hiding locations inside cloaking regions (CRs) and (ii) encrypting location data using private information retrieval (PIR) protocols. Previous work focused on finding good trade-offs between privacy and performance of user protection techniques, but disregarded the important issue of protecting the POI dataset D. For instance, location cloaking requires large-sized CRs, leading to excessive disclosure of POIs (O(|D|) in the worst case). PIR, on the other hand, reduces this bound, but at the expense of high processing and communication overhead. We propose a hybrid, two-step approach to private location-based queries, which provides protection for both the users and the database. In the first step, user locations are generalized to coarse-grained CRs which provide strong privacy. Next, a PIR protocol is applied with respect to the obtained query CR. To protect against excessive disclosure of POI locations, we devise a cryptographic protocol that privately evaluates whether a point is enclosed inside a rectangular region. We also introduce an algorithm to efficiently support PIR on dynamic POI sub-sets. Our method discloses O(1) POI, orders of magnitude fewer than CR- or PIR-based techniques. Experimental results show that the hybrid approach is scalable in practice, and clearly outperforms the pure-PIR approach in terms of computational and communication overhead. © 2009 Springer Berlin Heidelberg.

  4. Cost-Effectiveness Model for Chemoimmunotherapy Options in Patients with Previously Untreated Chronic Lymphocytic Leukemia Unsuitable for Full-Dose Fludarabine-Based Therapy.

    Science.gov (United States)

    Becker, Ursula; Briggs, Andrew H; Moreno, Santiago G; Ray, Joshua A; Ngo, Phuong; Samanta, Kunal

    2016-06-01

    To evaluate the cost-effectiveness of treatment with anti-CD20 monoclonal antibody obinutuzumab plus chlorambucil (GClb) in untreated patients with chronic lymphocytic leukemia unsuitable for full-dose fludarabine-based therapy. A Markov model was used to assess the cost-effectiveness of GClb versus other chemoimmunotherapy options. The model comprised three mutually exclusive health states: "progression-free survival (with/without therapy)", "progression (refractory/relapsed lines)", and "death". Each state was assigned a health utility value representing patients' quality of life and a specific cost value. Comparisons between GClb and rituximab plus chlorambucil or only chlorambucil were performed using patient-level clinical trial data; other comparisons were performed via a network meta-analysis using information gathered in a systematic literature review. To support the model, a utility elicitation study was conducted from the perspective of the UK National Health Service. There was good agreement between the model-predicted progression-free and overall survival and that from the CLL11 trial. On incorporating data from the indirect treatment comparisons, it was found that GClb was cost-effective with a range of incremental cost-effectiveness ratios below a threshold of £30,000 per quality-adjusted life-year gained, and remained so during deterministic and probabilistic sensitivity analyses under various scenarios. GClb was estimated to increase both quality-adjusted life expectancy and treatment costs compared with several commonly used therapies, with incremental cost-effectiveness ratios below commonly referenced UK thresholds. This article offers a real example of how to combine direct and indirect evidence in a cost-effectiveness analysis of oncology drugs. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
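    A minimal cohort-model sketch of the three-state structure and the resulting incremental cost-effectiveness ratio is given below; every transition probability, cost, utility and discount rate is an invented placeholder, not a value from the CLL11-based analysis.

        import numpy as np

        # States: 0 = progression-free, 1 = progressed, 2 = dead (absorbing).
        def run_markov(trans, cost_per_cycle, utility_per_cycle, n_cycles=120):
            """Return total discounted cost and QALYs for a cohort starting progression-free."""
            disc = 0.035                                  # annual discount rate (assumed)
            state = np.array([1.0, 0.0, 0.0])             # whole cohort starts progression-free
            total_cost = total_qaly = 0.0
            for t in range(n_cycles):                     # monthly cycles (assumed)
                d = 1.0 / (1.0 + disc) ** (t / 12.0)
                total_cost += d * state @ cost_per_cycle
                total_qaly += d * state @ utility_per_cycle / 12.0
                state = state @ trans
            return total_cost, total_qaly

        # Placeholder inputs for two hypothetical regimens.
        trans_a = np.array([[0.96, 0.03, 0.01], [0.0, 0.93, 0.07], [0.0, 0.0, 1.0]])
        trans_b = np.array([[0.94, 0.04, 0.02], [0.0, 0.93, 0.07], [0.0, 0.0, 1.0]])
        cost_a, qaly_a = run_markov(trans_a, np.array([2500.0, 1800.0, 0.0]), np.array([0.80, 0.60, 0.0]))
        cost_b, qaly_b = run_markov(trans_b, np.array([1200.0, 1800.0, 0.0]), np.array([0.78, 0.60, 0.0]))

        icer = (cost_a - cost_b) / (qaly_a - qaly_b)      # incremental cost per QALY gained
        print(f"ICER: {icer:,.0f} per QALY")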

  5. Whole arm manipulation planning based on feedback velocity fields and sampling-based techniques.

    Science.gov (United States)

    Talaei, B; Abdollahi, F; Talebi, H A; Omidi Karkani, E

    2013-09-01

    Changing the configuration of a cooperative whole arm manipulator is not easy while enclosing an object. This difficulty is mainly because of risk of jamming caused by kinematic constraints. To reduce this risk, this paper proposes a feedback manipulation planning algorithm that takes grasp kinematics into account. The idea is based on a vector field that imposes perturbation in object motion inducing directions when the movement is considerably along manipulator redundant directions. Obstacle avoidance problem is then considered by combining the algorithm with sampling-based techniques. As experimental results confirm, the proposed algorithm is effective in avoiding jamming as well as obstacles for a 6-DOF dual arm whole arm manipulator. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Eye gazing direction inspection based on image processing technique

    Science.gov (United States)

    Hao, Qun; Song, Yong

    2005-02-01

    According to research results in neural biology, human eyes obtain high resolution only at the center of the field of view. For a Virtual Reality helmet, we therefore aim to detect the gazing direction of the eyes in real time and feed it back to the control system to improve the resolution of the graphics at the center of the field of view. With current display instruments, this method can balance the field of view of the virtual scene against resolution and greatly improve the immersion of the virtual system. Detecting the gazing direction rapidly and accurately is thus the basis for realizing the design of this novel VR helmet. In this paper, the conventional method of gazing direction detection based on the Purkinje spot is introduced first. To overcome the disadvantages of that method, this paper proposes an image-processing-based method to detect and determine the gazing direction. The locations of the pupils and the shapes of the eye sockets change with the gazing direction; by analyzing these changes in eye images captured by cameras, the gazing direction can be determined. Experiments have been done to validate the efficiency of the method by analyzing such images. The algorithm detects the gazing direction directly from a normal eye image and eliminates the need for special hardware. Experimental results show that the method is easy to implement and has high precision.
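    A toy sketch of the image-processing step (threshold the dark pupil, take its centroid, and read the gaze offset from its displacement relative to the eye-region centre) is shown below; the threshold value and the input crop are illustrative assumptions, not the authors' algorithm.

        import cv2
        import numpy as np

        def gaze_offset(eye_gray):
            """Estimate gaze direction as the pupil-centroid offset from the eye-region centre."""
            blurred = cv2.GaussianBlur(eye_gray, (5, 5), 0)
            _, pupil = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)   # dark pupil -> white
            m = cv2.moments(pupil)
            if m["m00"] == 0:
                return None                                  # no pupil found
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            h, w = eye_gray.shape
            return (cx - w / 2.0, cy - h / 2.0)              # horizontal/vertical gaze offset in pixels

        # offset = gaze_offset(cv2.imread("eye_crop.png", cv2.IMREAD_GRAYSCALE))  # hypothetical input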

  7. COMPARISON AND EVALUATION OF CLUSTER BASED IMAGE SEGMENTATION TECHNIQUES

    OpenAIRE

    Hetangi D. Mehta*, Daxa Vekariya, Pratixa Badelia

    2017-01-01

    Image segmentation is the classification of an image into different groups. Numerous algorithms using different approaches have been proposed for image segmentation. A major challenge in segmentation evaluation comes from the fundamental conflict between generality and objectivity. A review is done on different types of clustering methods used for image segmentation. Also a methodology is proposed to classify and quantify different clustering algorithms based on their consistency in different...

  8. A new imaging riometer based on Mills Cross technique

    OpenAIRE

    Grill, M.; Honary, F.; Nielsen, E.; Hagfors, T.; Dekoulis, G.; Chapman, P.; Yamagishi, H.

    2003-01-01

    A new type of imaging riometer system based on a Mills Cross antenna array is currently under construction by the Ionosphere and Radio Propagation Group, Department of Communication Systems, Lancaster in collaboration with the Max-Planck-Institut für Aeronomie, Germany. The system will have an unprecedented spatial resolution in a viewing area of 300x300km. It is located at Ramfjordmoen, near Tromsø, Norway. The riometer (relative ionospheric opacity meter) determines the radio-wave absorptio...

  9. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

    SART is an important algorithm for image reconstruction, in which the projection computation takes over half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with memory optimization, is presented in this paper. Unlike the usual method, projection lines are located based on every pixel, and the subsequent projection coefficient computation can make use of these results. The correlation of projection lines and pixels can be used to optimize the computation. (authors)
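    The pixel-driven idea can be sketched as follows: for every pixel, the projection line through its centre is located and its weight is split between the nearest detector bins. The geometry and the linear-interpolation weights below are simplifying assumptions, not the authors' exact scheme.

        import numpy as np

        def pixel_driven_coefficients(n, n_bins, theta):
            """Approximate projection coefficients for an n x n image at angle theta (radians).

            For each pixel centre, the projection line is located based on that pixel and its
            contribution is split between the two nearest detector bins (linear interpolation).
            Returns a dict mapping (bin, pixel_index) -> weight, i.e. a sparse row block of A.
            """
            coeffs = {}
            c = (n - 1) / 2.0
            centre_bin = (n_bins - 1) / 2.0
            for i in range(n):
                for j in range(n):
                    # signed distance of the pixel centre from the central ray, in bin units
                    t = (j - c) * np.cos(theta) + (i - c) * np.sin(theta) + centre_bin
                    b = int(np.floor(t))
                    frac = t - b
                    for bin_idx, w in ((b, 1.0 - frac), (b + 1, frac)):
                        if 0 <= bin_idx < n_bins and w > 0:
                            key = (bin_idx, i * n + j)
                            coeffs[key] = coeffs.get(key, 0.0) + w
            return coeffs

        weights = pixel_driven_coefficients(n=8, n_bins=12, theta=np.pi / 6)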

  10. Development of Energy Management System Based on Internet of Things Technique

    OpenAIRE

    Wen-Jye Shyr; Chia-Ming Lin and Hung-Yun Feng

    2017-01-01

    The purpose of this study was to develop an energy management system for university campuses based on the Internet of Things (IoT) technique. The proposed IoT technique is built on WebAccess, accessed via the Internet Explorer web browser, and uses the TCP/IP protocol. A case study of IoT for a lighting energy usage management system is presented. The structure of the proposed IoT technique comprises a perception layer, equipment layer, control layer, application layer and network layer.

  11. Spreading of suppository bases assessed with histological and scintigraphic techniques

    International Nuclear Information System (INIS)

    Tupper, C.H.; Copping, N.; Thomas, N.W.; Wilson, C.G.

    1982-01-01

    Suppositories of PEG 15400 and PEG 600, Myrj 52 and Brij 35 were administered rectally to fasted male rats. Thirty and 60 min after the liquefaction time, samples of rectal mucosa were taken from treated and untreated rats, and a reduction in rectal cell volume and density in treated rats was noted. Similar suppositories, containing anion exchange resin and labelled with technetium-99, were administered to other rats, and serial scintiscanning was carried out using a gamma camera linked to a computer. Spreading of the suppository bases was assessed histologically and by imaging. (U.K.)

  12. Flash floods warning technique based on wireless communication networks data

    Science.gov (United States)

    David, Noam; Alpert, Pinhas; Messer, Hagit

    2010-05-01

    Flash floods can occur during or after rainfall events, particularly when the precipitation is of high intensity. Unfortunately, each year these floods cause severe property damage and heavy casualties. At present, there are no adequate real-time flash-flood warning facilities to cope with this phenomenon. Here we show the considerable potential of advance flash-flood warning based on precipitation measurements from commercial microwave links. As was recently shown, wireless communication networks supply high-resolution precipitation measurements at ground level while often being situated in flood-prone areas, covering large parts of these hazardous regions. We present the flash-flood warning potential of the wireless communication system for two cases in which floods occurred in the Judean desert and the northern Negev in Israel. In both cases, an advance warning of the hazard could have been announced based on this system. • This research was supported by THE ISRAEL SCIENCE FOUNDATION (grant No. 173/08). This work was also supported by a grant from the Yeshaya Horowitz Association, Jerusalem. Additional support was given by the PROCEMA-BMBF project and by the GLOWA-JR BMBF project.
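    The underlying retrieval step can be sketched with the standard power-law relation between rain-induced attenuation and rain rate; the coefficients k and alpha below are placeholders, since operational values depend on link frequency and polarization (e.g. ITU-R P.838).

        def rain_rate_from_link(rsl_dry_dbm, rsl_wet_dbm, length_km, k=0.12, alpha=1.05):
            """Invert the power law A = k * R**alpha (dB/km) for a commercial microwave link.

            rsl_dry_dbm: received signal level under dry conditions (baseline)
            rsl_wet_dbm: received signal level during the rain event
            k, alpha:    power-law coefficients (placeholder, frequency dependent)
            """
            attenuation_db = max(0.0, rsl_dry_dbm - rsl_wet_dbm)   # rain-induced loss over the path
            specific_attenuation = attenuation_db / length_km      # dB/km
            return (specific_attenuation / k) ** (1.0 / alpha)     # rain rate in mm/h

        # Example: a 5 km link losing 6 dB relative to its dry baseline.
        print(round(rain_rate_from_link(-40.0, -46.0, 5.0), 1), "mm/h")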

  13. Concepts and techniques for conducting performance-based audits

    International Nuclear Information System (INIS)

    Espy, I.J.

    1990-01-01

    Quality assurance (QA) audits have historically varied in purpose and approach and have earned labels that attempt to name each type of audit. Some more common labels for QA audits include compliance, program, product, and performance-based. While documentation and methodologies are important and hence controlled, an organization's product has the ultimate impact on the user. Product quality then must be of more concern to the user than the documentation and methodologies of the provider. Performance-based auditing (PBA) provides for assessing product quality by evaluating the supplier's activities that produce and affect product quality. PBA is defined as auditing that evaluates the ability of an activity to regularly produce and release only acceptable product, where product refers to the output of the activity. The output may be hardware, software, or a service, and acceptability includes suitability to the user's needs. To satisfy this definition, PBA should focus on the activities that produce and affect product and should evaluate the systematics of each activity in terms of its ability to produce acceptable product. The activity evaluation model provides a framework for evaluating the systematicness of any activity. Elements of the activity evaluation model are described.

  14. Physically-Based Interactive Flow Visualization Based on Schlieren and Interferometry Experimental Techniques

    KAUST Repository

    Brownlee, C.

    2011-11-01

    Understanding fluid flow is a difficult problem and of increasing importance as computational fluid dynamics (CFD) produces an abundance of simulation data. Experimental flow analysis has employed techniques such as shadowgraph, interferometry, and schlieren imaging for centuries, which allow empirical observation of inhomogeneous flows. Shadowgraphs provide an intuitive way of looking at small changes in flow dynamics through caustic effects while schlieren cutoffs introduce an intensity gradation for observing large scale directional changes in the flow. Interferometry tracks changes in phase-shift resulting in bands appearing. The combination of these shading effects provides an informative global analysis of overall fluid flow. Computational solutions for these methods have proven too complex until recently due to the fundamental physical interaction of light refracting through the flow field. In this paper, we introduce a novel method to simulate the refraction of light to generate synthetic shadowgraph, schlieren and interferometry images of time-varying scalar fields derived from computational fluid dynamics data. Our method computes physically accurate schlieren and shadowgraph images at interactive rates by utilizing a combination of GPGPU programming, acceleration methods, and data-dependent probabilistic schlieren cutoffs. Applications of our method to multifield data and custom application-dependent color filter creation are explored. Results comparing this method to previous schlieren approximations are finally presented. © 2011 IEEE.

  15. Developing Visualization Techniques for Semantics-based Information Networks

    Science.gov (United States)

    Keller, Richard M.; Hall, David R.

    2003-01-01

    Information systems incorporating complex network structured information spaces with a semantic underpinning - such as hypermedia networks, semantic networks, topic maps, and concept maps - are being deployed to solve some of NASA's critical information management problems. This paper describes some of the human interaction and navigation problems associated with complex semantic information spaces and describes a set of new visual interface approaches to address these problems. A key strategy is to leverage the semantic knowledge represented within these information spaces to construct abstractions and views that will be meaningful to the human user. Human-computer interaction methodologies will guide the development and evaluation of these approaches, which will benefit deployed NASA systems and also apply to information systems based on the emerging Semantic Web.

  16. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Karimi, M.; Bakar, A.H.A.; Mohamad, Hasmaini

    2014-01-01

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, due to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied for islanding detection of distributed generation. Moreover, the paper compares the accuracies of computational intelligence based techniques with those of existing techniques to provide useful information for industry and utility researchers in determining the best method for their respective systems.

  17. A survey on OFDM channel estimation techniques based on denoising strategies

    Directory of Open Access Journals (Sweden)

    Pallaviram Sure

    2017-04-01

    Full Text Available Channel estimation forms the heart of any orthogonal frequency division multiplexing (OFDM) based wireless communication receiver. Frequency domain pilot aided channel estimation techniques are either least squares (LS) based or minimum mean square error (MMSE) based. LS based techniques are computationally less complex. Unlike MMSE ones, they do not require a priori knowledge of channel statistics (KCS). However, the mean square error (MSE) performance of the channel estimator incorporating MMSE based techniques is better compared to that obtained with the incorporation of LS based techniques. To enhance the MSE performance using LS based techniques, a variety of denoising strategies have been developed in the literature, which are applied on the LS estimated channel impulse response (CIR). The advantage of denoising threshold based LS techniques is that they do not require KCS but still render near-optimal performance similar to MMSE based techniques. In this paper, a detailed survey of various existing denoising strategies, with a comparative discussion of these strategies, is presented.
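    A compact sketch of a denoising-threshold LS estimator follows; the subcarrier count, pilot symbols, channel, and the median-based threshold rule are illustrative assumptions rather than any specific scheme from the surveyed literature.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 64                                        # OFDM subcarriers (assumed)
        h = np.zeros(N, complex)
        h[:4] = (rng.normal(size=4) + 1j * rng.normal(size=4)) / np.sqrt(2)   # short CIR
        H = np.fft.fft(h)                             # true channel frequency response

        X = np.exp(1j * np.pi / 2 * rng.integers(0, 4, N))   # known pilot symbols (QPSK)
        noise = 0.05 * (rng.normal(size=N) + 1j * rng.normal(size=N))
        Y = H * X + noise                             # received pilots

        H_ls = Y / X                                  # least-squares estimate, no KCS needed
        h_ls = np.fft.ifft(H_ls)                      # back to the (noisy) CIR domain

        # Denoising: keep only taps above a noise-dependent threshold, zero the rest.
        thr = 3 * np.median(np.abs(h_ls))             # simple illustrative threshold rule
        h_den = np.where(np.abs(h_ls) > thr, h_ls, 0)
        H_denoised = np.fft.fft(h_den)                # improved channel estimate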

  18. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    Science.gov (United States)

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  19. Mobile Robot Navigation Based on Q-Learning Technique

    Directory of Open Access Journals (Sweden)

    Lazhar Khriji

    2011-03-01

    Full Text Available This paper shows how the Q-learning approach can be used successfully to deal with the problem of mobile robot navigation. In real situations where a large number of obstacles are involved, the normal Q-learning approach would encounter two major problems due to the excessively large state space. First, learning the Q-values in tabular form may be infeasible because of the excessive amount of memory needed to store the table. Second, rewards in the state space may be so sparse that with random exploration they will only be discovered extremely slowly. In this paper, we propose a navigation approach for a mobile robot in which prior knowledge is used within Q-learning. We address the issue of individual behavior design using fuzzy logic. The strategy of behavior-based navigation reduces the complexity of the navigation problem by dividing it into small actions that are easier to design and implement. The Q-learning algorithm is applied to coordinate between these behaviors, which greatly reduces learning convergence times. Simulation and experimental results confirm convergence to the desired results in terms of saved time and computational resources.
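    The tabular Q-learning update on which such a behaviour coordinator rests can be sketched as below; the grid world, reward shaping, and parameters are invented for illustration and are not the authors' navigation setup.

        import numpy as np

        n_states, n_actions = 25, 4                 # 5x5 grid, 4 behaviours/actions (assumed)
        Q = np.zeros((n_states, n_actions))
        alpha, gamma, eps = 0.1, 0.95, 0.1          # learning rate, discount, exploration

        def step(s, a):
            """Toy environment: hypothetical transition and reward, goal at state 24."""
            s_next = min(n_states - 1, max(0, s + (1, -1, 5, -5)[a]))
            return s_next, (1.0 if s_next == n_states - 1 else -0.01)

        rng = np.random.default_rng(0)
        for episode in range(500):
            s = 0
            while s != n_states - 1:
                a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[s]))
                s_next, r = step(s, a)
                # Q-learning update: move Q(s,a) towards the bootstrapped target.
                Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
                s = s_next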

  20. Damage identification in beams by a response surface based technique

    Directory of Open Access Journals (Sweden)

    Teidj S.

    2014-01-01

    Full Text Available In this work, identification of damage in uniform homogeneous metallic beams was considered through the propagation of non-dispersive elastic torsional waves. The proposed damage detection procedure consisted of the following sequence. Given a localized torque excitation in the form of a short half-sine pulse, the first step was calculating the transient solution of the resulting torsional wave. In practice this torque could be generated by asymmetric laser irradiation of the beam surface. Then, a localized defect, assumed to be characterized by an abrupt reduction of beam section area with a given height and extent, was placed at a known location of the beam. Next, the response in terms of transverse section rotation rate was obtained at a point situated after the defect, where the sensor was positioned; in practice this could rely on laser vibrometry. A parametric study was then conducted using a full factorial design-of-experiments table and numerical simulations based on a finite difference characteristic scheme. This enabled the derivation of a response surface model that was shown to represent the response of the system adequately in terms of the factors defect extent and severity. The final step was solving the inverse problem in order to identify the defect characteristics from the measurement.

  1. PELAN - a transportable, neutron-based UXO identification technique

    International Nuclear Information System (INIS)

    Vourvopoulos, G.

    1998-01-01

    An elemental characterization method is used to differentiate between inert projectiles and UXOs. This method identifies, in a non-intrusive, nondestructive manner, the elemental composition of the projectile contents. Most major and minor chemical elements within the interrogated object (hydrogen, carbon, nitrogen, oxygen, fluorine, phosphorus, chlorine, arsenic, etc.) are identified and quantified. The method is based on PELAN - Pulsed Elemental Analysis with Neutrons. PELAN uses pulsed neutrons produced from a compact, sealed tube neutron generator. Using an automatic analysis computer program, the quantities of each major and minor chemical element are determined. A decision-making tree identifies the object by comparing its elemental composition with stored elemental composition libraries of substances that could be contained within the projectile. In a series of blind tests, PELAN was able to identify, without failure, the contents of each shell placed in front of it. The PELAN probe does not need to be in contact with the interrogated projectile. If the object is buried, the interrogation can take place in situ provided the probe can be inserted a few centimeters from the object's surface. (author)

  2. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    Lundtang Paulsen, J.

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from eight years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information for the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  3. A correlation-based pulse detection technique for gamma-ray/neutron detectors

    International Nuclear Information System (INIS)

    Faisal, Muhammad; Schiffer, Randolph T.; Flaska, Marek; Pozzi, Sara A.; Wentzloff, David D.

    2011-01-01

    We present a correlation-based detection technique that significantly improves the probability of detection for low-energy pulses. We propose performing a normalized cross-correlation of the incoming pulse data against a predefined pulse template and using a threshold correlation value to trigger the detection of a pulse. This technique improves the detector sensitivity by amplifying the signal component of the incoming pulse data and rejecting noise. Simulation results for various templates are presented. Finally, the performance of the correlation-based detection technique is compared to current state-of-the-art techniques.
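    A short sketch of the normalized cross-correlation trigger is given below; the template shape, threshold, and test signal are placeholders, not the detector's actual pulse model.

        import numpy as np

        def detect_pulses(signal, template, corr_threshold=0.8):
            """Slide a normalized cross-correlation window; flag samples above threshold."""
            t = template - template.mean()
            t /= np.linalg.norm(t) + 1e-12
            hits = []
            for i in range(len(signal) - len(t) + 1):
                w = signal[i:i + len(t)] - signal[i:i + len(t)].mean()
                rho = float(np.dot(w, t) / (np.linalg.norm(w) + 1e-12))   # correlation in [-1, 1]
                if rho > corr_threshold:
                    hits.append((i, rho))
            return hits

        # Toy example: a small pulse buried in noise.
        rng = np.random.default_rng(2)
        template = np.exp(-np.arange(20) / 5.0)          # assumed exponential pulse shape
        x = 0.1 * rng.normal(size=500)
        x[200:220] += 0.5 * template                     # weak, low-energy pulse
        print(detect_pulses(x, template, corr_threshold=0.7))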

  4. A Mindfulness-Based Decentering Technique Increases the Cognitive Accessibility of Health and Weight Loss Related Goals

    Directory of Open Access Journals (Sweden)

    Katy Tapper

    2018-04-01

    Full Text Available Previous research has shown that a mindfulness-based decentering technique can help individuals resist eating chocolate over a 5-day period. However, it is unclear how this technique exerts its effect. This study explored one potential mechanism: that decentering increases the cognitive accessibility of relevant goals. Male and female participants (n = 90) spent 5 min practicing either a decentering or relaxation (control) technique. They then viewed a picture of a chocolate bar for 3 min whilst either applying the decentering technique or letting their mind wander (control). Finally, all participants completed 20 letter strings, rated their motivation for weight loss and for healthy eating, and indicated whether or not they were dieting to lose weight. As predicted, those who had applied the decentering technique produced a greater number of health and weight loss related words when completing the letter strings, compared to those who had simply let their mind wander (p < 0.001). However, contrary to predictions, these effects were not significantly greater amongst those who were more motivated to lose weight or eat healthily, or amongst those who were dieting to lose weight, though the means were in the predicted directions. The results suggest that this particular mindfulness technique may increase the accessibility of relevant goals. Further research would be needed to (a) compare effects with other strategies that prompt individuals to remember their goals, (b) examine other potential mechanisms of action, and (c) confirm that effects on self-control are mediated by increased goal accessibility.

  5. A comparison of mandibular denture base deformation with different impression techniques for implant overdentures.

    Science.gov (United States)

    Elsyad, Moustafa Abdou; El-Waseef, Fatma Ahmad; Al-Mahdy, Yasmeen Fathy; Fouad, Mohammed Mohammed

    2013-08-01

    This study aimed to evaluate mandibular denture base deformation with three impression techniques used for implant-retained overdentures. Ten edentulous patients (five men and five women) received two implants in the canine region of the mandible and three duplicate mandibular overdentures which were constructed with mucostatic, selective pressure, and definitive pressure impression techniques. Ball abutments and respective gold matrices were used to connect the overdentures to the implants. Six linear strain gauges were bonded to the lingual polished surface of each duplicate overdenture at the midline and implant areas to measure strain during maximal clenching and gum chewing. The strains recorded at the midline were compressive while strains at the implant areas were tensile. Clenching recorded significantly higher strain when compared with gum chewing for all techniques. The mucostatic technique recorded the highest strain and the definitive pressure technique recorded the lowest. There was no significant difference between the strain recorded with the mucostatic technique and that registered with the selective pressure technique. The highest strain was recorded at the level of the ball abutment's top with the mucostatic technique during clenching. The definitive pressure impression technique for implant-retained mandibular overdentures is associated with minimal denture deformation during function when compared with the mucostatic and selective pressure techniques. Reinforcement of the denture base over the implants may be recommended to increase resistance to fracture when the mucostatic or selective pressure impression technique is used. © 2012 John Wiley & Sons A/S.

  6. Wood lens design philosophy based on a binary additive manufacturing technique

    Science.gov (United States)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  7. Tilting-Twisting-Rolling: a pen-based technique for compass geometric construction

    Institute of Scientific and Technical Information of China (English)

    Fei LYU; Feng TIAN; Guozhong DAI; Hongan WANG

    2017-01-01

    This paper presents a new pen-based technique, Tilting-Twisting-Rolling, to support compass geometric construction. By leveraging the 3D orientation information and 3D rotation information of a pen, this technique allows smooth pen action to complete multi-step geometric construction without switching task states. Results from a user study show this Tilting-Twisting-Rolling technique can improve user performance and user experience in compass geometric construction.

  8. Progress in the development of a video-based wind farm simulation technique

    OpenAIRE

    Robotham, AJ

    1992-01-01

    The progress in the development of a video-based wind farm simulation technique is reviewed. While improvements have been achieved in the quality of the composite picture created by combining computer generated animation sequences of wind turbines with background scenes of the wind farm site, extending the technique to include camera movements has proved troublesome.

  9. Towards automating the discovery of certain innovative design principles through a clustering-based optimization technique

    Science.gov (United States)

    Bandaru, Sunith; Deb, Kalyanmoy

    2011-09-01

    In this article, a methodology is proposed for automatically extracting innovative design principles which make a system or process (subject to conflicting objectives) optimal using its Pareto-optimal dataset. Such 'higher knowledge' would not only help designers to execute the system better, but also enable them to predict how changes in one variable would affect other variables if the system has to retain its optimal behaviour. This in turn would help solve other similar systems with different parameter settings easily without the need to perform a fresh optimization task. The proposed methodology uses a clustering-based optimization technique and is capable of discovering hidden functional relationships between the variables, objective and constraint functions and any other function that the designer wishes to include as a 'basis function'. A number of engineering design problems are considered for which the mathematical structure of these explicit relationships exists and has been revealed by a previous study. A comparison with the multivariate adaptive regression splines (MARS) approach reveals the practicality of the proposed approach due to its ability to find meaningful design principles. The success of this procedure for automated innovization is highly encouraging and indicates its suitability for further development in tackling more complex design scenarios.

  10. Volume Recovery of Polymeric Glasses: Application of a Capacitance-based Measurement Technique

    Science.gov (United States)

    Sakib, Nazam; Simon, Sindee

    Glasses, including polymeric glasses, are inherently non-equilibrium materials. As a consequence, the volume and enthalpy of a glass evolve towards equilibrium in a process termed structural recovery. Several open questions and new controversies remain unanswered in the field. Specifically, the presence of intermediate plateaus during isothermal structural recovery has been reported in recent enthalpy work. In addition, the dependence of the relaxation time on state variables and thermal history is unclear. Dilatometry is particularly useful for structural recovery studies because volume is an absolute quantity and volumetric measurements can be done in situ. A capillary dilatometer, fitted with a linear variable differential transducer, was previously used to measure volume recovery of polymeric glass formers in our laboratory. To overcome the limitations of that methodology, including the trade-off between measurement range and sensitivity, a capacitance-based technique has been developed following the work of Richert (2010). The modification is performed by converting the glass capillary dilatometer into a cylindrical capacitor. For precision in capacitance data acquisition, an Andeen-Hagerling ultra-precision capacitance bridge (2550A, 1 kHz) is used. The setup will be tested by reproducing the signatures of structural recovery described by Kovacs (1963). Experiments are also planned to address the open questions in the field.

  11. LVQ-SMOTE - Learning Vector Quantization based Synthetic Minority Over-sampling Technique for biomedical data.

    Science.gov (United States)

    Nakamura, Munehiro; Kajiwara, Yusuke; Otsuka, Atsushi; Kimura, Haruhiko

    2013-10-02

    Over-sampling methods based on Synthetic Minority Over-sampling Technique (SMOTE) have been proposed for classification problems of imbalanced biomedical data. However, the existing over-sampling methods achieve slightly better or sometimes worse results than the simplest SMOTE. In order to improve the effectiveness of SMOTE, this paper presents a novel over-sampling method using codebooks obtained by learning vector quantization. In general, even when an existing SMOTE variant is applied to a biomedical dataset, its empty feature space is still so huge that most classification algorithms would not perform well on estimating borderlines between classes. To tackle this problem, our over-sampling method generates synthetic samples which occupy more feature space than the other SMOTE algorithms. In brief, our over-sampling method generates useful synthetic samples by referring to actual samples taken from real-world datasets. Experiments on eight real-world imbalanced datasets demonstrate that our proposed over-sampling method performs better than the simplest SMOTE on four of five standard classification algorithms. Moreover, it is seen that the performance of our method increases if the latest SMOTE called MWMOTE is used in our algorithm. Experiments on datasets for β-turn type prediction show some important patterns that have not been seen in previous analyses. The proposed over-sampling method generates useful synthetic samples for the classification of imbalanced biomedical data. Besides, the proposed over-sampling method is basically compatible with basic classification algorithms and the existing over-sampling methods.
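    The plain SMOTE interpolation step that the LVQ-codebook variant builds on can be sketched as follows; the codebook generation itself is not reproduced, and the data are toy values.

        import numpy as np

        def smote(minority, n_synthetic, k=5, rng=None):
            """Generate synthetic minority samples by interpolating towards k-NN neighbours."""
            rng = rng or np.random.default_rng(0)
            synthetic = []
            for _ in range(n_synthetic):
                i = rng.integers(len(minority))
                x = minority[i]
                # distances to all other minority samples, pick one of the k nearest
                d = np.linalg.norm(minority - x, axis=1)
                neighbours = np.argsort(d)[1:k + 1]
                nb = minority[rng.choice(neighbours)]
                lam = rng.random()                       # interpolation factor in [0, 1]
                synthetic.append(x + lam * (nb - x))
            return np.array(synthetic)

        minority_class = np.random.default_rng(3).normal(size=(30, 4))   # toy imbalanced data
        new_samples = smote(minority_class, n_synthetic=60)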

  12. THE OPTIMIZATION OF TECHNOLOGICAL MINING PARAMETERS IN QUARRY FOR DIMENSION STONE BLOCKS QUALITY IMPROVEMENT BASED ON PHOTOGRAMMETRIC TECHNIQUES OF MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Ruslan Sobolevskyi

    2018-01-01

    Full Text Available This research focuses on patterns of change in the quality of dimension stone commodity blocks produced, based on previously identified and measured geometrical parameters of natural cracks, and on modelling and planning of the final dimension stone products and finished products using the proposed digital photogrammetric techniques. The optimal parameters of surveying are investigated and the influence of surveying distance on crack length and area is estimated. Rational technological parameters of dimension stone block production are also taken into account.

  13. Photoacoustic Techniques for Trace Gas Sensing Based on Semiconductor Laser Sources

    Directory of Open Access Journals (Sweden)

    Vincenzo Spagnolo

    2009-12-01

    Full Text Available The paper provides an overview on the use of photoacoustic sensors based on semiconductor laser sources for the detection of trace gases. We review the results obtained using standard, differential and quartz enhanced photoacoustic techniques.

  14. Perceptual evaluation of corpus-based speech synthesis techniques in under-resourced environments

    CSIR Research Space (South Africa)

    Van Niekerk, DR

    2009-11-01

    Full Text Available With the increasing prominence and maturity of corpus-based techniques for speech synthesis, the process of system development has in some ways been simplified considerably. However, the dependence on sufficient amounts of relevant speech data...

  15. Resizing Technique-Based Hybrid Genetic Algorithm for Optimal Drift Design of Multistory Steel Frame Buildings

    Directory of Open Access Journals (Sweden)

    Hyo Seon Park

    2014-01-01

    Full Text Available Since genetic algorithm-based optimization methods are computationally expensive for practical use in the field of structural optimization, a resizing technique-based hybrid genetic algorithm for the drift design of multistory steel frame buildings is proposed to increase the convergence speed of genetic algorithms. To reduce the number of structural analyses required for convergence, a genetic algorithm is combined with a resizing technique, an efficient technique for controlling the drift of buildings without repetitive structural analysis. The resizing technique-based hybrid genetic algorithm proposed in this paper is applied to the minimum weight design of three steel frame buildings. To evaluate the performance of the algorithm, optimum weights, computational times, and generation numbers from the proposed algorithm are compared with those from a genetic algorithm. Based on the comparisons, it is concluded that the hybrid genetic algorithm shows clear improvements in convergence properties.

  16. A Comparative Analysis of Transmission Control Protocol Improvement Techniques over Space-Based Transmission Media

    National Research Council Canada - National Science Library

    Lawson, Joseph M

    2006-01-01

    ... justification for the implementation of a given enhancement technique. The research questions were answered through model and simulation of a satellite transmission system via a Linux-based network topology...

  17. Comparative Study of Retinal Vessel Segmentation Based on Global Thresholding Techniques

    Directory of Open Access Journals (Sweden)

    Temitope Mapayi

    2015-01-01

    Full Text Available Due to noise from uneven contrast and illumination during the acquisition process of retinal fundus images, the use of efficient preprocessing techniques is highly desirable to produce good retinal vessel segmentation results. This paper develops and compares the performance of different vessel segmentation techniques based on global thresholding, using phase congruency and contrast limited adaptive histogram equalization (CLAHE) for the preprocessing of the retinal images. The results obtained show that the combination of preprocessing technique, global thresholding, and postprocessing techniques must be carefully chosen to achieve a good segmentation performance.
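    A minimal OpenCV sketch of one such pipeline (CLAHE preprocessing followed by a global Otsu threshold and a small morphological clean-up) is shown below; the parameters and post-processing are assumptions, not the paper's exact settings.

        import cv2
        import numpy as np

        def segment_vessels(path):
            """CLAHE preprocessing, global (Otsu) thresholding, small-object removal."""
            img = cv2.imread(path)
            green = img[:, :, 1]                              # green channel shows vessels best
            clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
            enhanced = clahe.apply(green)
            inverted = cv2.bitwise_not(enhanced)              # vessels become bright
            _, binary = cv2.threshold(inverted, 0, 255,
                                      cv2.THRESH_BINARY + cv2.THRESH_OTSU)   # global threshold
            # simple post-processing: remove isolated noise with a morphological opening
            kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
            return cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

        # mask = segment_vessels("fundus_image.png")          # hypothetical input path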

  18. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    International Nuclear Information System (INIS)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-01-01

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  19. Developing a hybrid dictionary-based bio-entity recognition technique

    Science.gov (United States)

    2015-01-01

    Background Bio-entity extraction is a pivotal component for information extraction from biomedical literature. The dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. Methods This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities such as Part of Speech (POS) expansion, stemming, and the exploitation of the contextual cues to further improve the performance. Results The experimental results show that the proposed technique achieves the best or at least equivalent performance among compared techniques, GENIA, MESH, UMLS, and combinations of these three resources in F-measure. Conclusions The results imply that the performance of dictionary-based extraction techniques is largely influenced by information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance with three different dictionaries in precision whereas the context-only technique achieves a high-end performance with three different dictionaries in recall. PMID:26043907

  20. Developing a hybrid dictionary-based bio-entity recognition technique.

    Science.gov (United States)

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2015-01-01

    Bio-entity extraction is a pivotal component for information extraction from biomedical literature. The dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities such as Part of Speech (POS) expansion, stemming, and the exploitation of the contextual cues to further improve the performance. The experimental results show that the proposed technique achieves the best or at least equivalent performance among compared techniques, GENIA, MESH, UMLS, and combinations of these three resources in F-measure. The results imply that the performance of dictionary-based extraction techniques is largely influenced by information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance with three different dictionaries in precision whereas the context-only technique achieves a high-end performance with three different dictionaries in recall.
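    The edit-distance fallback at the core of the recall improvement can be sketched as below; the toy dictionary and the distance cut-off are illustrative, not entries from GENIA, MeSH or UMLS.

        def edit_distance(a, b):
            """Classic Levenshtein distance via dynamic programming (rolling row)."""
            dp = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                prev, dp[0] = dp[0], i
                for j, cb in enumerate(b, 1):
                    prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
            return dp[-1]

        def match_entity(mention, dictionary, max_dist=2):
            """Exact dictionary lookup first; otherwise fall back to the nearest entry by edit distance."""
            mention_l = mention.lower()
            if mention_l in dictionary:
                return dictionary[mention_l]
            best = min(dictionary, key=lambda term: edit_distance(mention_l, term))
            return dictionary[best] if edit_distance(mention_l, best) <= max_dist else None

        # Toy dictionary of bio-entities (illustrative only).
        bio_dict = {"interleukin-2": "Protein", "p53": "Protein", "nf-kappa b": "Protein"}
        print(match_entity("interleukin 2", bio_dict))   # fuzzy match -> 'Protein'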

  1. Feathering effect detection and artifact agglomeration index-based video deinterlacing technique

    Science.gov (United States)

    Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo

    2018-03-01

    Several video deinterlacing techniques have been developed, and each one presents a better performance in certain conditions. Occasionally, even the most modern deinterlacing techniques create frames with worse quality than primitive deinterlacing processes. This paper validates that the final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy is able to select between two types of deinterlaced frames and, if necessary, make the local correction of the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified, and, if deemed appropriate, these areas are replaced by pixels generated through the "edge-based line average" method. Test results have proven that the proposed technique is able to produce video frames with higher quality than applying a single deinterlacing technique through getting what is good from intra- and interfield methods.
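    A simplified sketch of combining the two interpolators is given below; the inter-field averaging, the edge-based line average, and the list of flagged lines are toy stand-ins, and the paper's feathering detector and artifact agglomeration index are not reproduced.

        import numpy as np

        def interfield_average(top_field, prev_bottom, next_bottom):
            """Keep the top-field lines; fill the missing lines with the average of the
            temporally adjacent bottom fields (a simple inter-field average)."""
            h, w = top_field.shape[0] * 2, top_field.shape[1]
            frame = np.zeros((h, w), dtype=np.float32)
            frame[0::2] = top_field
            frame[1::2] = (prev_bottom.astype(np.float32) + next_bottom.astype(np.float32)) / 2.0
            return frame

        def ela_line(above, below):
            """Edge-based line average: interpolate along the direction of least difference."""
            out = np.empty(len(above), dtype=np.float32)
            for x in range(len(above)):
                best_diff, best_val = None, None
                for dx in (-1, 0, 1):                       # three candidate edge directions
                    xa, xb = x + dx, x - dx
                    if 0 <= xa < len(above) and 0 <= xb < len(below):
                        diff = abs(float(above[xa]) - float(below[xb]))
                        if best_diff is None or diff < best_diff:
                            best_diff = diff
                            best_val = (float(above[xa]) + float(below[xb])) / 2.0
                out[x] = best_val
            return out

        def correct_flagged_lines(frame, top_field, defect_lines):
            """Replace interpolated lines flagged as feathering-affected with ELA output."""
            fixed = frame.copy()
            last = top_field.shape[0] - 1
            for y in defect_lines:                          # odd (interpolated) line indices
                fixed[y] = ela_line(top_field[min((y - 1) // 2, last)],
                                    top_field[min((y + 1) // 2, last)])
            return fixed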

  2. A new simple technique for improving the random properties of chaos-based cryptosystems

    Science.gov (United States)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique manages to improve the randomness properties of the generated keystream by preventing the system from falling into short period cycles due to digitization. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have proved that our method can considerably improve the randomness of the generated keystreams. In order to incorporate our randomness-enhancement technique, only 41 extra slices have been needed, proving that, apart from being effective, this method is also efficient in terms of area and hardware resources.
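    A short sketch of a skew tent map keystream generator with a small perturbation to break the short cycles caused by finite precision follows; the perturbation rule and parameters are illustrative stand-ins, not the authors' enhancement.

        def skew_tent(x, p):
            """Skew tent map on (0, 1) with break point p."""
            return x / p if x < p else (1.0 - x) / (1.0 - p)

        def keystream(n_bytes, x0=0.37, p=0.43, seed=0x9E3779B9):
            """Generate n_bytes of keystream; a small LCG perturbs the state each step
            so the digitized orbit cannot settle into a short cycle (illustrative rule)."""
            x, lcg = x0, seed
            out = bytearray()
            for _ in range(n_bytes):
                x = skew_tent(x, p)
                lcg = (1103515245 * lcg + 12345) & 0xFFFFFFFF       # auxiliary perturbation source
                x = (x + lcg / 2**40) % 1.0 or 0.1234567            # tiny nudge, keep x in (0, 1)
                out.append(int(x * 256) & 0xFF)                     # extract one keystream byte
            return bytes(out)

        plaintext = b"example message"
        ks = keystream(len(plaintext))
        ciphertext = bytes(a ^ b for a, b in zip(plaintext, ks))    # stream cipher: XOR with keystream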

  3. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    Science.gov (United States)

    Chockalingam, Letchumanan

    2005-01-01

    The data of the Gunung Ledang region of Malaysia acquired through LANDSAT are considered for mapping certain hydrogeological features. To map these significant features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods, and their validity in properly isolating features of hydrogeological interest, are discussed. As these techniques take advantage of the spectral aspects of the images, they have several limitations in meeting the objectives. To address these limitations, a morphological transformation, which generally considers the structural rather than the spectral aspects of the image, is applied to provide comparisons between the results derived from spectrally based and structurally based filtering techniques.

  4. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    Science.gov (United States)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.

  5. Low Power LDPC Code Decoder Architecture Based on Intermediate Message Compression Technique

    Science.gov (United States)

    Shimizu, Kazunori; Togawa, Nozomu; Ikenaga, Takeshi; Goto, Satoshi

    Reducing the power dissipation of an LDPC code decoder is a major challenge in applying it to practical digital communication systems. In this paper, we propose a low power LDPC code decoder architecture based on an intermediate message-compression technique with the following features: (i) An intermediate message compression technique enables the decoder to reduce the required memory capacity and write power dissipation. (ii) A clock gated shift register based intermediate message memory architecture enables the decoder to decompress the compressed messages in a single clock cycle while reducing the read power dissipation. The combination of the above two techniques enables the decoder to reduce the power dissipation while keeping the decoding throughput. The simulation results show that the proposed architecture improves the power efficiency by up to 52% and 18% compared to that of decoders based on the overlapped schedule and the rapid convergence schedule without the proposed techniques, respectively.

  6. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    Science.gov (United States)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah

    2018-03-01

    To support the growth of wireless geolocation as a key technology in the future, this paper proposes a theoretical bound derivation, i.e., the Cramer Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound derivation is crucially important to evaluate whether the energy-efficient RSS-based factor graph wireless geolocation technique is effective, as well as to open the opportunity for further innovation of the technique. The CRLB is derived in this paper by using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation result shows that the derived CRLB is the tightest bound, as shown by its root mean squared error (RMSE) curve lying below the RMSE curve of the RSS-based factor graph geolocation technique. Hence, the derived CRLB becomes the lower bound for the energy-efficient RSS-based factor graph wireless geolocation technique.
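    The sketch below computes such a CRLB numerically for 2-D RSS positioning under a log-distance path-loss model, by inverting the Fisher information matrix; the anchor layout, path-loss exponent, and shadowing standard deviation are assumed values for illustration rather than the paper's simulation setup.

```python
import numpy as np

def rss_crlb(target, anchors, alpha=3.0, sigma_db=4.0):
    """CRLB for RSS-based 2-D positioning under a log-distance path-loss model.
    Returns the lower bound on position RMSE (same units as the coordinates)."""
    J = np.zeros((2, 2))
    k = 10.0 * alpha / np.log(10.0)
    for a in anchors:
        diff = target - a
        d2 = diff @ diff
        g = -k * diff / d2                  # gradient of the mean RSS w.r.t. (x, y)
        J += np.outer(g, g) / sigma_db**2   # Fisher information contribution of one anchor
    return np.sqrt(np.trace(np.linalg.inv(J)))

anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
print("CRLB on position RMSE [m]:", rss_crlb(np.array([40.0, 60.0]), anchors))
```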

  7. Nasal base narrowing of the caucasian nose through the cerclage technique

    Directory of Open Access Journals (Sweden)

    Mocellin, Marcos

    2010-06-01

    Full Text Available Introduction: Several techniques can be performed to reduce (narrow) the nasal base, such as vestibular and columellar skin resection, elliptical skin resection at the nostril (narinary) lip, skin undermining and advancement (V-Y technique of Bernstein), and the use of cerclage sutures in the nasal base. Objective: To evaluate the cerclage technique performed on the nasal base, through basic endonasal rhinoplasty without delivery, in the Caucasian nose, reducing the inter-alar distance and correcting alar flare, with a consequent improvement in nasal harmony within the whole face. Methods: A retrospective analysis of clinical records and photographs of 43 patients in whom cerclage of the nasal base was performed by resecting a skin ellipse in the region of the vestibule and the nasal base (modified Weir technique), using colorless mononylon® 4 "0" with a straight cutting needle. The study was conducted in 2008 and 2009 at the Hospital of the Paraná Institute of Otolaryngology - IPO in Curitiba, Paraná - Brazil. Patients had a follow-up ranging from 7 to 12 months. Results: An improvement in nasal harmony, achieved by decreasing the inter-alar distance, was obtained in 100% of cases. Conclusion: Cerclage with minimal resection of vestibular skin and of the nasal base is an effective method for narrowing the nasal base in the Caucasian nose, with predictable results, and is easy to perform.

  8. Enhancement of Edge-based Image Quality Measures Using Entropy for Histogram Equalization-based Contrast Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    H. T. R. Kurmasha

    2017-12-01

    Full Text Available An edge-based image quality measure (IQM) technique for the assessment of histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and Entropy, which are the most commonly used IQMs to evaluate histogram equalization based techniques, as well as the two prominent fidelity-based IQMs, the Multi-Scale Structural Similarity (MSSIM) and the Information Fidelity Criterion-based (IFC) measures. The statistical evaluation results show that the edge-based IQM, which was designed for detecting noise artifact distortion, has a Pearson Correlation Coefficient (PCC) > 0.86, while the others have poor or fair correlation to human opinion, considering Human Visual Perception (HVP). Based on HVP, this paper proposes an enhancement to the classic edge-based IQM by taking into account brightness saturation distortion, which is the most prominent distortion in HE-based contrast enhancement techniques. It is tested and found to have a significantly better correlation with human opinion (PCC > 0.87, Spearman rank order correlation coefficient (SROCC) > 0.92, Root Mean Squared Error (RMSE) < 0.1054, and Outlier Ratio (OR) = 0%).

  9. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    International Nuclear Information System (INIS)

    Festa, G; Andreani, C; Pietropaolo, A; Grazzi, F; Scherillo, A; Barzagli, E; Sutton, L F; Bognetti, L; Bini, A; Schooneveld, E

    2013-01-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation on materials. The aim of this study is to show the potential application of the approach using multiple and integrated neutron-based techniques for musical instruments. Such samples, in the broad scenario of cultural heritage, represent an exciting research field. They may represent an interesting link between different disciplines such as nuclear physics, metallurgy and acoustics. (paper)

  10. Physically-Based Interactive Flow Visualization Based on Schlieren and Interferometry Experimental Techniques

    KAUST Repository

    Brownlee, C.; Pegoraro, V.; Shankar, S.; McCormick, Patrick S.; Hansen, C. D.

    2011-01-01

    Understanding fluid flow is a difficult problem and of increasing importance as computational fluid dynamics (CFD) produces an abundance of simulation data. Experimental flow analysis has employed techniques such as shadowgraph, interferometry

  11. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling based architecture is proposed which has the ability of reducing the problem of image distortion and improving the signal to noise ratio significantly. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible...

  12. Experimental evaluation of optimal Vehicle Dynamic Control based on the State Dependent Riccati Equation technique

    NARCIS (Netherlands)

    Alirezaei, M.; Kanarachos, S.A.; Scheepers, B.T.M.; Maurice, J.P.

    2013-01-01

    Development and experimental evaluation of an optimal Vehicle Dynamic Control (VDC) strategy based on the State Dependent Riccati Equation (SDRE) control technique is presented. The proposed nonlinear controller is based on a nonlinear vehicle model with nonlinear tire characteristics. A novel

  13. ON THE PAPR REDUCTION IN OFDM SYSTEMS: A NOVEL ZCT PRECODING BASED SLM TECHNIQUE

    Directory of Open Access Journals (Sweden)

    VARUN JEOTI

    2011-06-01

    Full Text Available High Peak to Average Power Ratio (PAPR) reduction is still an important challenge in Orthogonal Frequency Division Multiplexing (OFDM) systems. In this paper, we propose a novel Zadoff-Chu matrix Transform (ZCT) precoding based Selected Mapping (SLM) technique for PAPR reduction in OFDM systems. This technique is based on precoding the constellation symbols with the ZCT precoder after multiplication by the phase rotation factors and before the Inverse Fast Fourier Transform (IFFT) in SLM-based OFDM (SLM-OFDM) systems. Computer simulation results show that the proposed technique can reduce the PAPR by up to 5.2 dB for N=64 (system subcarriers) and V=16 (dissimilar phase sequences), at a clip rate of 10^-3. Additionally, ZCT-based SLM-OFDM (ZCT-SLM-OFDM) systems also take advantage of the frequency variations of the communication channel and can offer substantial performance gains in fading multipath channels.
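    A minimal sketch of the SLM stage is shown below; the precoder is approximated by a circulant matrix built from a Zadoff-Chu sequence as a stand-in for the paper's exact ZCT construction, and the QPSK mapping, phase-sequence alphabet, and random seed are illustrative assumptions.

```python
import numpy as np

def zadoff_chu(N, u=1):
    n = np.arange(N)
    cf = N % 2                               # standard ZC definition for even/odd length
    return np.exp(-1j * np.pi * u * n * (n + cf) / N)

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(1)
N, V = 64, 16
sym = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N) / np.sqrt(2)   # one QPSK block

# circulant precoding matrix built from a ZC sequence (assumption, see lead-in)
zc = zadoff_chu(N)
P = np.stack([np.roll(zc, k) for k in range(N)])

best = np.inf
for _ in range(V):                           # SLM: try V candidate phase rotations
    b = rng.choice([1, -1, 1j, -1j], N)      # dissimilar phase sequence
    x = np.fft.ifft(P @ (b * sym))           # precode after phase rotation, then IFFT
    best = min(best, papr_db(x))

print("baseline PAPR [dB]:", papr_db(np.fft.ifft(sym)))
print("precoded SLM PAPR [dB]:", best)
```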

  14. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    Science.gov (United States)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

    In the present work, we propose a frequency-domain interferometric imaging (FII) technique for better knowledge of the vertical distribution of the atmospheric scatterers detected by MST radars. This is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies. Its objective is to reduce the ambiguity (resulting from the use of only two adjacent frequencies) inherent in the FDI technique. Different methods, commonly used in antenna array processing, are first described within the context of application to the FII technique. These methods are Fourier-based imaging, the Capon method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems to be very promising.
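    The sketch below illustrates the Capon estimator used in such high-resolution imaging: a range (height) profile is obtained from the inverse of a covariance-weighted quadratic form of the steering vector across the set of operating frequencies. The carrier frequency, frequency spacing, number of frequencies, and synthetic single-layer data are assumed values, not the MU radar configuration.

```python
import numpy as np

c = 3e8
f0, df, M = 46.5e6, 0.1e6, 5                  # assumed carrier, frequency step, number of frequencies
k = 2 * np.pi * (f0 + df * np.arange(M)) / c

def steering(z):
    """Two-way phase across the M frequencies for a scatterer at range z."""
    return np.exp(-2j * k * z)

# synthetic covariance matrix: one scattering layer at 150 m plus weak noise
rng = np.random.default_rng(0)
snaps = np.array([steering(150.0) * rng.normal()
                  + 0.1 * (rng.normal(size=M) + 1j * rng.normal(size=M))
                  for _ in range(200)]).T
R = snaps @ snaps.conj().T / snaps.shape[1]

z_axis = np.linspace(0.0, 300.0, 601)
Rinv = np.linalg.inv(R + 1e-6 * np.eye(M))    # diagonal loading for stability
capon = np.array([1.0 / np.real(steering(z).conj() @ Rinv @ steering(z)) for z in z_axis])
print("estimated layer position [m]:", z_axis[np.argmax(capon)])
```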

  15. Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network

    Directory of Open Access Journals (Sweden)

    Hasmaini Mohamad

    2016-06-01

    Full Text Available One of the biggest challenges for an islanding operation is to sustain frequency stability. A large power imbalance following islanding would cause under-frequency, hence an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanded system. The technique is designed on an event basis, considering both the moment the system is islanded and the tripping of any DG unit during islanding operation. A disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool. Simulation studies on a distribution network with mini hydro generation are carried out to evaluate the UFLS model. They are performed under different load conditions: peak and base load. Results show that the load shedding technique has successfully shed the required amount of load and stabilized the system frequency.
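    The sketch below shows one common way to turn this idea into code: estimate the disturbance magnitude from the initial rate of change of frequency via the aggregate swing equation, then shed feeders until the deficit is covered. The inertia constant, system rating, ROCOF value, and feeder priority list are illustrative assumptions, not the paper's PSCAD model.

```python
def power_deficit(rocof_hz_s, h_s, s_mva, f0=50.0):
    """Estimate the power imbalance (MW) right after an islanding event from the
    initial rate of change of frequency, using the aggregate swing equation."""
    return -2.0 * h_s * s_mva * rocof_hz_s / f0

def loads_to_shed(deficit_mw, feeder_loads_mw):
    """Event-based UFLS: shed feeders (assumed ordered least-critical first)
    until their total load covers the estimated deficit."""
    shed, total = [], 0.0
    for name, mw in feeder_loads_mw:
        if total >= deficit_mw:
            break
        shed.append(name)
        total += mw
    return shed, total

deficit = power_deficit(rocof_hz_s=-0.8, h_s=3.0, s_mva=10.0)   # assumed island parameters
print(deficit, loads_to_shed(deficit, [("L3", 0.8), ("L2", 1.5), ("L1", 2.0)]))
```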

  16. Bi Input-extended Kalman filter based estimation technique for speed-sensorless control of induction motors

    International Nuclear Information System (INIS)

    Barut, Murat

    2010-01-01

    This study offers a novel extended Kalman filter (EKF) based estimation technique for the solution of the on-line estimation problem related to uncertainties in the stator and rotor resistances inherent to the speed-sensorless high efficiency control of induction motors (IMs) in the wide speed range as well as extending the limited number of states and parameter estimations possible with a conventional single EKF algorithm. For this aim, the introduced estimation technique in this work utilizes a single EKF algorithm with the consecutive execution of two inputs derived from the two individual extended IM models based on the stator resistance and rotor resistance estimation, differently from the other approaches in past studies, which require two separate EKF algorithms operating in a switching or braided manner; thus, it has superiority over the previous EKF schemes in this regard. The proposed EKF based estimation technique performing the on-line estimations of the stator currents, the rotor flux, the rotor angular velocity, and the load torque involving the viscous friction term together with the rotor and stator resistance is also used in the combination with the speed-sensorless direct vector control of IM and tested with simulations under the challenging 12 scenarios generated instantaneously via step and/or linear variations of the velocity reference, the load torque, the stator resistance, and the rotor resistance in the range of high and zero speed, assuming that the measured stator phase currents and voltages are available. Even under those variations, the performance of the speed-sensorless direct vector control system established on the novel EKF based estimation technique is observed to be quite good.

  17. Bi Input-extended Kalman filter based estimation technique for speed-sensorless control of induction motors

    Energy Technology Data Exchange (ETDEWEB)

    Barut, Murat, E-mail: muratbarut27@yahoo.co [Nigde University, Department of Electrical and Electronics Engineering, 51245 Nigde (Turkey)

    2010-10-15

    This study offers a novel extended Kalman filter (EKF) based estimation technique for the solution of the on-line estimation problem related to uncertainties in the stator and rotor resistances inherent to the speed-sensorless high efficiency control of induction motors (IMs) in the wide speed range as well as extending the limited number of states and parameter estimations possible with a conventional single EKF algorithm. For this aim, the introduced estimation technique in this work utilizes a single EKF algorithm with the consecutive execution of two inputs derived from the two individual extended IM models based on the stator resistance and rotor resistance estimation, differently from the other approaches in past studies, which require two separate EKF algorithms operating in a switching or braided manner; thus, it has superiority over the previous EKF schemes in this regard. The proposed EKF based estimation technique performing the on-line estimations of the stator currents, the rotor flux, the rotor angular velocity, and the load torque involving the viscous friction term together with the rotor and stator resistance is also used in the combination with the speed-sensorless direct vector control of IM and tested with simulations under the challenging 12 scenarios generated instantaneously via step and/or linear variations of the velocity reference, the load torque, the stator resistance, and the rotor resistance in the range of high and zero speed, assuming that the measured stator phase currents and voltages are available. Even under those variations, the performance of the speed-sensorless direct vector control system established on the novel EKF based estimation technique is observed to be quite good.

  18. Solving Linear Equations by Classical Jacobi-SR Based Hybrid Evolutionary Algorithm with Uniform Adaptation Technique

    OpenAIRE

    Jamali, R. M. Jalal Uddin; Hashem, M. M. A.; Hasan, M. Mahfuz; Rahman, Md. Bazlar

    2013-01-01

    Solving a set of simultaneous linear equations is probably the most important topic in numerical methods. For solving linear equations, iterative methods are preferred over direct methods, especially when the coefficient matrix is sparse. The rate of convergence of an iterative method is increased by using the Successive Relaxation (SR) technique. However, the SR technique is very sensitive to the relaxation factor, ω. Recently, hybridization of classical Gauss-Seidel based successive relaxation t...
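    The sketch below shows the classical Jacobi iteration with a relaxation factor ω (ω = 1 reduces to plain Jacobi); the test matrix and the sampled ω values are illustrative, and no evolutionary adaptation of ω is attempted here.

```python
import numpy as np

def jacobi_sr(A, b, omega=1.0, tol=1e-10, max_iter=10_000):
    """Jacobi iteration with relaxation factor omega (omega = 1 gives plain Jacobi)."""
    D = np.diag(A)
    R = A - np.diagflat(D)
    x = np.zeros_like(b, dtype=float)
    for k in range(max_iter):
        x_jac = (b - R @ x) / D                     # classical Jacobi update
        x_new = (1.0 - omega) * x + omega * x_jac   # relaxed update
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])  # diagonally dominant
b = np.array([15.0, 10.0, 10.0])
for w in (0.8, 1.0, 1.1):
    x, iters = jacobi_sr(A, b, omega=w)
    print(f"omega={w}: {iters} iterations, residual={np.linalg.norm(A @ x - b):.2e}")
```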

  19. Cermet based solar selective absorbers : further selectivity improvement and developing new fabrication technique

    OpenAIRE

    Nejati, Mohammadreza

    2008-01-01

    The spectral selectivity of cermet-based selective absorbers was increased by inducing surface roughness on the cermet layer using a roughening technique (deposition on hot substrates) or by micro-structuring the metallic substrates before deposition of the absorber coating using laser and imprint structuring techniques. Cu-Al2O3 cermet absorbers with very rough surfaces and excellent selectivity were obtained by employing a roughness template layer under the infrared reflective l...

  20. Novel anti-jamming technique for OCDMA network through FWM in SOA based wavelength converter

    Science.gov (United States)

    Jyoti, Vishav; Kaler, R. S.

    2013-06-01

    In this paper, we propose a novel anti-jamming technique for optical code division multiple access (OCDMA) networks through four wave mixing (FWM) in a semiconductor optical amplifier (SOA) based wavelength converter. An OCDMA signal can easily be jammed by a high-power jamming signal. It is shown that wavelength conversion through four wave mixing in an SOA has improved jamming resistance. It is observed that the jammer has no effect on the OCDMA network, even at high jamming powers, when the proposed technique is used.

  1. The Design and Development of Test Platform for Wheat Precision Seeding Based on Image Processing Techniques

    OpenAIRE

    Li , Qing; Lin , Haibo; Xiu , Yu-Feng; Wang , Ruixue; Yi , Chuijie

    2009-01-01

    International audience; The test platform of wheat precision seeding based on image processing techniques is designed to develop the wheat precision seed metering device with high efficiency and precision. Using image processing techniques, this platform gathers images of seeds (wheat) on the conveyer belt which are falling from seed metering device. Then these data are processed and analyzed to calculate the qualified rate, reseeding rate and leakage sowing rate, etc. This paper introduces t...

  2. A comparison of base running start techniques in collegiate fastpitch softball athletes

    OpenAIRE

    Massey, Kelly P.; Brouillette, Kelly Miller; Martino, Mike

    2018-01-01

    This study examined the time difference between three different base running start techniques. Thirteen Division II collegiate softball players performed maximal sprints off a softball bag at two different distances. Sprint times at 4.57 and 18.29 meters for each technique were measured using Fusion Sport’s Smartspeed System. At both 4.57 and 18.29 meters, the rocking start (0.84 ± 0.10; 3.04 ± 0.16 s) was found to be significantly faster (in seconds) than both the split technique (1.01 ± 0.0...

  3. Sample preparation techniques based on combustion reactions in closed vessels - A brief overview and recent applications

    International Nuclear Information System (INIS)

    Flores, Erico M.M.; Barin, Juliano S.; Mesko, Marcia F.; Knapp, Guenter

    2007-01-01

    In this review, a general discussion of sample preparation techniques based on combustion reactions in closed vessels is presented. Applications for several kinds of samples are described, taking into account the literature data reported in the last 25 years. The operational conditions as well as the main characteristics and drawbacks are discussed for bomb combustion, oxygen flask and microwave-induced combustion (MIC) techniques. Recent applications of MIC techniques are discussed with special concern for samples not well digested by conventional microwave-assisted wet digestion as, for example, coal and also for subsequent determination of halogens

  4. Bandwidth-Tunable Fiber Bragg Gratings Based on UV Glue Technique

    Science.gov (United States)

    Fu, Ming-Yue; Liu, Wen-Feng; Chen, Hsin-Tsang; Chuang, Chia-Wei; Bor, Sheau-Shong; Tien, Chuen-Lin

    2007-07-01

    In this study, we have demonstrated that a uniform fiber Bragg grating (FBG) can be transformed into a chirped fiber grating by a simple UV glue adhesive technique without shifting the reflection band with respect to the center wavelength of the FBG. The technique is based on the induced strain of an FBG due to the UV glue adhesive force on the fiber surface that causes a grating period variation and an effective index change. This technique can provide a fast and simple method of obtaining the required chirp value of a grating for applications in the dispersion compensators, gain flattening in erbium-doped fiber amplifiers (EDFAs) or optical filters.

  5. Craniospinal radiotherapy in children: Electron- or photon-based technique of spinal irradiation

    International Nuclear Information System (INIS)

    Chojnacka, M.; Skowronska-Gardas, A.; Pedziwiatr, K.; Morawska-Kaczynska, M.; Zygmuntowicz-Pietka, A.; Semaniak, A.

    2010-01-01

    Background: The prone position and an electron-based technique for craniospinal irradiation (CSI) have been standard in our department for many years. However, this immobilization makes it difficult for the anaesthesiologist to gain airway access. The increasing number of children treated under anaesthesia led us to reconsider our technique. Aim: The purpose of this study is to report our new photon-based technique for CSI, which can be applied in both the supine and the prone position, and to compare this technique with our electron-based technique. Materials and methods: Between November 2007 and May 2008, 11 children with brain tumours were treated in the prone position with CSI. For 9 patients two treatment plans were created: the first one using photons and the second one using electron beams for spinal irradiation. We prepared seven 3D-conformal photon plans and four forward planned segmented field plans. We compared 20 treatment plans in terms of target dose homogeneity and sparing of organs at risk. Results: In segmented field plans better dose homogeneity in the thecal sac volume was achieved than in electron-based plans. Regarding doses in organs at risk, in photon-based plans we obtained a lower dose in the thyroid but a higher one in the heart and liver. Conclusions: Our technique can be applied in both the supine and prone position and it seems to be more feasible and precise than the electron technique. However, more homogeneous target coverage and higher precision of dose delivery for photons are obtained at the cost of slightly higher doses to the heart and liver. (authors)

  6. Highlights from the previous volumes

    Science.gov (United States)

    Vergini Eduardo, G.; Pan, Y.; al., Vardi R. et; al., Akkermans Eric et; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time Superconductivity and magnetic order in the half-Heusler compound ErPdBi An experimental evidence-based computational paradigm for new logic-gates in neuronal activity Universality in the symmetric exclusion process and diffusive systems

  7. Backstepping Based Formation Control of Quadrotors with the State Transformation Technique

    Directory of Open Access Journals (Sweden)

    Keun Uk Lee

    2017-11-01

    Full Text Available In this paper, a backstepping-based formation control of quadrotors with the state transformation technique is proposed. First, the dynamics of a quadrotor is derived by using the Newton–Euler formulation. Next, a backstepping-based formation control for quadrotors using a state transformation technique is presented. In the position control, which is the basis of formation control, it is possible to derive the reference attitude angles employing a state transformation technique without the small angle assumption or the simplified dynamics usually used. Stability analysis based on the Lyapunov theorem shows that the proposed formation controller can provide a quadrotor formation error system that is asymptotically stabilized. Finally, we verify the performance of the proposed formation control method through comparison simulations.

  8. A Biometric Face Recognition System Using an Algorithm Based on the Principal Component Analysis Technique

    Directory of Open Access Journals (Sweden)

    Gheorghe Gîlcă

    2015-06-01

    Full Text Available This article deals with a face recognition system using an algorithm based on the Principal Component Analysis (PCA) technique. The recognition system consists only of a PC and an integrated video camera. The algorithm is developed in the MATLAB language and calculates the eigenfaces considered as features of the face. The PCA technique is based on matching the facial test image against the training prototype vectors. The matching score between the facial test image and the training prototype vectors is calculated from their coefficient vectors, and the highest matching score yields the recognition result. The results of the algorithm based on the PCA technique are very good, even if the person looks at the video camera from one side.
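    A compact NumPy sketch of the eigenface pipeline is given below (SVD of the centered training matrix, projection of the test image, nearest-prototype matching); the synthetic training data and the number of components are assumptions, and the original system is implemented in MATLAB rather than Python.

```python
import numpy as np

def train_pca(faces, n_components=10):
    """faces: (n_samples, n_pixels) matrix of flattened training images."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # rows of vt are eigenfaces
    eigenfaces = vt[:n_components]
    coeffs = centered @ eigenfaces.T          # training prototype coefficient vectors
    return mean, eigenfaces, coeffs

def recognize(test_face, mean, eigenfaces, coeffs):
    """Project the test image and return the index of the closest training prototype."""
    c = (test_face - mean) @ eigenfaces.T
    dists = np.linalg.norm(coeffs - c, axis=1)
    return int(np.argmin(dists)), float(dists.min())

rng = np.random.default_rng(0)
train = rng.random((20, 64 * 64))              # stand-in for 20 flattened 64x64 face images
mean, ef, co = train_pca(train, n_components=8)
probe = train[7] + 0.05 * rng.random(64 * 64)  # noisy copy of person 7
print(recognize(probe, mean, ef, co))          # expected to match index 7
```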

  9. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm

    Directory of Open Access Journals (Sweden)

    Mi Jeong Kim

    2015-07-01

    Full Text Available In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA-based wireless mesh network (WMN with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation.

  10. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm.

    Science.gov (United States)

    Kim, Mi Jeong; Maeng, Sung Joon; Cho, Yong Soo

    2015-07-28

    In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation.

  11. The Effect of Learning Based on Technology Model and Assessment Technique toward Thermodynamic Learning Achievement

    Science.gov (United States)

    Makahinda, T.

    2018-02-01

    The purpose of this research is to find out the effect of a technology-based learning model and assessment technique on thermodynamics learning achievement, controlling for students' intelligence. This research is an experimental study. The sample is taken through cluster random sampling with a total of 80 student respondents. The results show that the thermodynamics learning achievement of students taught with the environmental-utilization learning model is higher than that of students taught with animated simulation, after controlling for student intelligence. There is an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics learning achievement, after controlling for student intelligence. Based on these findings, thermodynamics lectures should use the environment-based learning model together with the project assessment technique.

  12. Feasibility of CBCT-based dose calculation: Comparative analysis of HU adjustment techniques

    International Nuclear Information System (INIS)

    Fotina, Irina; Hopfgartner, Johannes; Stock, Markus; Steininger, Thomas; Lütgendorf-Caucig, Carola; Georg, Dietmar

    2012-01-01

    Background and purpose: The aim of this work was to compare the accuracy of different HU adjustments for CBCT-based dose calculation. Methods and materials: Dose calculation was performed on CBCT images of 30 patients. In the first two approaches, phantom-based (Pha-CC) and population-based (Pop-CC) conversion curves were used. The third method (WAB) represents override of the structures with standard densities for water, air and bone. In the ROI mapping approach, all structures were overridden with average HUs from the planning CT. All techniques were benchmarked to the Pop-CC and CT-based plans by DVH comparison and γ-index analysis. Results: For prostate plans, WAB and ROI mapping compared to Pop-CC showed differences in PTV D median below 2%. The WAB and Pha-CC methods underestimated the bladder dose in IMRT plans. In lung cases PTV coverage was underestimated by the Pha-CC method by 2.3% and slightly overestimated by the WAB and ROI techniques. The use of the Pha-CC method for head–neck IMRT plans resulted in differences in PTV coverage of up to 5%. Dose calculation with the WAB and ROI techniques showed better agreement with pCT than the conversion curve-based approaches. Conclusions: Density override techniques provide an accurate alternative to the conversion curve-based methods for dose calculation on CBCT images.

  13. Key Techniques for the Development of Web-Based PDM System

    Institute of Scientific and Technical Information of China (English)

    WANG Li-juan; ZHANG Xu; NING Ru-xin

    2006-01-01

    Some key techniques for the development of a web-based product data management (PDM) system are introduced. The four-tiered B/S architecture of a PDM system, BITPDM, is introduced first, followed by its design and implementation, including virtual data vault, flexible coding system, document management, product structure and configuration management, workflow/process management, and product maturity management. BITPDM can facilitate the activities from the new product introduction phase to manufacturing, and manage the product data and their dynamic change history. Based on Microsoft .NET, XML, web service and SOAP techniques, BITPDM realizes the integration and efficient management of product information.

  14. Stabilizing operation point technique based on the tunable distributed feedback laser for interferometric sensors

    Science.gov (United States)

    Mao, Xuefeng; Zhou, Xinlei; Yu, Qingxu

    2016-02-01

    We describe a stabilizing operation point technique based on a tunable Distributed Feedback (DFB) laser for quadrature demodulation of interferometric sensors. By introducing automatic quadrature-point locking and periodic wavelength-tuning compensation into an interferometric system, the operation point of the interferometric system is stabilized when the system suffers various environmental perturbations. To demonstrate the feasibility of this stabilizing operation point technique, experiments have been performed using a tunable DFB laser as the light source to interrogate an extrinsic Fabry-Perot interferometric vibration sensor and a diaphragm-based acoustic sensor. Experimental results show that good tracking of the Q-point was effectively realized.

  15. Seismic qualification of nuclear control board by using base isolation technique

    International Nuclear Information System (INIS)

    Koizumi, T.; Tsujiuchi, N.; Fujita, T.

    1987-01-01

    The purpose is to adopt the base isolation technique as a new approach to the seismic qualification of a nuclear control board. The basic concept of the base isolation technique is described. A two-dimensional linear motion mechanism with pre-tensioned coil springs and dampers is included in the isolation device. The control board is regarded as a lumped mass system with inertia moment. The fundamental movement of this device and control board is calculated as a non-linear response problem. Following fundamental analysis and numerical estimation, an experimental investigation was undertaken using a full-size control board. Sufficient agreement was recognized between the experimental results and the numerical estimation. (orig./HP)

  16. A prospective evaluation of treatment with Selective Internal Radiation Therapy (SIR-spheres) in patients with unresectable liver metastases from colorectal cancer previously treated with 5-FU based chemotherapy

    International Nuclear Information System (INIS)

    Lim, L; Gibbs, P; Yip, D; Shapiro, JD; Dowling, R; Smith, D; Little, A; Bailey, W; Liechtenstein, M

    2005-01-01

    To prospectively evaluate the efficacy and safety of selective internal radiation (SIR) spheres in patients with inoperable liver metastases from colorectal cancer who have failed 5FU based chemotherapy. Patients were prospectively enrolled at three Australian centres. All patients had previously received 5-FU based chemotherapy for metastatic colorectal cancer. Patients were ECOG 0–2 and had liver dominant or liver only disease. Concurrent 5-FU was given at investigator discretion. Thirty patients were treated between January 2002 and March 2004. As of July 2004 the median follow-up is 18.3 months. Median patient age was 61.7 years (range 36 – 77). Twenty-nine patients are evaluable for toxicity and response. There were 10 partial responses (33%), with the median duration of response being 8.3 months (range 2–18) and median time to progression of 5.3 mths. Response rates were lower (21%) and progression free survival shorter (3.9 mths) in patients that had received all standard chemotherapy options (n = 14). No responses were seen in patients with a poor performance status (n = 3) or extrahepatic disease (n = 6). Overall treatment related toxicity was acceptable, however significant late toxicity included 4 cases of gastric ulceration. In patients with metastatic colorectal cancer that have previously received treatment with 5-FU based chemotherapy, treatment with SIR-spheres has demonstrated encouraging activity. Further studies are required to better define the subsets of patients most likely to respond

  17. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    Full Text Available This manuscript illustrates the controller design for a doubly fed induction generator (DFIG) based variable speed wind turbine by using a bio-inspired scheme. The methodology exploits two proficient swarm-intelligence-based evolutionary soft computational procedures. The particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques are employed to design the controller for the poorly damped plant of the DFIG. A wind energy overview and the DFIG operating principle, along with the equivalent circuit model, are adequately discussed in this paper. The controller design for the DFIG-based WECS using PSO and BFO is described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active-reactive power, and DC-link voltage are slightly improved with the evolutionary soft computational procedure. Lastly, the obtained output is compared with a standard technique for performance improvement of the DFIG-based wind energy conversion system.
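    As an illustration of the optimization stage, the sketch below tunes two controller gains with a minimal particle swarm optimizer against a toy surrogate cost; the cost function, gain bounds, and swarm parameters are assumptions and do not represent the DFIG plant model used in the paper.

```python
import numpy as np

def pso(cost, bounds, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer used here to tune two controller gains."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([cost(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# toy surrogate cost for a PI controller (assumption, not the DFIG model)
def surrogate_cost(gains, kp_ref=2.0, ki_ref=5.0):
    return (gains[0] - kp_ref) ** 2 + 0.5 * (gains[1] - ki_ref) ** 2

best, val = pso(surrogate_cost, np.array([[0.0, 10.0], [0.0, 20.0]]))
print("tuned [Kp, Ki]:", best, "cost:", val)
```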

  18. High-speed technique based on a parallel projection correlation procedure for digital image correlation

    Science.gov (United States)

    Zaripov, D. I.; Renfu, Li

    2018-05-01

    The implementation of high-efficiency digital image correlation methods based on a zero-normalized cross-correlation (ZNCC) procedure for high-speed, time-resolved measurements using a high-resolution digital camera involves big data processing and is often time consuming. In order to speed up the ZNCC computation, a high-speed technique based on a parallel projection correlation procedure is proposed. The proposed technique uses projections of the interrogation window instead of its two-dimensional field of luminous intensity. This simplification accelerates the ZNCC computation by up to 28.8 times compared to directly calculated ZNCC, depending on the size of the interrogation window and the region of interest. The results of three synthetic test cases, namely a one-dimensional uniform flow, a linear shear flow and a turbulent boundary-layer flow, are discussed in terms of accuracy. In the latter case, the proposed technique is implemented together with an iterative window-deformation technique. On the basis of the results of the present work, the proposed technique is recommended for initial velocity field calculation, with further correction using more accurate techniques.
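    The sketch below conveys the core idea on synthetic data: each candidate shift is scored by 1-D normalized correlation of the row and column projections of the two windows instead of a full 2-D ZNCC. The window sizes, the averaging of the two projection scores, and the synthetic displacement are illustrative assumptions rather than the authors' exact formulation.

```python
import numpy as np

def zncc_1d(a, b):
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def projection_correlation(win, search, dx, dy):
    """Approximate the 2-D ZNCC at shift (dx, dy) by correlating the row and
    column projections (sums of luminous intensity) of the two windows."""
    h, w = win.shape
    cand = search[dy:dy + h, dx:dx + w]
    score_x = zncc_1d(win.sum(axis=0), cand.sum(axis=0))   # projection onto x
    score_y = zncc_1d(win.sum(axis=1), cand.sum(axis=1))   # projection onto y
    return 0.5 * (score_x + score_y)

rng = np.random.default_rng(0)
img = rng.random((64, 64))
win = img[20:36, 20:36]                                    # 16x16 interrogation window
search = np.roll(np.roll(img, 3, axis=0), 5, axis=1)[18:44, 18:44]  # shifted second frame

scores = [(dx, dy, projection_correlation(win, search, dx, dy))
          for dy in range(11) for dx in range(11)]
print(max(scores, key=lambda s: s[2]))                     # best-matching displacement
```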

  19. In-cylinder pressure-based direct techniques and time frequency analysis for combustion diagnostics in IC engines

    International Nuclear Information System (INIS)

    D’Ambrosio, S.; Ferrari, A.; Galleani, L.

    2015-01-01

    refined combustion models. The presented results on the pressure-based techniques, including a time frequency analysis, have been compared with the numerical outcomes from previously developed two- and three-zone thermodynamic combustion models

  20. An Efficient Digital Pulse Shape Discrimination Technique for Scintillation Detectors Based on FPGA

    International Nuclear Information System (INIS)

    Kamel, M.S.

    2014-01-01

    Different techniques for pulse shape discrimination (PSD) of scintillation pulses have been developed. The PSD of scintillation pulses can be used in several applications, such as the Positron Emission Tomography (PET) system. Each technique analyzes the pulses resulting from the absorption of radiation in the scintillator; the scintillation pulses are filtered and digitized, captured using a DAQ, and sent to the host computer for processing. The spatial resolution of images generated in a PET system can be improved by applying the proposed PSD. In this thesis various digital PSD techniques are proposed to discriminate the scintillation pulses. These techniques are based on the discrete sine transform (DST), discrete cosine transform (DCT), discrete Hartley transform (DHT), discrete Goertzel transform (DGT), and principal component analysis (PCA). The output coefficients of the discrete transforms are then classified using one of the following classifiers: T-test, tuned, or support vector machine (SVM).

  1. Non-Conventional Techniques for the Study of Phase Transitions in NiTi-Based Alloys

    Science.gov (United States)

    Nespoli, Adelaide; Villa, Elena; Passaretti, Francesca; Albertini, Franca; Cabassi, Riccardo; Pasquale, Massimo; Sasso, Carlo Paolo; Coïsson, Marco

    2014-07-01

    Differential scanning calorimetry and electrical resistance measurements are the two most common techniques for the study of the phase transition path and temperatures of shape memory alloys (SMA) in the stress-free condition. It is also well known that internal friction measurements are useful for this purpose. There are some further techniques which are seldom used for the basic characterization of SMA transitions: dilatometric analysis, magnetic measurements, and Seebeck coefficient studies. In this work, we discuss the suitability of these techniques for the study of NiTi-based phase transitions. Measurements were conducted on several Ni50-x Ti50Cux samples ranging from 3 to 10 at.% in Cu content, fully annealed at 850 °C for 1 h in vacuum and quenched in water at room temperature. The results show that all these techniques are sensitive to the phase transition, and they provide significant information about the existence of intermediate phases.

  2. Review on recent Developments on Fabrication Techniques of Distributed Feedback (DFB) Based Organic Lasers

    Science.gov (United States)

    Azrina Talik, Noor; Boon Kar, Yap; Noradhlia Mohamad Tukijan, Siti; Wong, Chuan Ling

    2017-10-01

    To date, state-of-the-art organic semiconductor distributed feedback (DFB) lasers have gained tremendous interest in the organic device industry. This paper presents a short review of the fabrication techniques of DFB-based lasers, focusing on the fabrication of the DFB corrugated structure and the deposition of the organic gain medium on the nano-patterned DFB resonator. Fabrication techniques such as Laser Direct Writing (LDW), ultrafast photo-excitation dynamics, Laser Interference Lithography (LIL) and Nanoimprint Lithography (NIL) for DFB patterning are presented. In addition, the gain medium deposition method is also discussed. The technical procedures of the stated fabrication techniques are summarized together with their benefits and comparisons to the traditional fabrication techniques.

  3. Interference Mitigation Technique for Coexistence of Pulse-Based UWB and OFDM

    Directory of Open Access Journals (Sweden)

    Ohno Kohei

    2008-01-01

    Full Text Available Ultra-wideband (UWB) is a useful radio technique for sharing frequency bands between radio systems. It uses very short pulses to spread the spectrum. However, there is a potential for interference between systems using the same frequency bands at close range. In some regulatory systems, interference detection and avoidance (DAA) techniques are required to prevent interference with existing radio systems. In this paper, the effect of interference on orthogonal frequency division multiplexing (OFDM) signals from pulse-based UWB is discussed, and an interference mitigation technique is proposed. This technique focuses on the pulse repetition cycle of UWB. The pulse repetition interval is set to the same as, or half of, the OFDM symbol period excluding the guard interval in order to mitigate interference. The same proposals are also made for direct sequence (DS)-UWB. Bit error rate (BER) performance is illustrated through both simulation and theoretical approximations.

  4. New directions in point-contact spectroscopy based on scanning tunneling microscopy techniques (Review Article)

    International Nuclear Information System (INIS)

    Tartaglini, E.; Verhagen, T.G.A.; Galli, F.; Trouwborst, M.L.; Aarts, J.; Van-Ruitebbeek, J.M.; Muller, R.; Shiota, T.

    2013-01-01

    Thirty-eight years ago, Igor Yanson reported the first point-contact measurement, in which he probed energy-resolved spectroscopy of the electronic scattering inside a metal. Since this first measurement, the point-contact spectroscopy (PCS) technique has improved enormously. The application of scanning probe microscopy (SPM) techniques in the late 1980s allowed contacts with the diameter of a single atom to be achieved. With the introduction of the mechanically controlled break junction technique, spectroscopy could even be performed on freely suspended chains of atoms. In this paper, we briefly review the current developments of PCS and show recent experiments in advanced scanning PCS based on SPM techniques. We describe some results obtained with both needle-anvil type point contacts and scanning tunneling microscopy (STM). We also show our first attempt to lift up a chain of single gold atoms from a Au(110) surface with an STM.

  5. Using a Voltage Domain Programmable Technique for Low-Power Management Cell-Based Design

    Directory of Open Access Journals (Sweden)

    Ching-Hwa Cheng

    2011-09-01

    Full Text Available The Multi-voltage technique is an effective way to reduce power consumption. In the proposed cell-based voltage domain programmable (VDP technique, the high and low voltages applied to logic gates are programmable. The flexible voltage domain reassignment allows the chip performance and power consumption to be dynamically adjusted. In the proposed technique, the power switches possess the feature of flexible programming after chip manufacturing. This VDP method does not use an external voltage regulator to regulate the supply voltage level from outside of the chip but can be easily integrated within the design. This novel technique is proven by use of a video decoder test chip, which shows 55% and 61% power reductions compared to conventional single-Vdd and low-voltage designs, respectively. This power-aware performance adjusting mechanism shows great power reduction with a good power-performance management mechanism.

  6. Mobility Based Key Management Technique for Multicast Security in Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    B. Madhusudhanan

    2015-01-01

    Full Text Available In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques suffer under frequent mobility and disconnections. So there is a need to design a multicast key management technique to overcome these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated based on link availability and mobility. A multicast tree is constructed such that for every weak node there is a strong parent node. A session-key-based encryption technique is utilized to transmit multicast data. The rekeying process is performed periodically by the initiator node. The rekeying interval is fixed depending on the node category, so this technique greatly minimizes the rekeying overhead. By simulation results, we show that our proposed approach reduces the packet drop rate and improves data confidentiality.

  7. Improvement in QEPAS system utilizing a second harmonic based wavelength calibration technique

    Science.gov (United States)

    Zhang, Qinduan; Chang, Jun; Wang, Fupeng; Wang, Zongliang; Xie, Yulei; Gong, Weihua

    2018-05-01

    A simple laser wavelength calibration technique, based on the second harmonic signal, is demonstrated in this paper to improve the performance of a quartz enhanced photoacoustic spectroscopy (QEPAS) gas sensing system, e.g. improving the signal to noise ratio (SNR), detection limit and long-term stability. A constant current corresponding to the gas absorption line, combined with an f/2-frequency sinusoidal signal, is used to drive the laser (constant driving mode), and a software-based real-time wavelength calibration technique is developed to eliminate the wavelength drift due to ambient fluctuations. Compared to conventional wavelength modulation spectroscopy (WMS), this method allows a lower filtering bandwidth and an averaging algorithm to be applied to the QEPAS system, improving the SNR and detection limit. In addition, the real-time wavelength calibration technique guarantees that the laser output is modulated steadily at the gas absorption line. Water vapor is chosen as the target gas to evaluate its performance compared to the constant driving mode and a conventional WMS system. The water vapor sensor was designed to be insensitive to incoherent external acoustic noise by means of the numerical averaging technique. As a result, the SNR increases by a factor of 12.87 in the wavelength-calibration-based system compared to the conventional WMS system. The new system achieved a better linear response (R2 = 0.9995) in the concentration range from 300 to 2000 ppmv, and achieved a minimum detection limit (MDL) of 630 ppbv.

  8. An Overview on Base Real-Time Hard Shadow Techniques in Virtual Environments

    Directory of Open Access Journals (Sweden)

    Mohd Shahrizal Sunar

    2012-03-01

    Full Text Available Shadows are essential for creating realistic scenes in virtual environments; the variety of shadow techniques encouraged us to prepare an overview of all base shadow techniques. Non-real-time and real-time techniques are the two main subdivisions of shadow generation. Among the non-real-time techniques, ray tracing, ray casting and radiosity are well known and are described in depth. Radiosity is used to create very realistic shadows in non-real-time scenes. Although the traditional radiosity algorithm is difficult to implement, we propose a simple one. The proposed pseudo code is easier to understand and implement. Ray tracing is used to prevent collisions of moving objects. Projection shadows, shadow volumes and shadow mapping are used to create real-time shadows in virtual environments. We use projection shadows for objects that are static and cast shadows on a flat surface. Shadow volumes are used to create accurate shadows with sharp outlines. Shadow mapping, which is the basis of most recent techniques, is reconstructed. The reconstructed algorithm gives new ideas for proposing another algorithm based on shadow mapping.

  9. Refinement of homology-based protein structures by molecular dynamics simulation techniques

    NARCIS (Netherlands)

    Fan, H; Mark, AE

    The use of classical molecular dynamics simulations, performed in explicit water, for the refinement of structural models of proteins generated ab initio or based on homology has been investigated. The study involved a test set of 15 proteins that were previously used by Baker and coworkers to

  10. Mobile Augmented Reality Support for Architects Based on Feature Tracking Techniques

    DEFF Research Database (Denmark)

    Bang Nielsen, Michael; Kramp, Gunnar; Grønbæk, Kaj

    2004-01-01

    on the horizon view from an office building, while working on a courtyard garden proposal. The SitePack applies a novel combination of GPS tracking and vision based feature tracking in its support for architects. The SitePack requires no preparation of the site and combines and extends the strengths of previous...

  11. Development of the Risk-Based Inspection Techniques and Pilot Plant Activities

    International Nuclear Information System (INIS)

    Phillips, J.H.

    1997-01-01

    Risk-based techniques have been developed for commercial nuclear power plants. System boundaries and success criteria are defined using the probabilistic risk analysis or probabilistic safety analysis developed to meet the individual plant evaluation. Final ranking of components is performed by a plant expert panel similar to the one developed for the maintenance rule. Components are identified as being high-risk-significant or low-risk-significant. Maintenance and resources are focused on those components that have the highest risk significance. The techniques have been developed and applied at a number of pilot plants. Results from the first risk-based inspection pilot plant indicate that safety with respect to pipe failure can be doubled while inspection is reduced to about 80% compared with current inspection programs. The reduction in inspection reduces the person-rem exposure, resulting in further increases in safety. These techniques have been documented in a publication by the ASME CRTD

  12. Novel stability criteria for fuzzy Hopfield neural networks based on an improved homogeneous matrix polynomials technique

    International Nuclear Information System (INIS)

    Feng Yi-Fu; Zhang Qing-Ling; Feng De-Zhi

    2012-01-01

    The global stability problem of Takagi-Sugeno (T-S) fuzzy Hopfield neural networks (FHNNs) with time delays is investigated. Novel LMI-based stability criteria are obtained by using Lyapunov functional theory to guarantee the asymptotic stability of the FHNNs with less conservatism. Firstly, using both Finsler's lemma and an improved homogeneous matrix polynomial technique, and applying an affine parameter-dependent Lyapunov-Krasovskii functional, we obtain the convergent LMI-based stability criteria. Algebraic properties of the fuzzy membership functions in the unit simplex are considered in the stability analysis via the homogeneous matrix polynomials technique. Secondly, to further reduce the conservatism, a new technique for introducing right-hand-side slack variables is also proposed in terms of LMIs, which is suitable for the homogeneous matrix polynomial setting. Finally, two illustrative examples are given to show the efficiency of the proposed approaches

  13. A robust calibration technique for acoustic emission systems based on momentum transfer from a ball drop

    Science.gov (United States)

    McLaskey, Gregory C.; Lockner, David A.; Kilgore, Brian D.; Beeler, Nicholas M.

    2015-01-01

    We describe a technique to estimate the seismic moment of acoustic emissions and other extremely small seismic events. Unlike previous calibration techniques, it does not require modeling of the wave propagation, sensor response, or signal conditioning. Rather, this technique calibrates the recording system as a whole and uses a ball impact as a reference source or empirical Green's function. To correctly apply this technique, we develop mathematical expressions that link the seismic moment $M_0$ of internal seismic sources (i.e., earthquakes and acoustic emissions) to the impulse, or change in momentum $\Delta p$, of externally applied seismic sources (i.e., meteor impacts or, in this case, ball impact). We find that, at low frequencies, moment and impulse are linked by a constant, which we call the force-moment-rate scale factor $C_{F\dot{M}} = M_0/\Delta p$. This constant is equal to twice the speed of sound in the material from which the seismic sources were generated. Next, we demonstrate the calibration technique on two different experimental rock mechanics facilities. The first example is a saw-cut cylindrical granite sample that is loaded in a triaxial apparatus at 40 MPa confining pressure. The second example is a 2 m long fault cut in a granite sample and deformed in a large biaxial apparatus at lower stress levels. Using the empirical calibration technique, we are able to determine absolute source parameters including the seismic moment, corner frequency, stress drop, and radiated energy of these magnitude −2.5 to −7 seismic events.
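    A small numeric sketch of the relation follows: the impulse delivered by a bouncing ball is converted to an equivalent low-frequency seismic moment via the force-moment-rate scale factor $C_{F\dot{M}} = 2c$. The ball mass, drop height, coefficient of restitution, and wave speed are assumed values for illustration, not the parameters of the experiments described above.

```python
import numpy as np

def ball_impulse(mass_kg, drop_height_m, restitution=0.6, g=9.81):
    """Change in momentum delivered by a ball bouncing off the sample surface."""
    v_in = np.sqrt(2.0 * g * drop_height_m)
    return mass_kg * v_in * (1.0 + restitution)

def equivalent_moment(impulse_Ns, wave_speed_m_s):
    """Force-moment-rate scale factor C = 2*c links impulse to the equivalent
    low-frequency seismic moment M0 = C * delta_p."""
    return 2.0 * wave_speed_m_s * impulse_Ns

dp = ball_impulse(mass_kg=0.001, drop_height_m=0.3)          # assumed small steel ball
print("M0 equivalent [N*m]:", equivalent_moment(dp, 5500.0))  # assumed wave speed in granite
```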

  14. Overdenture retained by teeth using a definitive denture base technique: a case report.

    Science.gov (United States)

    Nascimento, D F F; dos Santos, J F F; Marchini, L

    2010-09-01

    This paper presents a technique involving the use of a definitive denture base to make overdentures. Cores with ball attachments were cemented over remaining lower teeth. Impressions of the edentulous maxilla and mandible were taken to obtain a definitive acrylic resin base. The definitive base of the mandible was perforated at the location of ball attachments and its female components were fixed to the base using acrylic resin directly in the patient's mouth. Wax rims were then made, jaw relationships recorded, teeth mounted and tried in, and the dentures were cured. This technique allowed for easy fixing of female components and better retention during the recording of jaw relationships, and can also be used in the construction of implant retained dentures.

  15. Development and application of the analyzer-based imaging technique with hard synchrotron radiation

    International Nuclear Information System (INIS)

    Coan, P.

    2006-07-01

    The objective of this thesis is twofold: on the one hand, the application of analyser-based X-ray phase contrast imaging to study cartilage, bone and bone implants using ESRF synchrotron radiation sources, and on the other, a contribution to the development of phase contrast techniques from the theoretical and experimental point of view. Several human samples have been studied in vitro using the analyser-based imaging (ABI) technique. Examination included projection and computed tomography imaging and 3-dimensional volume rendering of hip, big toe and ankle articular joints. X-ray ABI images have been critically compared with those obtained with conventional techniques, including radiography, computed tomography, ultrasound, magnetic resonance and histology, the latter taken as the gold standard. Results show that only ABI imaging was able to either visualize or correctly estimate the early pathological status of the cartilage. The status of bone ingrowth in sheep implants has also been examined in vitro: ABI images permitted good and incomplete bone healing to be correctly distinguished. Pioneering in-vivo ABI on guinea pigs was also successfully performed, confirming the possible use of the technique to follow up the progression of joint diseases, bone/metal ingrowth and the efficacy of drug treatments. As part of the development of the phase contrast techniques, two objectives have been reached. First, it has been experimentally demonstrated for the first time that ABI and propagation based imaging (PBI) can be combined to create images with original features (hybrid imaging, HI). Secondly, a new simplified set-up capable of producing images with properties similar to those obtained with the ABI technique or HI has been proposed and experimentally tested. Finally, both ABI and HI have been theoretically studied with an innovative, wave-based simulation program, which was able to correctly reproduce experimental results. (author)

  16. Compressed sensing techniques for receiver based post-compensation of transmitter's nonlinear distortions in OFDM systems

    KAUST Repository

    Owodunni, Damilola S.; Ali, Anum Z.; Quadeer, Ahmed Abdul; Al-Safadi, Ebrahim B.; Hammi, Oualid; Al-Naffouri, Tareq Y.

    2014-01-01

    -domain, and three compressed sensing based algorithms are presented to estimate and compensate for these distortions at the receiver using a few and, at times, even no frequency-domain free carriers (i.e. pilot carriers). The first technique is a conventional

  17. Mobile Augmented Reality Support for Architects based on feature Tracking Techniques

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Nielsen, Mikkel Bang; Kramp, Gunnar

    2004-01-01

    This paper presents a mobile Augmented Reality (AR) system called the SitePack supporting architects in visualizing 3D models in real-time on site. We describe how vision based feature tracking techniques can help architects making decisions on site concerning visual impact assessment. The AR sys...

  18. Assessment of the impact strength of the denture base resin polymerized by various processing techniques

    Directory of Open Access Journals (Sweden)

    Rajashree Jadhav

    2013-01-01

    Full Text Available Aim: To measure the impact strength of denture base resins polymerized using short and long curing cycles by water bath, pressure cooker and microwave techniques. Materials and Methods: For impact strength testing, 60 samples were made. The sample dimensions were 60 mm × 12 mm × 3 mm, as standardized by the American Society for Testing and Materials (ASTM). A digital caliper was used to locate the midpoint of each sample. The impact strength was measured with an IZOD-type CEAST impact tester; the pendulum struck the sample and broke it, and the energy required to break the sample was recorded in Joules. Data were analyzed using Student's t-test. Results: There was a statistically significant difference in the impact strength of denture base resins polymerized by the long and short curing cycles in each technique, with the long curing cycle performing best. Conclusion: The polymerization technique plays an important role in the impact strength of denture base resin. This research demonstrates that denture base resin polymerized by the microwave processing technique possessed the highest impact strength.

  19. New Strategies for Powder Compaction in Powder-based Rapid Prototyping Techniques

    NARCIS (Netherlands)

    Budding, A.; Vaneker, Thomas H.J.

    2013-01-01

    In powder-based rapid prototyping techniques, powder compaction is used to create thin layers of fine powder that are locally bonded. By stacking these layers of locally bonded material, an object is made. The compaction of thin layers of powder materials is of interest for a wide range of

  20. Fast high resolution ADC based on the flash type with a special error correcting technique

    Energy Technology Data Exchange (ETDEWEB)

    Xiao-Zhong, Liang; Jing-Xi, Cao [Beijing Univ. (China). Inst. of Atomic Energy

    1984-03-01

    A fast 12-bit ADC based on the flash type with a simple, special error-correcting technique that can effectively compensate for the level drift of the discriminators and the droop of the stretcher voltage is described. The DNL is comparable with that of a Wilkinson ADC, and the long-term drift is far better than that of the Wilkinson ADC.

  1. A New Project-Based Curriculum of Design Thinking with Systems Engineering Techniques

    NARCIS (Netherlands)

    Haruyama, S.; Kim, S.K.; Beiter, K.A.; Dijkema, G.P.J.; De Weck, O.L.

    2012-01-01

    We developed a new education curriculum called "ALPS" (Active Learning Project Sequence) at Keio University that emphasizes team project-based learning and design thinking with systems engineering techniques. ALPS is a 6 month course, in which students work as a team and design and propose

  2. Multiple-output all-optical header processing technique based on two-pulse correlation principle

    NARCIS (Netherlands)

    Calabretta, N.; Liu, Y.; Waardt, de H.; Hill, M.T.; Khoe, G.D.; Dorren, H.J.S.

    2001-01-01

    A serial all-optical header processing technique based on a two-pulse correlation principle in a semiconductor laser amplifier in a loop mirror (SLALOM) configuration that can have a large number of output ports is presented. The operation is demonstrated experimentally at a 10Gbit/s Manchester

  3. Adjustments of microwave-based measurements on coal moisture using natural radioactivity techniques

    Energy Technology Data Exchange (ETDEWEB)

    Prieto-Fernandez, I.; Luengo-Garcia, J.C.; Alonso-Hidalgo, M.; Folgueras-Diaz, B. [University of Oviedo, Gijon (Spain)

    2006-01-07

    The use of nonconventional on-line measurements of moisture and ash content in coal is presented. The background research is briefly reviewed. The possibilities of adjusting microwave-based moisture measurements using natural radioactive techniques, and vice versa, are proposed. The results obtained from the simultaneous analysis of moisture and ash content as well as the correlation improvements are shown.

  4. An Experimental Technique for Structural Diagnostic Based on Laser Vibrometry and Neural Networks

    Directory of Open Access Journals (Sweden)

    Paolo Castellini

    2000-01-01

    Full Text Available In recent years damage detection techniques based on vibration data have been largely investigated with promising results for many applications. In particular, several attempts have been made to determine which kind of data should be extracted for damage monitoring.

  5. Multidimensional Test Assembly Based on Lagrangian Relaxation Techniques. Research Report 98-08.

    Science.gov (United States)

    Veldkamp, Bernard P.

    In this paper, a mathematical programming approach is presented for the assembly of ability tests measuring multiple traits. The values of the variance functions of the estimators of the traits are minimized, while test specifications are met. The approach is based on Lagrangian relaxation techniques and provides good results for the two…

  6. An Active Damping Technique for Small DC-Link Capacitor Based Drive System

    DEFF Research Database (Denmark)

    Maheshwari, Ram Krishan; Munk-Nielsen, Stig; Lu, Kaiyuan

    2013-01-01

    A small dc-link capacitor based drive system shows instability when it is operated with a large input line inductance at operating points with high power. This paper presents a simple, new active damping technique that can effectively stabilize the drive system at unstable operating points, offering...

  7. Efficient Bayesian Compressed Sensing-based Channel Estimation Techniques for Massive MIMO-OFDM Systems

    OpenAIRE

    Al-Salihi, Hayder Qahtan Kshash; Nakhai, Mohammad Reza

    2017-01-01

    Efficient and highly accurate channel state information (CSI) at the base station (BS) is essential to achieve the potential benefits of massive multiple input multiple output (MIMO) systems. However, the accuracy attainable in practice is limited due to the problem of pilot contamination. It has recently been shown that compressed sensing (CS) techniques can address the pilot contamination problem. However, CS-based channel estimation requires prior knowledge of channel sp...

  8. Behaviour change techniques in home-based cardiac rehabilitation: a systematic review

    OpenAIRE

    Heron, Neil; Kee, Frank; Donnelly, Michael; Cardwell, Christopher; Tully, Mark A; Cupples, Margaret E

    2016-01-01

    BACKGROUND: Cardiac rehabilitation (CR) programmes offering secondary prevention for cardiovascular disease (CVD) advise healthy lifestyle behaviours, with the behaviour change techniques (BCTs) of goals and planning, feedback and monitoring, and social support recommended. More information is needed about BCT use in home-based CR to support these programmes in practice.AIM: To identify and describe the use of BCTs in home-based CR programmes.DESIGN AND SETTING: Randomised controlled trials o...

  9. A comparison of base running and sliding techniques in collegiate baseball with implications for sliding into first base

    Directory of Open Access Journals (Sweden)

    Travis Ficklin

    2016-09-01

    Conclusion: There was a non-significant trend toward an advantage for diving into first base over running through it, but more research is needed; even if the advantage is real, the risks of executing this technique probably outweigh the minuscule gain.

  10. Training community matrons in basic cognitive behavioural therapy-based techniques for patients with COPD.

    Science.gov (United States)

    Barker, David; Davies, Caroline; Dixon, Brendan; Hodgson, Amanda; Reay, Simon; Barclay, Nicola

    2014-06-01

    People with chronic obstructive pulmonary disease (COPD) experience anxiety and depression at a higher prevalence than the general population. Despite this, they are under-represented in mental health services on a national scale, but they do regularly have contact with health-care professionals such as community matrons. We aimed to explore the process of CBT-based skills supervision and the practical implications for community matrons using CBT-based techniques with COPD clients as part of their standard practice. Twenty community matrons took part in a 2-day CBT-based skills training programme. Measurements of their knowledge and understanding of CBT-based skills were taken before and after the training. Additionally, they completed written feedback relating to the training. They were then supervised by CBT therapists for 6 months. Written feedback was obtained following this, and some supervisors participated in a 1-hour focus group to discuss the process. Community matrons' knowledge and understanding of CBT-based techniques significantly improved following training, although the findings indicated that training alone did not always translate seamlessly into effective practice. Supervisor feedback suggested that it would be beneficial for community matrons to share positive practice in their use of basic CBT-based techniques among peers to maximise the perceived value of this approach.

  11. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    Science.gov (United States)

    2017-11-01

    Report front matter only (no abstract available): authors Canh Ly, Nghia Tran, and Ozlem Kilic; approved for public release.

  12. Encoding technique for high data compaction in data bases of fusion devices

    International Nuclear Information System (INIS)

    Vega, J.; Cremy, C.; Sanchez, E.; Portas, A.; Dormido, S.

    1996-01-01

    At present, data requirements of hundreds of Mbytes/discharge are typical in devices such as JET, TFTR, DIII-D, etc., and these requirements continue to increase. With these rates, the amount of storage required to maintain discharge information is enormous. Compaction techniques are now essential to reduce storage. However, general compression techniques may distort signals, which is undesirable for fusion diagnostics. We have developed a general technique for data compression which is described here. The technique, which is based on delta compression, does not require an examination of the data as in delayed methods. Delta values are compacted according to general encoding forms which satisfy a prefix code property and which are defined prior to data capture. Several prefix codes, which are bit oriented and which have variable code lengths, have been developed. These encoding methods are independent of the signal analog characteristics and enable one to store undistorted signals. The technique has been applied to databases of the TJ-I tokamak and the TJ-IU torsatron. Compaction rates of over 80% with negligible computational effort were achieved. Computer programs were written in ANSI C, thus ensuring portability and easy maintenance. We also present an interpretation, based on information theory, of the high compression rates achieved without signal distortion. copyright 1996 American Institute of Physics
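
    As a rough illustration of the delta-compression idea described in this record, the following Python sketch packs successive differences with a simple two-length prefix code (a 4-bit short form and a 16-bit escape form). The specific code lengths are assumptions for illustration; the actual bit-oriented codes used for the TJ-I and TJ-IU databases are not reproduced, but the sketch preserves the key property that decoding restores the signal without distortion.

```python
def encode_deltas(samples):
    """Delta-compress integer samples with a simple two-length prefix code.

    Minimal sketch of the idea in the record (delta values packed with
    variable-length prefix codes defined before capture); the code lengths
    here are illustrative assumptions.
    """
    bits = []
    prev = 0
    for s in samples:
        d = s - prev
        prev = s
        if -8 <= d <= 7:
            # '0' prefix + 4-bit two's-complement delta
            bits.append('0' + format(d & 0xF, '04b'))
        else:
            # '1' prefix + 16-bit two's-complement delta (escape code)
            bits.append('1' + format(d & 0xFFFF, '016b'))
    return ''.join(bits)


def decode_deltas(bitstream):
    """Invert encode_deltas; exact reconstruction, no distortion."""
    out, prev, i = [], 0, 0
    while i < len(bitstream):
        if bitstream[i] == '0':
            raw = int(bitstream[i + 1:i + 5], 2)
            d = raw - 16 if raw >= 8 else raw
            i += 5
        else:
            raw = int(bitstream[i + 1:i + 17], 2)
            d = raw - 65536 if raw >= 32768 else raw
            i += 17
        prev += d
        out.append(prev)
    return out


signal = [100, 101, 103, 103, 90, 250]
packed = encode_deltas(signal)
assert decode_deltas(packed) == signal
```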

  13. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    International Nuclear Information System (INIS)

    Han, G.; Lin, B.; Xu, Z.

    2017-01-01

    The electrocardiogram (ECG) is a weak, nonlinear and non-stationary signal which reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal is a vital step in ECG signal processing and plays a significant role in the detection and diagnosis of heart disease. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising but not yet perfect method for processing nonlinear and non-stationary signals such as the ECG. Combining EMD with other algorithms is a good way to improve noise-cancellation performance. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in EMD-based ECG signal denoising are outlined.

  14. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    Science.gov (United States)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    The electrocardiogram (ECG) is a weak, nonlinear and non-stationary signal which reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal is a vital step in ECG signal processing and plays a significant role in the detection and diagnosis of heart disease. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising but not yet perfect method for processing nonlinear and non-stationary signals such as the ECG. Combining EMD with other algorithms is a good way to improve noise-cancellation performance. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in EMD-based ECG signal denoising are outlined.
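
    A minimal sketch of the basic EMD denoising step surveyed in these two records is given below. It assumes the third-party PyEMD package (EMD-signal) as one available EMD implementation and simply subtracts the first few intrinsic mode functions, which carry most of the high-frequency noise; IMF thresholding, powerline separation and baseline-wander correction are not shown.

```python
import numpy as np
# Assumes the PyEMD package (pip install EMD-signal); it is one common EMD
# implementation, not necessarily the one used in the works reviewed above.
from PyEMD import EMD

def emd_denoise(ecg, n_noise_imfs=2):
    """Crude EMD-based high-frequency denoising: decompose the signal into
    intrinsic mode functions (IMFs) and subtract the first few IMFs, which
    carry most of the high-frequency noise.
    """
    ecg = np.asarray(ecg, dtype=float)
    imfs = EMD().emd(ecg)
    n_drop = min(n_noise_imfs, max(len(imfs) - 1, 0))
    return ecg - imfs[:n_drop].sum(axis=0)

# Hypothetical test signal: a slow "ECG-like" oscillation plus wideband noise.
t = np.linspace(0, 10, 2000)
clean = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 0.3 * t)
noisy = clean + 0.2 * np.random.randn(t.size)
denoised = emd_denoise(noisy)
print(np.std(noisy - clean), np.std(denoised - clean))
```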

  15. On-line diagnostic techniques for air-operated control valves based on time series analysis

    International Nuclear Information System (INIS)

    Ito, Kenji; Matsuoka, Yoshinori; Minamikawa, Shigeru; Komatsu, Yasuki; Satoh, Takeshi.

    1996-01-01

    The objective of this research is to study the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves, a type of valve used in large numbers in PWR plants. In general, such techniques can detect anomalies in the early stages of failure that are difficult to detect by conventional surveillance of directly measured process parameters. However, the effectiveness of these techniques depends on the system being diagnosed. The difficulties in applying diagnostic techniques to air-operated control valves seem to come from the reduced sensitivity of their response compared with hydraulic control systems, as well as the need to identify anomalies in low-level signals that fluctuate only slightly but continuously. In this research, simulation tests were performed by setting various kinds of failure modes for a test valve with the same specifications as a valve actually used in the plants. Actual control signals recorded from an operating plant were then used as input signals for the simulation. The results of the tests confirmed the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves. (author)

  16. Prediction of drug synergy in cancer using ensemble-based machine learning techniques

    Science.gov (United States)

    Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder

    2018-04-01

    Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic success. Different drug-drug interactions can be examined via a drug synergy score, which calls for efficient regression-based machine learning approaches to minimize the prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to meet this requirement. However, these techniques individually do not provide sufficient accuracy in the drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented on the drug synergy data. Based on the accuracy of each model, the four techniques with the highest accuracy are selected to develop an ensemble-based machine learning model. These models are Random Forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), the Adaptive-Network-Based Fuzzy Inference System (ANFIS) and the Dynamic Evolving Neural-Fuzzy Inference System (DENFIS) method. Ensembling is achieved by a biased weighted aggregation of the predictions of the selected models (i.e., models with higher prediction accuracy receive larger weights). The proposed and existing machine learning techniques have been evaluated on the drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error and coefficient of correlation.
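
    The ensembling step described in this record amounts to a weighted average in which more accurate base models receive larger weights. The following Python sketch shows one such biased weighted aggregation using inverse validation RMSE as the weight; the model names and numbers are placeholders, and the exact weighting rule of the paper is not reproduced.

```python
import numpy as np

def biased_weighted_ensemble(predictions, validation_rmse):
    """Aggregate base-model predictions with weights biased toward the more
    accurate models (a rough sketch of the ensembling idea in the record).

    predictions: dict of model_name -> array of predicted synergy scores
    validation_rmse: dict of model_name -> RMSE of that model on validation data
    """
    names = list(predictions)
    # Lower RMSE -> higher weight; inverse-error weights, normalized to sum to 1.
    w = np.array([1.0 / validation_rmse[n] for n in names])
    w /= w.sum()
    stacked = np.vstack([predictions[n] for n in names])
    return w @ stacked


# Hypothetical example with four base learners (names are placeholders only).
preds = {
    "random_forest": np.array([0.62, 0.10, 0.45]),
    "gfs_gccl":      np.array([0.55, 0.18, 0.40]),
    "anfis":         np.array([0.60, 0.12, 0.50]),
    "denfis":        np.array([0.58, 0.20, 0.42]),
}
rmse = {"random_forest": 0.08, "gfs_gccl": 0.12, "anfis": 0.09, "denfis": 0.11}
print(biased_weighted_ensemble(preds, rmse))
```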

  17. Knowledge based systems: A critical survey of major concepts, issues and techniques. Visuals

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-9. The objectives of the report are to: examine various techniques used to build the KBS; to examine at least one KBS in detail, i.e., a case study; to list and identify limitations and problems with the KBS; to suggest future areas of research; and to provide extensive reference materials.

  18. Synchronization of uncertain time-varying network based on sliding mode control technique

    Science.gov (United States)

    Lü, Ling; Li, Chengren; Bai, Suyuan; Li, Gang; Rong, Tingting; Gao, Yan; Yan, Zhe

    2017-09-01

    We study the synchronization of an uncertain time-varying network based on the sliding mode control technique. The sliding mode control technique is first modified so that it can be applied to network synchronization. Further, by choosing an appropriate sliding surface, the identification law for the uncertain parameters, the adaptive law for the time-varying coupling matrix elements and the network control input are designed, ensuring that the uncertain time-varying network synchronizes effectively with the synchronization target. Finally, we perform numerical simulations to demonstrate the effectiveness of the proposed results.

  19. Clock-Frequency Switching Technique for Energy Saving of Microcontroller Unit (MCU)-Based Sensor Node

    Directory of Open Access Journals (Sweden)

    Pumin Duangmanee

    2018-05-01

    Full Text Available In this paper, a technique is proposed for reducing the energy consumption of microcontroller-based sensor nodes by switching the operating clock between low and high frequencies. The proposed concept is motivated by the fact that if the application code of the microcontroller unit (MCU) consists of no-wait-state instruction sets, the MCU consumes less energy when it operates at a higher frequency. When the application code of the MCU consists of wait instruction sets, e.g., waiting for an acknowledge signal, it switches to the low clock frequency. The experimental results confirm that the proposed technique can reduce the MCU energy consumption by up to 66.9%.

  20. Auto-correlation based intelligent technique for complex waveform presentation and measurement

    International Nuclear Information System (INIS)

    Rana, K P S; Singh, R; Sayann, K S

    2009-01-01

    Waveform acquisition and presentation form the heart of many measurement systems. In particular, acquisition and presentation of repeating complex signals such as sine sweeps and frequency-modulated signals introduce the challenge of waveform time-period estimation and live waveform presentation. This paper presents an intelligent technique for waveform period estimation of both complex and simple waveforms, based on the normalized auto-correlation method. The proposed technique is demonstrated using intensive LabVIEW-based simulations on several simple and complex waveforms. Implementation of the technique is successfully demonstrated using LabVIEW-based virtual instrumentation. Sine sweep vibration waveforms generated by an electrodynamic shaker system are successfully presented and measured. The proposed method is also suitable for digital storage oscilloscope (DSO) triggering and for complex signal acquisition and presentation. This intelligence can be embedded into the DSO, making it an intelligent measurement system catering to a wide variety of waveforms. The proposed technique, simulation results, robustness study and implementation results are presented in this paper.
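
    The core of the described technique is locating the first dominant peak of the normalized auto-correlation after lag zero. The NumPy sketch below illustrates that step on a synthetic tone; it is only a simplified stand-in for the LabVIEW implementation, and the 0.5 threshold used to skip the zero-lag lobe is an assumption.

```python
import numpy as np

def estimate_period(x, fs):
    """Estimate the repetition period of a waveform from its normalized
    auto-correlation: the lag of the first dominant peak after lag zero.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode='full')[len(x) - 1:]   # lags 0 .. N-1
    acf /= acf[0]                                        # normalize to 1 at lag 0
    # First dominant maximum after the correlation has dipped below a threshold.
    below = np.where(acf < 0.5)[0]
    if below.size == 0:
        return None
    start = below[0]
    lag = start + np.argmax(acf[start:])
    return lag / fs                                      # period in seconds


fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 25 * t) + 0.1 * np.random.randn(t.size)
print(estimate_period(sig, fs))   # ~0.04 s for a 25 Hz tone
```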

  1. The role of graphene-based sorbents in modern sample preparation techniques.

    Science.gov (United States)

    de Toffoli, Ana Lúcia; Maciel, Edvaldo Vasconcelos Soares; Fumes, Bruno Henrique; Lanças, Fernando Mauro

    2018-01-01

    The application of graphene-based sorbents in sample preparation techniques has increased significantly since 2011. These materials have good physicochemical properties for use as sorbents and have shown excellent results in different sample preparation techniques. Graphene and its precursor graphene oxide have been considered good candidates to improve the extraction and concentration of different classes of target compounds (e.g., parabens, polycyclic aromatic hydrocarbons, pyrethroids, triazines, and so on) present in complex matrices. They have been employed in the analysis of different matrices (e.g., environmental, biological and food). In this review, we highlight the most important characteristics of graphene-based materials, their properties, synthesis routes, and the most important applications in both off-line and on-line sample preparation techniques. The discussion of the off-line approaches includes methods derived from conventional solid-phase extraction, focusing on the miniaturized magnetic and dispersive modes. The microextraction techniques called stir bar sorptive extraction, solid phase microextraction, and microextraction by packed sorbent are discussed. The on-line approaches focus on the use of graphene-based materials mainly in on-line solid phase extraction, its variation called in-tube solid-phase microextraction, and on-line microdialysis systems. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Lamb-Wave-Based Tomographic Imaging Techniques for Hole-Edge Corrosion Monitoring in Plate Structures

    Directory of Open Access Journals (Sweden)

    Dengjiang Wang

    2016-11-01

    Full Text Available This study presents a novel monitoring method for hole-edge corrosion damage in plate structures based on Lamb wave tomographic imaging techniques. An experimental procedure with a cross-hole layout using 16 piezoelectric transducers (PZTs) was designed. The A0 mode of the Lamb wave was selected, which is sensitive to thickness-loss damage. The iterative algebraic reconstruction technique (ART) was used to locate and quantify the corrosion damage at the edge of the hole. Hydrofluoric acid with a concentration of 20% was used to corrode the specimen artificially. To estimate the effectiveness of the proposed method, the real corrosion damage was compared with the corrosion damage predicted by the tomographic method. The results show that the Lamb-wave-based tomographic method can be used to monitor hole-edge corrosion damage accurately.

  3. A proposed security technique based on watermarking and encryption for digital imaging and communications in medicine

    Directory of Open Access Journals (Sweden)

    Mohamed M. Abd-Eldayem

    2013-03-01

    Full Text Available Nowadays, modern Hospital Data Management Systems (HDMSs) are deployed on computer networks, and medical equipment produces medical images in digital form. An HDMS must store and exchange these images in a secured environment to provide image integrity and patient privacy. Reversible watermarking techniques can be used to provide this integrity and privacy. In this paper, a security technique based on watermarking and encryption is proposed for Digital Imaging and Communications in Medicine (DICOM). It provides patient authentication, information confidentiality and integrity based on a reversible watermark. To achieve the integrity service at the sender side, a hash value based on encrypted MD5 is determined from the image. To satisfy the reversibility feature, an R–S vector is determined from the image and compressed with a Huffman compression algorithm. Then, to provide confidentiality and authentication services, the compressed R–S vector, the hash value and the patient ID are concatenated to form a watermark, this watermark is encrypted using the AES encryption technique, and finally the watermark is embedded inside the medical image. Experimental results prove that the proposed technique can provide patient authentication, image integrity and information confidentiality services with excellent efficiency. Results for all tested DICOM medical images and natural images show the following: BER equals 0, both SNR and PSNR are consistent and have large values, and MSE has a low value; the average values of SNR, PSNR and MSE are 52 dB, 57 dB and 0.12, respectively. Therefore, watermarked images have high imperceptibility, invisibility and transparency. In addition, the watermark extracted from the image at the receiver side is identical to the watermark embedded into the image at the sender side; as a result, the proposed technique is totally reversible, and the embedded watermark does not
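
    The watermark-assembly step described in this record (hash, concatenation with the compressed R–S vector and patient ID, then AES encryption) can be sketched as follows in Python using the standard hashlib module and the third-party 'cryptography' package (AES in CTR mode). Key management, Huffman compression of the R–S vector and the reversible embedding itself are outside the scope of this simplified sketch.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def build_encrypted_watermark(image_bytes, rs_vector_compressed, patient_id, key):
    """Assemble and encrypt the watermark payload along the lines described in
    the record: MD5 hash of the image || compressed R-S vector || patient ID,
    then AES encryption. Simplified sketch only; not the paper's exact scheme.
    """
    digest = hashlib.md5(image_bytes).digest()            # integrity hash (16 bytes)
    payload = digest + rs_vector_compressed + patient_id.encode('utf-8')

    nonce = os.urandom(16)                                # per-image CTR nonce
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = encryptor.update(payload) + encryptor.finalize()
    return nonce + ciphertext                             # watermark to be embedded


# Hypothetical usage with placeholder data.
key = os.urandom(16)                                      # 128-bit AES key
wm = build_encrypted_watermark(b'\x00' * 1024, b'\x01\x02\x03', 'PATIENT-0001', key)
print(len(wm), 'watermark bytes to embed')
```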

  4. A correction scheme for thermal conductivity measurement using the comparative cut-bar technique based on 3D numerical simulation

    International Nuclear Information System (INIS)

    Xing, Changhu; Folsom, Charles; Jensen, Colby; Ban, Heng; Marshall, Douglas W

    2014-01-01

    As an important factor affecting the accuracy of thermal conductivity measurement, systematic (bias) error in the guarded comparative axial heat flow (cut-bar) method was mostly neglected by previous research. This bias is primarily due to the thermal conductivity mismatch between the sample and the meter bars (reference), which is common for a sample of unknown thermal conductivity. A correction scheme, based on finite element simulation of the measurement system, was proposed to reduce the magnitude of the overall measurement uncertainty. This scheme was experimentally validated by applying corrections to four types of sample measurements in which the specimen thermal conductivity is much smaller than, slightly smaller than, equal to, or much larger than that of the meter bar. As an alternative to the optimum guarding technique proposed before, the correction scheme can be used to minimize the uncertainty contribution from the measurement system under non-optimal guarding conditions. It is especially necessary for large thermal conductivity mismatches between sample and meter bars. (paper)

  5. Simplified Technique for Incorporating a Metal Mesh into Record Bases for Mandibular Implant Overdentures.

    Science.gov (United States)

    Godoy, Antonio; Siegel, Sharon C

    2015-12-01

    Mandibular implant-retained overdentures have become the standard of care for patients with mandibular complete edentulism. As part of the treatment, the mandibular implant-retained overdenture may require a metal mesh framework to be incorporated to strengthen the denture and avoid fracture of the prosthesis. Integrating the metal mesh framework as part of the acrylic record base and wax occlusion rim before the jaw relation procedure will avoid the distortion of the record base and will minimize the chances of processing errors. A simplified method to incorporate the mesh into the record base and occlusion rim is presented in this technique article. © 2015 by the American College of Prosthodontists.

  6. Synthesis and electrochemical properties of tin oxide-based composite by rheological technique

    International Nuclear Information System (INIS)

    He Zeqiang; Li Xinhai; Xiong Lizhi; Wu Xianming; Xiao Zhuobing; Ma Mingyou

    2005-01-01

    A novel rheological technique was developed to synthesize tin oxide-based composites. The microstructure, morphology, and electrochemical performance of the materials were investigated by X-ray diffraction, scanning electron microscopy and electrochemical methods. The particles of the tin oxide-based materials form an inactive matrix, with an average particle size of about 150 nm. The material delivers a charge capacity of more than 570 mAh g-1, and the capacity loss per cycle is about 0.15% after 30 cycles. The good electrochemical performance indicates that this kind of tin oxide-based material is a promising anode for lithium-ion batteries.

  7. A new user-assisted segmentation and tracking technique for an object-based video editing system

    Science.gov (United States)

    Yu, Hong Y.; Hong, Sung-Hoon; Lee, Mike M.; Choi, Jae-Gark

    2004-03-01

    This paper presents a semi-automatic segmentation method which can be used to generate video object planes (VOPs) for object-based coding schemes and multimedia authoring environments. Semi-automatic segmentation can be considered a user-assisted segmentation technique. A user initially marks objects of interest around the object boundaries, and the user-guided, selected objects are then continuously separated from the unselected areas through time evolution in the image sequence. The proposed segmentation method consists of two processing steps: partially manual intra-frame segmentation and fully automatic inter-frame segmentation. The intra-frame segmentation incorporates user assistance to define the meaningful, complete visual object of interest to be segmented and determines a precise object boundary. The inter-frame segmentation involves boundary and region tracking to obtain temporal coherence of the moving object based on the object boundary information of the previous frame. The proposed method shows stable, efficient results that could be suitable for many digital video applications such as multimedia content authoring, content-based coding and indexing. Based on these results, we have developed an object-based video editing system with several convenient editing functions.

  8. PELE:  Protein Energy Landscape Exploration. A Novel Monte Carlo Based Technique.

    Science.gov (United States)

    Borrelli, Kenneth W; Vitalis, Andreas; Alcantara, Raul; Guallar, Victor

    2005-11-01

    Combining protein structure prediction algorithms and Metropolis Monte Carlo techniques, we provide a novel method to explore all-atom energy landscapes. The core of the technique is based on a steered localized perturbation followed by side-chain sampling as well as minimization cycles. The algorithm and its application to ligand diffusion are presented here. Ligand exit pathways are successfully modeled for different systems containing ligands of various sizes:  carbon monoxide in myoglobin, camphor in cytochrome P450cam, and palmitic acid in the intestinal fatty-acid-binding protein. These initial applications reveal the potential of this new technique in mapping millisecond-time-scale processes. The computational cost associated with the exploration is significantly less than that of conventional MD simulations.

  9. Constrained Optimization Based on Hybrid Evolutionary Algorithm and Adaptive Constraint-Handling Technique

    DEFF Research Database (Denmark)

    Wang, Yong; Cai, Zixing; Zhou, Yuren

    2009-01-01

    A novel approach to deal with numerical and engineering constrained optimization problems, which incorporates a hybrid evolutionary algorithm and an adaptive constraint-handling technique, is presented in this paper. The hybrid evolutionary algorithm simultaneously uses simplex crossover and two...... mutation operators to generate the offspring population. Additionally, the adaptive constraint-handling technique consists of three main situations. In detail, at each situation, one constraint-handling mechanism is designed based on current population state. Experiments on 13 benchmark test functions...... and four well-known constrained design problems verify the effectiveness and efficiency of the proposed method. The experimental results show that integrating the hybrid evolutionary algorithm with the adaptive constraint-handling technique is beneficial, and the proposed method achieves competitive...

  10. A Review of the Piezoelectric Electromechanical Impedance Based Structural Health Monitoring Technique for Engineering Structures

    Directory of Open Access Journals (Sweden)

    Wongi S. Na

    2018-04-01

    Full Text Available The birth of smart materials such as piezoelectric (PZT) transducers has helped revolutionize the field of structural health monitoring (SHM) based on non-destructive testing (NDT) methods. While a relatively new NDT method known as the electromechanical impedance (EMI) technique has been investigated for more than two decades, there are still various problems that must be solved before it is applied to real structures. The technique, which has significant potential to contribute to the creation of one of the most effective SHM systems, involves the use of a single PZT for both exciting and sensing the host structure. In this paper, studies from the past decade related to the EMI technique are reviewed to understand its trends. In addition, new concepts and ideas proposed by various authors are surveyed, and the paper concludes with a discussion of potential directions for future work.

  11. A new slit lamp-based technique for anterior chamber angle estimation.

    Science.gov (United States)

    Gispets, Joan; Cardona, Genís; Tomàs, Núria; Fusté, Cèlia; Binns, Alison; Fortes, Miguel A

    2014-06-01

    To design and test a new noninvasive method for anterior chamber angle (ACA) estimation based on the slit lamp that is accessible to all eye-care professionals. A new technique (slit lamp anterior chamber estimation [SLACE]) that aims to overcome some of the limitations of the van Herick procedure was designed. The technique, which only requires a slit lamp, was applied to estimate the ACA of 50 participants (100 eyes) using two different slit lamp models, and results were compared with gonioscopy as the clinical standard. ACA values determined by SLACE showed a strong, statistically significant Spearman correlation (0.81) with gonioscopy grading (Spaeth classification). The SLACE technique, when compared with gonioscopy, displayed good accuracy in the detection of narrow angles, and it may be useful for eye-care clinicians without access to expensive alternative equipment or those who cannot perform gonioscopy because of legal constraints regarding the use of diagnostic drugs.

  12. Technique Based on Image Pyramid and Bayes Rule for Noise Reduction in Unsupervised Change Detection

    Institute of Scientific and Technical Information of China (English)

    LI Zhi-qiang; HUO hong; FANG Tao; ZHU Ju-lian; GE Wei-li

    2009-01-01

    In this paper, a technique based on an image pyramid and Bayes' rule for reducing noise effects in unsupervised change detection is proposed. By using a Gaussian pyramid to process each of the two multitemporal images, two image pyramids are constructed. The difference pyramid images are obtained by point-by-point subtraction between the same-level images of the two pyramids. By resizing all difference pyramid images to the size of the original multitemporal images and then applying a product operator among them, a map similar to the difference image is obtained. The difference image is generated by point-by-point subtraction between the two multitemporal images directly. Finally, Bayes' rule is used to distinguish the changed pixels. Both synthetic and real data sets are used to evaluate the performance of the proposed technique. Experimental results show that the map from the proposed technique is more robust to noise than the difference image.
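
    A compact sketch of the pyramid-product-plus-Bayes idea is given below in NumPy. Block-mean downsampling stands in for the Gaussian pyramid, and the two-class Gaussian likelihoods for Bayes' rule are initialized from a coarse mean threshold; both choices are assumptions of this sketch rather than details taken from the paper.

```python
import numpy as np

def _downsample(img):
    """2x2 block-mean downsampling (stand-in for a Gaussian pyramid level)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def _upsample_to(img, shape):
    """Nearest-neighbour resize back to the original shape."""
    ys = (np.arange(shape[0]) * img.shape[0] / shape[0]).astype(int)
    xs = (np.arange(shape[1]) * img.shape[1] / shape[1]).astype(int)
    return img[np.ix_(ys, xs)]

def change_map(img1, img2, levels=3):
    """Multiply same-level difference images (resized to the original size),
    then classify pixels with a simple two-class Bayes decision rule.
    """
    a, b = np.asarray(img1, float), np.asarray(img2, float)
    prod = np.ones_like(a)
    for _ in range(levels):
        prod *= _upsample_to(np.abs(a - b), np.asarray(img1).shape)
        a, b = _downsample(a), _downsample(b)

    # Rough split into "unchanged" / "changed" pixels to estimate class models.
    t0 = prod.mean()
    unchanged, changed = prod[prod <= t0], prod[prod > t0]
    if changed.size == 0 or unchanged.size == 0:
        return np.zeros(prod.shape, bool)

    def gauss(x, m, s):
        return np.exp(-0.5 * ((x - m) / (s + 1e-9)) ** 2) / (s + 1e-9)

    p_u = unchanged.size / prod.size
    p_c = 1.0 - p_u
    post_c = p_c * gauss(prod, changed.mean(), changed.std())
    post_u = p_u * gauss(prod, unchanged.mean(), unchanged.std())
    return post_c > post_u        # Bayes decision: True where change is detected
```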

  13. A NEW TECHNIQUE BASED ON CHAOTIC STEGANOGRAPHY AND ENCRYPTION TEXT IN DCT DOMAIN FOR COLOR IMAGE

    Directory of Open Access Journals (Sweden)

    MELAD J. SAEED

    2013-10-01

    Full Text Available Image steganography is the art of hiding information in a cover image. This paper presents a new technique based on chaotic steganography and text encryption in the DCT domain for color images, where the DCT is used to transform the original (cover) image from the spatial domain to the frequency domain. The technique uses a chaotic function in two phases: first, to encrypt the secret message; second, to embed it in the DCT coefficients of the cover image. With this new technique, good results are obtained by satisfying the important properties of steganography: imperceptibility, assessed through the mean square error (MSE), peak signal-to-noise ratio (PSNR) and normalized correlation (NC); and capacity, improved by encoding the secret message characters with variable-length codes and embedding the secret message in only one level of the color image.

  14. Wind Turbine Rotor Simulation via CFD Based Actuator Disc Technique Compared to Detailed Measurement

    Directory of Open Access Journals (Sweden)

    Esmail Mahmoodi

    2015-10-01

    Full Text Available In this paper, a generalized Actuator Disc (AD) is used to model the wind turbine rotor of the MEXICO experiment, a collaborative European wind turbine project. The AD model, implemented as a combination of the CFD technique and User Defined Function (UDF) codes (the so-called UDF/AD model), is used to simulate the loads and performance of the rotor in three different wind-speed tests. The distributed force on the blade and the thrust and power production of the rotor, which are important design parameters of wind turbine rotors, are the focus of the modeling. Results from a Blade Element Momentum (BEM) code and from a full rotor simulation, both taken from the literature, are included for comparison and discussion. The output of all techniques is compared to detailed measurements for validation, which leads to the final conclusions.

  15. A study on laser-based ultrasonic technique by the use of guided wave tomographic imaging

    Energy Technology Data Exchange (ETDEWEB)

    Park, Junpil, E-mail: jpp@pusan.ac.kr; Lim, Juyoung, E-mail: jpp@pusan.ac.kr [Graduate school, School of Mechanical Engineering, Pusan National University (Korea, Republic of); Cho, Younho [School of Mechanical Engineering, Pusan National University (Korea, Republic of); Krishnaswamy, Sridhar [Center for Quality Engineering and Failure Prevention, Northwestern University, Evanston, IL (United States)

    2015-03-31

    Guided wave tests are impractical for investigating specimens with limited accessibility, coarse surfaces or geometrically complicated features. A non-contact setup with a laser ultrasonic transmitter and receiver is therefore an attractive option for guided wave inspection. The present work was carried out to develop a non-contact guided-wave tomography technique based on laser ultrasonics for plate-like structures. A method for Lamb wave generation and detection in an aluminum plate with a pulsed-laser ultrasonic transmitter and a Michelson interferometer receiver has been developed. In the images obtained by laser scanning, the defect shape and area showed good agreement with the actual defect. The proposed approach can be used as a non-contact online inspection and monitoring technique.

  16. Kidnapping Detection and Recognition in Previous Unknown Environment

    Directory of Open Access Journals (Sweden)

    Yang Tian

    2017-01-01

    Full Text Available An unnoticed event referred to as kidnapping makes the localization estimate incorrect. In a previously unknown environment, an incorrect localization result caused by kidnapping leads to an incorrect mapping result in Simultaneous Localization and Mapping (SLAM). In this situation, the explored and unexplored areas become separated, which makes kidnapping recovery difficult. To provide sufficient information on kidnapping, a framework to judge whether kidnapping has occurred and to identify the type of kidnapping with filter-based SLAM is proposed. The framework, called double kidnapping detection and recognition (DKDR), performs two checks before and after the "update" process with different metrics in real time. To explain one of the principles of DKDR, we describe a property of filter-based SLAM that corrects the mapping result of the environment using the current observations after the "update" process. Two classical filter-based SLAM algorithms, Extended Kalman Filter (EKF) SLAM and Particle Filter (PF) SLAM, are modified to show that DKDR can be simply and widely applied to existing filter-based SLAM algorithms. Furthermore, a technique to determine adaptive thresholds for the metrics in real time without previous data is presented. Both simulated and experimental results demonstrate the validity and accuracy of the proposed method.

  17. Cardiac-driven Pulsatile Motion of Intracranial Cerebrospinal Fluid Visualized Based on a Correlation Mapping Technique.

    Science.gov (United States)

    Yatsushiro, Satoshi; Sunohara, Saeko; Hayashi, Naokazu; Hirayama, Akihiro; Matsumae, Mitsunori; Atsumi, Hideki; Kuroda, Kagayaki

    2018-04-10

    A correlation mapping technique delineating delay time and maximum correlation for characterizing pulsatile cerebrospinal fluid (CSF) propagation was proposed. After proving its technical concept, the technique was applied to healthy volunteers and idiopathic normal pressure hydrocephalus (iNPH) patients. A time-resolved three-dimensional phase contrast (3D-PC) acquisition sampled the cardiac-driven CSF velocity at 32 temporal points per cardiac cycle at each spatial location using retrospective cardiac gating. The proposed technique visualized distributions of the propagation delay and correlation coefficient of the PC-based CSF velocity waveform with reference to a waveform at a particular point in the CSF space. The delay time was obtained as the time-shift giving the maximum correlation between the velocity waveform at an arbitrary location and that at the reference location. The validity and accuracy of the technique were confirmed in a flow phantom equipped with a cardiovascular pump. The technique was then applied to evaluate intracranial CSF motion in young, healthy (N = 13) and elderly, healthy (N = 13) volunteers and iNPH patients (N = 13). The phantom study demonstrated that the root mean square error of the delay time was 2.27%, which was less than the temporal resolution of the PC measurement used in this study (3.13% of a cardiac cycle). The human studies showed a significant difference in the correlation coefficient between the young, healthy group and the other two groups, and a significant difference in the correlation coefficients within the intracranial CSF space among all groups. The results suggest that the CSF space compliance of iNPH patients was lower than that of healthy volunteers. The correlation mapping technique allowed us to visualize pulsatile CSF velocity wave propagation as still images. The technique may help to classify diseases related to CSF dynamics, such as iNPH.
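
    The delay-time and maximum-correlation maps described here can be illustrated with a small NumPy routine that circularly shifts each voxel's velocity waveform against the reference waveform and records the shift giving the highest correlation. The sign convention and the wrapping of the delay to half a cardiac cycle are assumptions of this sketch, not details of the published method.

```python
import numpy as np

def correlation_map(velocity_waveforms, ref_index):
    """Per-location delay time and maximum correlation relative to a reference
    waveform.

    velocity_waveforms: array (n_locations, n_phases) of PC velocity waveforms
    sampled over one cardiac cycle; ref_index: index of the reference location.
    """
    v = np.asarray(velocity_waveforms, float)
    v = v - v.mean(axis=1, keepdims=True)
    ref = v[ref_index]
    n = v.shape[1]

    delays = np.zeros(v.shape[0])
    max_corr = np.zeros(v.shape[0])
    for i, w in enumerate(v):
        # Circular shifts are appropriate because the waveform is periodic
        # over the cardiac cycle (32 phases in the record).
        corrs = [np.corrcoef(np.roll(w, -s), ref)[0, 1] for s in range(n)]
        s_best = int(np.argmax(corrs))
        max_corr[i] = corrs[s_best]
        # Report delay in fractions of a cardiac cycle, wrapped to [-0.5, 0.5).
        delays[i] = ((s_best / n + 0.5) % 1.0) - 0.5
    return delays, max_corr
```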

  18. A preclustering-based ensemble learning technique for acute appendicitis diagnoses.

    Science.gov (United States)

    Lee, Yen-Hsien; Hu, Paul Jen-Hwa; Cheng, Tsang-Hsiang; Huang, Te-Chia; Chuang, Wei-Yao

    2013-06-01

    Acute appendicitis is a common medical condition, whose effective, timely diagnosis can be difficult. A missed diagnosis not only puts the patient in danger but also requires additional resources for corrective treatments. An acute appendicitis diagnosis constitutes a classification problem, for which a further fundamental challenge pertains to the skewed outcome class distribution of instances in the training sample. A preclustering-based ensemble learning (PEL) technique aims to address the associated imbalanced sample learning problems and thereby support the timely, accurate diagnosis of acute appendicitis. The proposed PEL technique employs undersampling to reduce the number of majority-class instances in a training sample, uses preclustering to group similar majority-class instances into multiple groups, and selects from each group representative instances to create more balanced samples. The PEL technique thereby reduces potential information loss from random undersampling. It also takes advantage of ensemble learning to improve performance. We empirically evaluate this proposed technique with 574 clinical cases obtained from a comprehensive tertiary hospital in southern Taiwan, using several prevalent techniques and a salient scoring system as benchmarks. The comparative results show that PEL is more effective and less biased than any benchmarks. The proposed PEL technique seems more sensitive to identifying positive acute appendicitis than the commonly used Alvarado scoring system and exhibits higher specificity in identifying negative acute appendicitis. In addition, the sensitivity and specificity values of PEL appear higher than those of the investigated benchmarks that follow the resampling approach. Our analysis suggests PEL benefits from the more representative majority-class instances in the training sample. According to our overall evaluation results, PEL records the best overall performance, and its area under the curve measure reaches 0.619. The
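
    A rough sketch of the preclustering-based ensemble idea follows, using scikit-learn. The majority class is clustered with k-means, representative majority instances are drawn from every cluster to build balanced training samples, and one base learner is trained per sample with a majority vote at prediction time. The base learner, cluster count and sampling sizes are illustrative assumptions, not the configuration evaluated in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

def pel_fit(X, y, n_clusters=5, n_members=5, random_state=0):
    """Preclustering-based ensemble sketch. X: feature matrix;
    y: integer class labels (e.g. 0/1)."""
    rng = np.random.RandomState(random_state)
    maj_label = np.bincount(y).argmax()
    X_maj, X_min = X[y == maj_label], X[y != maj_label]
    min_label = y[y != maj_label][0]

    # Group similar majority-class instances to reduce information loss
    # compared with purely random undersampling.
    clusters = KMeans(n_clusters=n_clusters, n_init=10,
                      random_state=random_state).fit_predict(X_maj)
    per_cluster = max(1, len(X_min) // n_clusters)

    members = []
    for _ in range(n_members):
        keep = []
        for c in range(n_clusters):
            idx = np.where(clusters == c)[0]
            keep.extend(rng.choice(idx, size=min(per_cluster, idx.size),
                                   replace=False))
        Xb = np.vstack([X_maj[keep], X_min])
        yb = np.concatenate([np.full(len(keep), maj_label),
                             np.full(len(X_min), min_label)])
        members.append(DecisionTreeClassifier(random_state=random_state).fit(Xb, yb))
    return members

def pel_predict(members, X):
    """Majority vote over the ensemble members."""
    votes = np.stack([m.predict(X) for m in members])
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```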

  19. A Novel Technique for Shape Feature Extraction Using Content Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    Dhanoa Jaspreet Singh

    2016-01-01

    Full Text Available With the advent of technology and multimedia information, the number of digital images is increasing very quickly. Various techniques are being developed to retrieve or search the digital information or data contained in images. Traditional text-based image retrieval systems fall short, since they are time-consuming, require manual image annotation, and the annotation differs from person to person. An alternative is the Content Based Image Retrieval (CBIR) system, which retrieves or searches for an image using its contents rather than text, keywords, etc. A great deal of exploration has been carried out in the area of CBIR with various feature extraction techniques. Shape is a significant image feature as it reflects human perception. Moreover, shape is quite simple for the user to use to define an object in an image compared with other features such as color and texture. However, no descriptor gives fruitful results if applied alone; by combining it with an improved classifier, one can use the positive features of both the descriptor and the classifier. Therefore, an attempt is made to establish an algorithm for accurate shape feature extraction in CBIR. The main objectives of this project are: (a) to propose an algorithm for shape feature extraction using CBIR, (b) to evaluate the performance of the proposed algorithm and (c) to compare the proposed algorithm with state-of-the-art techniques.

  20. Biosensor-based microRNA detection: techniques, design, performance, and challenges.

    Science.gov (United States)

    Johnson, Blake N; Mutharasan, Raj

    2014-04-07

    The current state of biosensor-based techniques for amplification-free microRNA (miRNA) detection is critically reviewed. Comparison with non-sensor and amplification-based molecular techniques (MTs), such as polymerase-based methods, is made in terms of transduction mechanism, associated protocol, and sensitivity. Challenges associated with miRNA hybridization thermodynamics which affect assay selectivity and amplification bias are briefly discussed. Electrochemical, electromechanical, and optical classes of miRNA biosensors are reviewed in terms of transduction mechanism, limit of detection (LOD), time-to-results (TTR), multiplexing potential, and measurement robustness. Current trends suggest that biosensor-based techniques (BTs) for miRNA assay will complement MTs due to the advantages of amplification-free detection, LOD being femtomolar (fM)-attomolar (aM), short TTR, multiplexing capability, and minimal sample preparation requirement. Areas of future importance in miRNA BT development are presented which include focus on achieving high measurement confidence and multiplexing capabilities.

  1. PERFORMANCE ANALYSIS OF PILOT BASED CHANNEL ESTIMATION TECHNIQUES IN MB OFDM SYSTEMS

    Directory of Open Access Journals (Sweden)

    M. Madheswaran

    2011-12-01

    Full Text Available Ultra wideband (UWB) communication is mainly used for short-range communication in wireless personal area networks. Orthogonal Frequency Division Multiplexing (OFDM) is being used as a key physical-layer technology for Fourth Generation (4G) wireless communication. OFDM-based communication gives high spectral efficiency and mitigates Inter-Symbol Interference (ISI) in a wireless medium. In this paper the IEEE 802.15.3a based Multiband OFDM (MB OFDM) system is considered. Pilot-based channel estimation techniques are considered to analyze the performance of MB OFDM systems over Linear Time Invariant (LTI) channel models. Pilot-based Least Squares (LS) and Linear Minimum Mean Square Error (LMMSE) channel estimation techniques have been considered for the UWB OFDM system. In the proposed method, the estimated Channel Impulse Responses (CIRs) are filtered in the time domain to account for the channel delay spread. The performance of the proposed system has also been analyzed for different modulation techniques and various pilot density patterns.
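
    The two pilot-based estimators named in this record can be illustrated in a few lines of NumPy: the LS estimate divides the received pilots by the known pilot symbols, and a textbook LMMSE smoother then filters the LS estimate using an assumed channel correlation matrix. The correlation model, SNR and pilot layout below are placeholders, not the paper's simulation settings.

```python
import numpy as np

def ls_estimate(y_pilots, x_pilots):
    """Least-squares channel estimate at the pilot subcarriers: H_LS = Y / X."""
    return y_pilots / x_pilots

def lmmse_estimate(h_ls, r_hh, snr_linear):
    """Simplified LMMSE smoothing of the LS estimate:
        H_LMMSE = R_hh (R_hh + (1/SNR) I)^(-1) H_LS
    Assumes unit-power pilot symbols; a textbook form used only for illustration.
    """
    n = r_hh.shape[0]
    w = r_hh @ np.linalg.inv(r_hh + np.eye(n) / snr_linear)
    return w @ h_ls

# Hypothetical example: 8 pilot subcarriers, exponential channel correlation.
n_p = 8
noise_var = 0.02
x_p = np.ones(n_p, complex)                       # known unit-power pilots
h_true = (np.random.randn(n_p) + 1j * np.random.randn(n_p)) / np.sqrt(2)
noise = np.sqrt(noise_var / 2) * (np.random.randn(n_p) + 1j * np.random.randn(n_p))
y_p = h_true * x_p + noise

r_hh = np.array([[0.9 ** abs(i - j) for j in range(n_p)] for i in range(n_p)])
h_ls = ls_estimate(y_p, x_p)
h_lmmse = lmmse_estimate(h_ls, r_hh, snr_linear=1.0 / noise_var)
print(np.abs(h_true - h_ls).mean(), np.abs(h_true - h_lmmse).mean())
```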

  2. Weighted Least Squares Techniques for Improved Received Signal Strength Based Localization

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2011-09-01

    Full Text Available The practical deployment of wireless positioning systems requires minimizing the calibration procedures while improving the location estimation accuracy. Received Signal Strength localization techniques using propagation channel models are the simplest alternative, but they are usually designed under the assumption that the radio propagation model is perfectly characterized a priori. In practice, this assumption does not hold, and the localization results are affected by the inaccuracies of the theoretical, roughly calibrated or just imperfect channel models used to compute location. In this paper, we propose the use of weighted multilateration techniques to gain robustness with respect to these inaccuracies, reducing the dependence on having an optimal channel model. In particular, we propose two weighted least squares techniques, based on the standard hyperbolic and circular positioning algorithms, that specifically consider the accuracies of the different measurements to obtain a better estimation of the position. These techniques are compared to the standard hyperbolic and circular positioning techniques through both numerical simulations and an exhaustive set of real experiments on different types of wireless networks (a wireless sensor network, a WiFi network and a Bluetooth network). The algorithms not only produce better localization results with a very limited overhead in terms of computational cost but also achieve a greater robustness to inaccuracies in channel modeling.
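
    As an illustration of the weighted circular positioning discussed here, the NumPy sketch below converts RSS readings to ranges with a log-distance path-loss model and solves a linearized weighted least-squares multilateration problem. The path-loss parameters and the inverse-squared-range weights are assumptions of this sketch, not the weighting derived in the paper.

```python
import numpy as np

def rss_to_range(rss_dbm, p0_dbm=-40.0, n=2.5, d0=1.0):
    """Log-distance path-loss inversion; p0, n and d0 are assumed model values."""
    return d0 * 10.0 ** ((p0_dbm - rss_dbm) / (10.0 * n))

def wls_position(anchors, ranges, weights):
    """Weighted least-squares multilateration, linearized about anchor 0."""
    anchors = np.asarray(anchors, float)
    x0, y0 = anchors[0]
    d0 = ranges[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], ranges[1:]):
        A.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b.append(d0 ** 2 - di ** 2 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2)
    A, b = np.array(A), np.array(b)
    W = np.diag(weights[1:])
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

# Hypothetical four-anchor example with noisy ranges.
anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
true_pos = np.array([3.0, 4.0])
ranges = [np.linalg.norm(true_pos - np.array(a)) + np.random.normal(0, 0.2)
          for a in anchors]
weights = [1.0 / max(r, 1e-6) ** 2 for r in ranges]   # assumed weighting rule
print(wls_position(anchors, ranges, weights))
```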

  3. Bit Plane Coding based Steganography Technique for JPEG2000 Images and Videos

    Directory of Open Access Journals (Sweden)

    Geeta Kasana

    2016-02-01

    Full Text Available In this paper, a Bit Plane Coding (BPC) based steganography technique for JPEG2000 images and Motion JPEG2000 video is proposed. Embedding in this technique is performed in the least significant bit planes of the wavelet coefficients of a cover image. In the JPEG2000 standard, the number of bit planes of the wavelet coefficients used in encoding depends on the compression rate and is handled in the Tier-2 process. In the proposed technique, the Tier-1 and Tier-2 processes of JPEG2000 and Motion JPEG2000 are executed twice on the encoder side to collect information about the lowest bit planes of all code blocks of a cover image, which is utilized in embedding and transmitted to the decoder. After embedding the secret data, the Optimal Pixel Adjustment Process (OPAP) is applied to the stego images to enhance their visual quality. Experimental results show that the proposed technique provides larger embedding capacity and better visual quality of stego images than existing steganography techniques for JPEG2000 compressed images and videos. The extracted secret image is similar to the original secret image.
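
    The embedding and OPAP steps can be illustrated on a plain integer array as follows (Python/NumPy). The sketch performs k-LSB substitution and then the standard OPAP correction of shifting a value by 2^k when the substitution error exceeds 2^(k-1); integration with the JPEG2000 Tier-1/Tier-2 coding passes described in the record is not shown.

```python
import numpy as np

def embed_lsb_opap(coeffs, secret_bits, k=2):
    """Embed k bits per coefficient in the k least-significant bit planes,
    then apply OPAP. Range clipping for bounded sample values (e.g. 8-bit
    pixels) is omitted in this sketch.
    """
    c = np.asarray(coeffs, dtype=np.int64).copy()
    flat = c.ravel()
    for i in range(0, min(len(secret_bits) // k, flat.size) * k, k):
        idx = i // k
        chunk = int(''.join(str(b) for b in secret_bits[i:i + k]), 2)
        original = int(flat[idx])
        stego = (original & ~((1 << k) - 1)) | chunk   # plain k-LSB substitution
        err = stego - original
        if err > (1 << (k - 1)) and stego - (1 << k) >= 0:
            stego -= 1 << k                            # OPAP downward correction
        elif err < -(1 << (k - 1)):
            stego += 1 << k                            # OPAP upward correction
        flat[idx] = stego
    return c

def extract_lsb(coeffs, n_bits, k=2):
    """Recover the embedded bits from the k lowest bit planes."""
    flat = np.asarray(coeffs, dtype=np.int64).ravel()
    bits = []
    for idx in range(n_bits // k):
        bits.extend(int(b) for b in format(int(flat[idx]) & ((1 << k) - 1), f'0{k}b'))
    return bits

cover = np.random.randint(0, 256, size=(4, 4))
secret = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb_opap(cover, secret, k=2)
assert extract_lsb(stego, len(secret), k=2) == secret
```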

  4. Innovative Hyperspectral Imaging-Based Techniques for Quality Evaluation of Fruits and Vegetables: A Review

    Directory of Open Access Journals (Sweden)

    Yuzhen Lu

    2017-02-01

    Full Text Available New, non-destructive sensing techniques for fast and more effective quality assessment of fruits and vegetables are needed to meet the ever-increasing consumer demand for better, more consistent and safer food products. Over the past 15 years, hyperspectral imaging has emerged as a new generation of sensing technology for non-destructive food quality and safety evaluation, because it integrates the major features of imaging and spectroscopy, thus enabling the acquisition of both spectral and spatial information from an object simultaneously. This paper first provides a brief overview of hyperspectral imaging configurations and common sensing modes used for food quality and safety evaluation. The paper is, however, focused on the three innovative hyperspectral imaging-based techniques or sensing platforms, i.e., spectral scattering, integrated reflectance and transmittance, and spatially-resolved spectroscopy, which have been developed in our laboratory for property and quality evaluation of fruits, vegetables and other food products. The basic principle and instrumentation of each technique are described, followed by the mathematical methods for processing and extracting critical information from the acquired data. Applications of these techniques for property and quality evaluation of fruits and vegetables are then presented. Finally, concluding remarks are given on future research needs to move forward these hyperspectral imaging techniques.

  5. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    Science.gov (United States)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

    With the upsurge in financial accounting fraud experienced in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research and industry. The failure of organizations' internal auditing systems to identify accounting fraud has led to the use of specialized procedures to detect it, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since dealing with the large volumes and complexity of financial data is a big challenge for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques for the detection of financial accounting fraud and proposes a framework for data mining based accounting fraud detection. This systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of the review show that data mining techniques such as logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.

  6. Reliability Assessment of Wind Farm Electrical System Based on a Probability Transfer Technique

    Directory of Open Access Journals (Sweden)

    Hejun Yang

    2018-03-01

    Full Text Available The electrical system of a wind farm has a significant influence on the wind farm reliability and electrical energy yield. The disconnect switches installed in an electrical system can not only improve the operating flexibility, but also enhance the reliability of a wind farm. Therefore, this paper develops a probabilistic transfer technique for integrating the electrical topology structure, the isolation operation of disconnect switches, and the stochastic failure of electrical equipment into the reliability assessment of the wind farm electrical system. Firstly, since the traditional two-state reliability model of electrical equipment cannot capture the isolation operation, the paper develops a three-state reliability model to replace the two-state model and incorporate the isolation operation. In addition, a proportion apportion technique is presented to evaluate the state probabilities. Secondly, this paper develops a probabilistic transfer technique based on the idea of transferring the unreliability of the electrical system to the energy transmission interruption of the wind turbine generators (WTGs). Finally, some novel indices for describing the reliability of the wind farm electrical system are designed, and the variance coefficient of the designed indices is used as a convergence criterion to determine the termination of the assessment process. The proposed technique is applied to the reliability assessment of a wind farm with different topologies. The simulation results show that the proposed techniques are effective in practical applications.
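
    The variance coefficient convergence criterion mentioned above can be sketched with a sequential Monte Carlo loop. The three-state probabilities, the energy-loss values and the single-component model below are invented placeholders, not the paper's topology-transfer formulation; only the stopping rule is illustrated.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical three-state model of one piece of equipment:
        # 0 = up, 1 = failed but isolated by its disconnect switch, 2 = failed and not isolated.
        STATE_PROBS = np.array([0.96, 0.03, 0.01])   # assumed state probabilities
        ENERGY_LOSS = np.array([0.0, 0.2, 1.0])      # assumed p.u. energy-transmission loss per state

        def simulate_year():
            """Sample one yearly scenario and return the p.u. energy not transmitted."""
            return ENERGY_LOSS[rng.choice(3, p=STATE_PROBS)]

        def monte_carlo(tol=0.05, min_samples=1000, max_samples=200000):
            """Add samples until the variance coefficient of the estimated index drops below tol."""
            samples = []
            while len(samples) < max_samples:
                samples.append(simulate_year())
                n = len(samples)
                if n >= min_samples:
                    mean = np.mean(samples)
                    cov = np.std(samples, ddof=1) / (mean * np.sqrt(n)) if mean > 0 else np.inf
                    if cov < tol:
                        break
            return np.mean(samples), len(samples)

        index, n_used = monte_carlo()
        print(f"expected energy-loss index ~ {index:.4f} p.u. after {n_used} samples")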

  7. Stable adaptive PI control for permanent magnet synchronous motor drive based on improved JITL technique.

    Science.gov (United States)

    Zheng, Shiqi; Tang, Xiaoqi; Song, Bao; Lu, Shaowu; Ye, Bosheng

    2013-07-01

    In this paper, a stable adaptive PI control strategy based on an improved just-in-time learning (IJITL) technique is proposed for the permanent magnet synchronous motor (PMSM) drive. Firstly, the traditional JITL technique is improved. The new IJITL technique has a lower computational burden than traditional JITL and is better suited to online identification of the PMSM drive system, which has stringent real-time requirements. In this way, the PMSM drive system is identified by the IJITL technique, which provides information to an adaptive PI controller. Secondly, the adaptive PI controller is designed in the discrete time domain and is composed of a PI controller and a supervisory controller. The PI controller automatically tunes the control gains online based on the gradient descent method, and the supervisory controller is developed to eliminate the effect of the approximation error introduced by the PI controller on the system stability in the Lyapunov sense. Finally, experimental results on the PMSM drive system show accurate identification and favorable tracking performance. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
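
    A minimal sketch of the gradient-descent gain adaptation idea is given below, using an assumed first-order plant in place of the JITL-identified PMSM model and omitting the supervisory controller; it illustrates the tuning rule only, not the authors' controller.

        import numpy as np

        # Assumed first-order plant y[k+1] = a*y[k] + b*u[k] (a stand-in, not the PMSM model)
        a, b = 0.9, 0.1
        kp, ki = 0.5, 0.1      # initial PI gains
        eta = 0.01             # gradient-descent learning rate
        y, integ, ref = 0.0, 0.0, 1.0

        for k in range(200):
            e = ref - y
            integ += e
            u = kp * e + ki * integ      # discrete PI control law
            y = a * y + b * u            # plant update
            # Gradient descent on J = 0.5*e^2 with the plant sensitivity dy/du ~ b
            # (in the paper this sensitivity would come from the online JITL model):
            # dJ/dkp = -e*b*e and dJ/dki = -e*b*integ, so step opposite the gradient.
            kp = max(kp + eta * b * e * e, 0.0)
            ki = max(ki + eta * b * e * integ, 0.0)

        print(f"final gains kp={kp:.3f}, ki={ki:.3f}, remaining error={ref - y:.4f}")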

  8. Development of accelerator-based γ-ray-induced positron annihilation spectroscopy technique

    International Nuclear Information System (INIS)

    Selim, F.A.; Wells, D.P.; Harmon, J. F.; Williams, J.

    2005-01-01

    Accelerator-based γ-ray-induced positron annihilation spectroscopy performs positron annihilation spectroscopy by utilizing MeV bremsstrahlung radiation generated from an accelerator (We have named the technique 'accelerator-based γ-ray-induced PAS', even though 'bremsstrahlung' is more correct here than 'γ rays'. The reason for that is to make the name of the technique more general, since PAS may be performed by utilizing MeV γ rays emitted from nuclei through the use of accelerators as described later in this article and as in the case of positron lifetime spectroscopy [F.A. Selim, D.P. Wells, and J.F. Harmon, Rev. Sci. Instrum. 76, 033905 (2005)].) instead of using positrons from radioactive sources or positron beams. MeV γ rays create positrons inside the materials by pair production. The induced positrons annihilate with the material electrons emitting a 511-keV annihilation radiation. Doppler broadening spectroscopy of the 511-keV radiation provides information about open-volume defects and plastic deformation in solids. The high penetration of MeV γ rays allows probing of defects at high depths in thick materials up to several centimeters, which is not possible with most of the current nondestructive techniques. In this article, a detailed description of the technique will be presented, including its benefits and limitations relative to the other nondestructive methods. Its application on the investigation of plastic deformation in thick steel alloys will be shown

  9. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), for analyzing groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor groundwater potential zone, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. The validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which indicates that the SI method performs slightly better than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
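
    The statistical index (SI) weight of a class is commonly computed as the logarithm of the ratio between the spring density in that class and the overall spring density; the sketch below assumes this standard definition and uses invented class counts rather than the study's data.

        import numpy as np

        def statistical_index(spring_count, pixel_count):
            """SI weight per class: W_i = ln((springs_i/pixels_i) / (total springs/total pixels)).
            Classes with no springs get a large negative weight instead of -inf."""
            spring_count = np.asarray(spring_count, dtype=float)
            pixel_count = np.asarray(pixel_count, dtype=float)
            overall_density = spring_count.sum() / pixel_count.sum()
            with np.errstate(divide="ignore"):
                w = np.log((spring_count / pixel_count) / overall_density)
            return np.where(np.isfinite(w), w, -10.0)

        # Invented classes of one conditioning factor (e.g. slope bins)
        springs_per_class = [120, 200, 150, 26, 0]
        pixels_per_class = [50000, 80000, 90000, 60000, 20000]
        print(statistical_index(springs_per_class, pixels_per_class))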

  10. Evaluation of irradiation damage effect by applying electric properties based techniques

    International Nuclear Information System (INIS)

    Acosta, B.; Sevini, F.

    2004-01-01

    The most important effect of radiation degradation is the decrease in ductility of the ferritic steels of the reactor pressure vessel (RPV). The main way to determine the mechanical behaviour of RPV steels is through tensile and impact tests, from which the ductile-to-brittle transition temperature (DBTT) and its increase due to neutron irradiation can be calculated. These tests are destructive and are regularly applied to surveillance specimens to assess the integrity of the RPV. The possibility of applying validated non-destructive ageing monitoring techniques would however facilitate the surveillance of the materials that form the reactor vessel. The JRC-IE has developed two devices, focused on the measurement of electrical properties, to assess non-destructively the embrittlement state of materials. The first technique, called Seebeck and Thomson Effects on Aged Material (STEAM), is based on the measurement of the Seebeck coefficient, which is characteristic of the material and related to the microstructural changes induced by irradiation embrittlement. With the same aim, the second technique, named Resistivity Effects on Aged Material (REAM), instead measures the resistivity of the material. The purpose of this research is to correlate the results of the impact tests and the STEAM and REAM measurements with the change in mechanical properties due to neutron irradiation. These results will make possible the improvement of such techniques based on the measurement of material electrical properties for their application to irradiation embrittlement assessment

  11. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
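
    The core computation (misery index as inflation plus unemployment, an 11-year trailing average, and a correlation with a literary misery series) can be sketched as follows; the yearly series here are random placeholders rather than the book or economic data used in the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        years = np.arange(1930, 2001)
        inflation = rng.normal(3.0, 2.0, len(years))       # placeholder yearly inflation (%)
        unemployment = rng.normal(6.0, 2.0, len(years))    # placeholder yearly unemployment (%)
        economic_misery = inflation + unemployment         # misery index = inflation + unemployment

        window = 11  # the paper reports the best fit for an 11-year moving average
        trailing_avg = np.convolve(economic_misery, np.ones(window) / window, mode="valid")

        # Placeholder literary misery series, one value per window end-year
        literary_misery = rng.normal(0.0, 1.0, len(trailing_avg))

        r = np.corrcoef(trailing_avg, literary_misery)[0, 1]
        print(f"correlation with the trailing {window}-year economic misery average: {r:.2f}")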

  12. Books Average Previous Decade of Economic Misery

    Science.gov (United States)

    Bentley, R. Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a ‘literary misery index’ derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade. PMID:24416159

  13. A comparison of approximation techniques for variance-based sensitivity analysis of biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-05-01

    Full Text Available Abstract Background Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we have introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species. Results We present four techniques, derivative approximation (DA), polynomial approximation (PA), Gauss-Hermite integration (GHI), and orthonormal Hermite approximation (OHA), for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. By using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in a slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner. It turns out that the computational cost of the
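
    For readers unfamiliar with the Monte Carlo baseline being approximated, the sketch below estimates first-order variance-based sensitivity indices with a standard pick-and-freeze estimator on a toy test function; it is not the DA/PA/GHI/OHA approximations of the paper, and the model is a stand-in rather than the MAPK cascade.

        import numpy as np

        rng = np.random.default_rng(2)

        def model(x):
            """Toy stand-in for a model response (Ishigami function), not the MAPK cascade."""
            return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

        def first_order_indices(f, d, n=100_000):
            """Pick-and-freeze Monte Carlo estimate of the first-order variance-based indices."""
            a = rng.uniform(-np.pi, np.pi, (n, d))
            b = rng.uniform(-np.pi, np.pi, (n, d))
            fa, fb = f(a), f(b)
            var = np.var(np.concatenate([fa, fb]))
            s = np.empty(d)
            for i in range(d):
                b_i = b.copy()
                b_i[:, i] = a[:, i]            # share only coordinate i with sample 'a'
                s[i] = np.mean(fa * (f(b_i) - fb)) / var
            return s

        # Roughly 0.31, 0.44, 0.00 for the Ishigami function; each index needs n extra model
        # evaluations, which is what makes plain Monte Carlo costly for large reaction networks.
        print(first_order_indices(model, d=3))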

  14. Techniques for intergranular crack formation and assessment in alloy 600 base and alloy 182 weld metals

    International Nuclear Information System (INIS)

    Lee, Tae Hyun; Hwang, Il Soon; Kim, Hong Deok; Kim, Ji Hyun

    2015-01-01

    A technique developed to produce artificial intergranular stress corrosion cracks in structural components was applied to thick, forged alloy 600 base and alloy 182 weld metals for use in the qualification of nondestructive examination techniques for welded components in nuclear power plants. An externally controlled procedure was demonstrated to produce intergranular stress corrosion cracks that are comparable to service-induced cracks in both the base and weld metals. During the process of crack generation, an online direct current potential drop method using array probes was used to measure and monitor the sizes and shapes of the cracks. A microstructural characterization of the produced cracks revealed realistic conformation of the crack faces unlike those in machined notches produced by an electrodischarge machine or simple fatigue loading using a universal testing machine. A comparison with a destructive metallographic examination showed that the characteristics, orientations, and sizes of the intergranular cracks produced in this study are highly reproducible.

  15. Enhancement of Twins Fetal ECG Signal Extraction Based on Hybrid Blind Extraction Techniques

    Directory of Open Access Journals (Sweden)

    Ahmed Kareem Abdullah

    2017-07-01

    Full Text Available ECG machines are noninvasive systems used to measure the heartbeat signal. It is very important to monitor the fetal ECG signals during pregnancy to check the heart activity and to detect any problem early, before birth; therefore the monitoring of ECG signals has clinical significance and importance. For the multi-fetal pregnancy case, classical filtering algorithms are not sufficient to separate the ECG signals of the mother and the fetuses. In this paper the mixture consists of three ECG signals: the first signal is the mother ECG (M-ECG) signal, the second signal is the Fetal-1 ECG (F1-ECG), and the third signal is the Fetal-2 ECG (F2-ECG); these signals are extracted based on modified blind source extraction (BSE) techniques. The proposed work is based on the hybridization of two BSE techniques to ensure that the extracted signals are well separated. The results demonstrate that the proposed approach extracts the useful ECG signals very efficiently.
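
    As a stand-in for the hybrid blind source extraction used in the paper, the sketch below separates a synthetic three-source mixture with FastICA from scikit-learn; the waveforms and mixing matrix are invented and are not real ECG recordings.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(3)
        t = np.linspace(0, 10, 4000)

        # Synthetic stand-ins for the maternal and two fetal traces (not real ECG morphology)
        m_ecg = np.sign(np.sin(2 * np.pi * 1.2 * t))          # ~72 bpm "mother"
        f1_ecg = np.sign(np.sin(2 * np.pi * 2.3 * t + 0.5))   # ~138 bpm "fetus 1"
        f2_ecg = np.sign(np.sin(2 * np.pi * 2.6 * t + 1.0))   # ~156 bpm "fetus 2"
        sources = np.c_[m_ecg, f1_ecg, f2_ecg] + 0.05 * rng.standard_normal((len(t), 3))

        # Observed channels = unknown linear mixture of the three sources (invented mixing matrix)
        mixing = np.array([[1.0, 0.6, 0.5],
                           [0.7, 1.0, 0.4],
                           [0.5, 0.3, 1.0]])
        observations = sources @ mixing.T

        estimated = FastICA(n_components=3, random_state=0).fit_transform(observations)
        print("recovered source matrix shape:", estimated.shape)  # sources up to order and scale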

  16. Application of learning techniques based on kernel methods for the fault diagnosis in industrial processes

    Directory of Open Access Journals (Sweden)

    Jose M. Bernal-de-Lázaro

    2016-05-01

    Full Text Available This article summarizes the main contributions of the PhD thesis titled "Application of learning techniques based on kernel methods for the fault diagnosis in industrial processes". The thesis focuses on the analysis and design of fault diagnosis systems (DDF) based on historical data. Specifically, the thesis provides: (1) new criteria for adjusting the kernel methods used to select features with a high discriminative capacity for fault diagnosis tasks; (2) a proposed approach to process monitoring using multivariate statistical techniques that incorporates reinforced information concerning the dynamics of the Hotelling's T2 and SPE statistics, whose combination with kernel methods improves the detection of small-magnitude faults; and (3) a robustness index to compare the performance of diagnosis classifiers, taking into account their insensitivity to possible noise and disturbances in the historical data.
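
    The Hotelling's T2 and SPE statistics referred to above are conventionally computed from a PCA model of normal operating data, as in the sketch below; the kernel extensions and control limits of the thesis are not reproduced, and the fault data are simulated.

        import numpy as np

        def t2_spe(train, test, n_components=2):
            """Hotelling's T^2 and SPE (Q) of `test` samples w.r.t. a PCA model fitted on `train`."""
            mu = train.mean(axis=0)
            _, s, vt = np.linalg.svd(train - mu, full_matrices=False)
            p = vt[:n_components].T                              # PCA loadings
            lam = s[:n_components] ** 2 / (len(train) - 1)       # retained eigenvalues
            xt = test - mu
            scores = xt @ p
            t2 = np.sum(scores ** 2 / lam, axis=1)               # Hotelling's T^2
            spe = np.sum((xt - scores @ p.T) ** 2, axis=1)       # SPE (Q) residual statistic
            return t2, spe

        rng = np.random.default_rng(4)
        normal_data = rng.standard_normal((500, 5))
        faulty_data = rng.standard_normal((20, 5)) + np.array([0, 0, 3.0, 0, 0])  # simulated bias fault
        print(t2_spe(normal_data, faulty_data))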

  17. The Effectiveness of Song Technique in Teaching Paper Based TOEFL (PBT’S) Listening Comprehension Section

    Directory of Open Access Journals (Sweden)

    Heri Kuswoyo

    2013-07-01

    Full Text Available Among the three sections of the Paper-Based TOEFL (PBT), many test takers find the listening comprehension section the most difficult. Thus, in this research the researcher aims to explore how students can learn the PBT’s listening comprehension section effectively through the song technique. This is a more interesting and engaging way to learn language because music is a very powerful motivational tool for language learning. To reach the goal of this study, the researcher applied the grammar approach. It is an appropriate approach since the main idea of grammar-based listening exercises is to analyze the language by its components and reconstruct an incomplete text. In addition, the researcher employed an English song as the medium and used the top-down model for the listening process. In this research, the writer shares his experience in teaching listening in the English Department of Teknokrat College by implementing the song technique.

  18. ICF implosion hotspot ion temperature diagnostic techniques based on neutron time-of-flight method

    International Nuclear Information System (INIS)

    Tang Qi; Song Zifeng; Chen Jiabin; Zhan Xiayu

    2013-01-01

    Ion temperature of implosion hotspot is a very important parameter for inertial confinement fusion. It reflects the energy level of the hotspot, and it is very sensitive to implosion symmetry and implosion speed. ICF implosion hotspot ion temperature diagnostic techniques based on neutron time-of-flight method were described. A neutron TOF spectrometer was developed using a ultrafast plastic scintillator as the neutron detector. Time response of the spectrometer has 1.1 ns FWHM and 0.5 ns rising time. TOF spectrum resolving method based on deconvolution and low pass filter was illuminated. Implosion hotspot ion temperature in low neutron yield and low ion temperature condition at Shenguang-Ⅲ facility was acquired using the diagnostic techniques. (authors)

  19. K-means-clustering-based fiber nonlinearity equalization techniques for 64-QAM coherent optical communication system.

    Science.gov (United States)

    Zhang, Junfeng; Chen, Wei; Gao, Mingyi; Shen, Gangxiang

    2017-10-30

    In this work, we proposed two k-means-clustering-based algorithms to mitigate the fiber nonlinearity for 64-quadrature amplitude modulation (64-QAM) signal, the training-sequence assisted k-means algorithm and the blind k-means algorithm. We experimentally demonstrated the proposed k-means-clustering-based fiber nonlinearity mitigation techniques in 75-Gb/s 64-QAM coherent optical communication system. The proposed algorithms have reduced clustering complexity and low data redundancy and they are able to quickly find appropriate initial centroids and select correctly the centroids of the clusters to obtain the global optimal solutions for large k value. We measured the bit-error-ratio (BER) performance of 64-QAM signal with different launched powers into the 50-km single mode fiber and the proposed techniques can greatly mitigate the signal impairments caused by the amplified spontaneous emission noise and the fiber Kerr nonlinearity and improve the BER performance.
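
    The core step of the blind variant can be sketched as clustering the received complex symbols into 64 centroids and deciding each symbol by its centroid, as below; the channel model is a crude placeholder (additive noise plus a power-dependent phase rotation), and k-means++ initialization replaces the centroid-selection strategy of the paper.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)

        # Ideal square 64-QAM constellation
        levels = np.arange(-7, 8, 2)
        ideal = np.array([complex(i, q) for i in levels for q in levels])

        # Synthetic received symbols: noise plus a power-dependent phase rotation as a crude
        # stand-in for fiber nonlinearity (this is not a fiber channel model)
        tx = rng.choice(ideal, 20000)
        rx = tx * np.exp(0.001j * np.abs(tx) ** 2) + 0.4 * (rng.standard_normal(20000)
                                                            + 1j * rng.standard_normal(20000))

        km = KMeans(n_clusters=64, n_init=10, random_state=0).fit(np.c_[rx.real, rx.imag])
        centroids = km.cluster_centers_[:, 0] + 1j * km.cluster_centers_[:, 1]

        # Decide each symbol by its cluster centroid, then map the centroid to the nearest ideal point
        decided = ideal[np.argmin(np.abs(centroids[km.labels_][:, None] - ideal[None, :]), axis=1)]
        print(f"symbol error rate after k-means decision: {np.mean(decided != tx):.4f}")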

  20. A technique for visualizing electrostatic fields based on their topological structures

    International Nuclear Information System (INIS)

    Handa, Susumu

    2004-01-01

    In molecular science, visualization techniques based on computer graphics are now well established as a tool for interpreting simulation results, since molecules are complicated in their structures and mutual interactions. As a probe for studying such molecular interactions, electrostatic fields are considered to be useful. However, since they are given as 3D vector fields with complicated distributions, conventional drawing techniques are inadequate. In this article, a new approach based on topological structures in vector fields is presented for visualizing the electrostatic fields of molecules. The scheme is to select regions of interest using only the topological structures of the fields. An example of application to chemical reactions of an amino acid complex is presented to show how the scheme is used. (author)

  1. Electrical-Based Diagnostic Techniques for Assessing Insulation Condition in Aged Transformers

    Directory of Open Access Journals (Sweden)

    Issouf Fofana

    2016-08-01

    Full Text Available The condition of the internal cellulosic paper and oil insulation is of concern for the performance of power transformers. Over the years, a number of methods have been developed to diagnose and monitor the degradation/aging of the transformer internal insulation system. Some of this degradation/aging can be assessed from electrical responses. Currently, a variety of electrical-based diagnostic techniques are available for insulation condition monitoring of power transformers. In most cases, the electrical signals being monitored are due to mechanical or electrical changes caused by physical changes in resistivity, inductance or capacitance, moisture, contamination or aging by-products in the insulation. This paper presents a description of commonly used and modern electrical-based diagnostic techniques along with their interpretation schemes.

  2. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for the design and verification of Field Programmable Gate Array (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C systems design, development, verification and validation (V and V) are discussed, considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic design for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms used in FPGA-based reactor trip systems is proposed. The results of applying test-based techniques to the assessment of FPGA electronic design algorithms for the reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  3. Experimental Study on Damage Detection in Timber Specimens Based on an Electromechanical Impedance Technique and RMSD-Based Mahalanobis Distance

    Directory of Open Access Journals (Sweden)

    Dansheng Wang

    2016-10-01

    Full Text Available In the electromechanical impedance (EMI) method, the PZT patch performs the functions of both sensor and exciter. Due to its high-frequency actuation and non-model-based characteristics, the EMI method can be utilized to detect incipient structural damage. In recent years EMI techniques have been widely applied to monitor the health status of concrete and steel materials; however, studies on application to timber are limited. This paper explores the feasibility of using the EMI technique for damage detection in timber specimens. In addition, the conventional damage index, namely root mean square deviation (RMSD), is employed to evaluate the level of damage. On that basis, a new damage index, the Mahalanobis distance based on RMSD, is proposed to evaluate the damage severity of timber specimens. Experimental studies are implemented to detect notch and hole damage in the timber specimens. Experimental results verify the availability and robustness of the proposed damage index and its superiority over the RMSD index.
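
    The conventional RMSD index used above is computed from a baseline and a current conductance signature as sketched below; the signatures are simulated, and the proposed RMSD-based Mahalanobis distance index is not reproduced here.

        import numpy as np

        def rmsd_index(baseline, current):
            """RMSD damage index (%) between a healthy-state conductance signature and a
            current signature measured over the same frequency sweep."""
            baseline = np.asarray(baseline, dtype=float)
            current = np.asarray(current, dtype=float)
            return 100.0 * np.sqrt(np.sum((current - baseline) ** 2) / np.sum(baseline ** 2))

        # Simulated conductance signatures over a 30-40 kHz sweep (invented numbers)
        freqs = np.linspace(30e3, 40e3, 400)
        healthy = 1e-3 * (1 + 0.2 * np.sin(freqs / 800.0))
        damaged = healthy * (1 + 0.03 * np.sin(freqs / 350.0))   # damage shifts the resonance peaks
        print(f"RMSD = {rmsd_index(healthy, damaged):.2f} %")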

  4. MLVA Typing of Streptococcus pneumoniae Isolates with Emphasis on Serotypes 14, 9N and 9V: Comparison of Previously Described Panels and Proposal of a Novel 7 VNTR Loci-Based Simplified Scheme.

    Science.gov (United States)

    Costa, Natália S; Pinto, Tatiana C A; Merquior, Vânia L C; Castro, Luciana F S; da Rocha, Filomena S P; Morais, Jaqueline M; Peralta, José M; Teixeira, Lúcia M

    2016-01-01

    Streptococcus pneumoniae remains as an important cause of community-acquired bacterial infections, and the nasopharynx of asymptomatic carriers is the major reservoir of this microorganism. Pneumococcal strains of serotype 14 and serogroup 9 are among the most frequently isolated from both asymptomatic carriers and patients with invasive disease living in Brazil. Internationally disseminated clones belonging to such serotypes have been associated with the emergence and spread of antimicrobial resistance in our setting, highlighting the need for epidemiological tracking of these isolates. In this scenario, Multiple Loci VNTR Analysis (MLVA) has emerged as an alternative tool for the molecular characterization of pneumococci, in addition to more traditional techniques such as Multi-Locus Sequence Typing (MLST) and Pulsed-Field Gel Electrophoresis (PFGE). In the present study, 18 VNTR loci, as well as other previously described reduced MLVA panels (7 VNTR loci), were evaluated as tools to characterize pneumococcal strains of serotypes 14, 9N and 9V belonging to international and regional clones isolated in Brazil. The 18 VNTR loci panel was highly congruent with MLST and PFGE, being also useful for indicating the genetic relationship with international clones and for discriminating among strains with indistinguishable STs and PFGE profiles. Analysis of the results also allowed deducing a novel shorter 7 VNTR loci panel, keeping a high discriminatory power for isolates of the serotypes investigated and a high congruence level with MLST and PFGE. The newly proposed simplified panel was then evaluated for typing pneumococcal strains of other commonly isolated serotypes. The results indicate that MLVA is a faster and easier to perform, reliable approach for the molecular characterization of S. pneumoniae isolates, with potential for cost-effective application, especially in resource-limited countries.

  5. Comparison of small-group training with self-directed internet-based training in inhaler techniques.

    Science.gov (United States)

    Toumas, Mariam; Basheti, Iman A; Bosnic-Anticevich, Sinthia Z

    2009-08-28

    To compare the effectiveness of small-group training in correct inhaler technique with self-directed Internet-based training. Pharmacy students were randomly allocated to 1 of 2 groups: small-group training (n = 123) or self-directed Internet-based training (n = 113). Prior to intervention delivery, all participants were given a placebo Turbuhaler and product information leaflet and received inhaler technique training based on their group. Technique was assessed following training and predictors of correct inhaler technique were examined. There was a significant improvement in the number of participants demonstrating correct technique in both groups (small-group training, 12% to 63%; self-directed Internet-based training, 9% to 59%), with no significant difference between the 2 groups in the percent change (n = 234, p > 0.05). Increased student confidence following the intervention was a predictor for correct inhaler technique. Self-directed Internet-based training is as effective as small-group training in improving students' inhaler technique.

  6. Techniques for Handling and Removal of Spectral Channels in Fourier Transform Synchrotron-Based Spectra

    International Nuclear Information System (INIS)

    Ibrahim, Amr; Predoi-Cross, Adriana; Teillet, Philippe M.

    2010-01-01

    Channel spectra are a big problem for those attempting to use synchrotron-based Fourier transform spectra for spectral lineshape studies. Due to the layout of the optical system at the CLS far-infrared beamline, the synchrotron beam undergoes unavoidable multiple reflections on the steering mirrors, beam splitter, several sets of windows, and filters. We present a method for eliminating channel spectra and compare the results of our technique with other methods available in the literature.

  7. Development of remote handling system based on 3-D shape recognition technique

    International Nuclear Information System (INIS)

    Tomizuka, Chiaki; Takeuchi, Yutaka

    2006-01-01

    In a nuclear facility, the maintenance and repair activities must be done remotely in a radioactive environment. Fuji Electric Systems Co., Ltd. has developed a remote handling system based on 3-D recognition technique. The system recognizes the pose and position of the target to manipulate, and visualizes the scene with the target in 3-D, enabling an operator to handle it easily. This paper introduces the concept and the key features of this system. (author)

  8. Spectral interference of zirconium on 24 analyte elements using CCD based ICP-AES technique

    International Nuclear Information System (INIS)

    Adya, V.C.; Sengupta, Arijit; Godbole, S.V.

    2014-01-01

    In the present studies, the spectral interference of zirconium on different analytical lines of 24 critical analytes using CCD based ICP-AES technique is described. Suitable analytical lines for zirconium were identified along with their detection limits. The sensitivity and the detection limits of analytical channels for different elements in presence of Zr matrix were calculated. Subsequently analytical lines with least interference from Zr and better detection limits were selected for their determinations. (author)

  9. Temperature Control of Gas Chromatograph Based on Switched Delayed System Techniques

    Directory of Open Access Journals (Sweden)

    Xiao-Liang Wang

    2014-01-01

    Full Text Available We address the temperature control problem of the gas chromatograph. We model the temperature control system of the gas chromatograph as a switched delayed system and analyze its stability by the common Lyapunov functional technique. The PI controller parameters can be chosen based on the proposed linear matrix inequality (LMI) condition, and the designed controller makes the temperature of the gas chromatograph track the reference signal asymptotically. An experiment is given to illustrate the effectiveness of the stability criterion.

  10. Effective Design for Optical CDMA Based on Radio over Fiber (RoF) Technique

    Directory of Open Access Journals (Sweden)

    Rashidi C. B. M.

    2017-01-01

    Full Text Available In this paper, the performance of OCDMA coding systems utilizing the radio over fiber (RoF) technique is presented. This was done by means of conventional OptiSystem simulation tools, where the propagation of radio signals up to 50 km over standard single mode fiber (SMF) was investigated. The analysis was made based on the eye diagram, bit rate, bit error rate and received optical power.

  11. All-optical optoacoustic microscopy based on probe beam deflection technique

    OpenAIRE

    Maswadi, Saher M.; Ibey, Bennett L.; Roth, Caleb C.; Tsyboulski, Dmitri A.; Beier, Hope T.; Glickman, Randolph D.; Oraevsky, Alexander A.

    2016-01-01

    Optoacoustic (OA) microscopy using an all-optical system based on the probe beam deflection technique (PBDT) for detection of laser-induced acoustic signals was investigated as an alternative to conventional piezoelectric transducers. PBDT provides a number of advantages for OA microscopy including (i) efficient coupling of laser excitation energy to the samples being imaged through the probing laser beam, (ii) undistorted coupling of acoustic waves to the detector without the need for separa...

  12. Active Vibration damping of Smart composite beams based on system identification technique

    Science.gov (United States)

    Bendine, Kouider; Satla, Zouaoui; Boukhoulda, Farouk Benallel; Nouari, Mohammed

    2018-03-01

    In the present paper, the active vibration control of a composite beam using a piezoelectric actuator is investigated. The state-space equation is determined using a system identification technique based on the structure's input-output response provided by the ANSYS APDL finite element package. A Linear Quadratic Gaussian (LQG) control law is designed and integrated into ANSYS APDL to perform closed-loop simulations. Numerical examples for different types of excitation loads are presented to test the efficiency and accuracy of the proposed model.

  13. [A comprehensive approach to designing of magnetotherapy techniques based on the Atos device].

    Science.gov (United States)

    Raĭgorodskiĭ, Iu M; Semiachkin, G P; Tatarenko, D A

    1995-01-01

    The paper determines how to apply a comprehensive approach to designing magnetic therapeutical techniques based on concomitant exposures to two or more physical factors. It shows the advantages of the running pattern of a magnetic field and photostimuli in terms of optimization of physiotherapeutical exposures. An Atos apparatus with an Amblio-1 attachment is used as an example to demonstrate how to apply the comprehensive approach for ophthalmology.

  14. A NEW RECOGNITION TECHNIQUE NAMED SOMP BASED ON PALMPRINT USING NEURAL NETWORK BASED SELF ORGANIZING MAPS

    Directory of Open Access Journals (Sweden)

    A. S. Raja

    2012-08-01

    Full Text Available The word biometrics refers to the use of physiological or biological characteristics of humans to recognize and verify the identity of an individual. The palmprint has become a new class of human biometrics for passive identification, with uniqueness and stability. It is considered reliable due to the lack of expressions and the lesser effect of aging. In this manuscript a new palmprint-based biometric system based on neural-network self-organizing maps (SOM) is presented. The method is named SOMP. The paper shows that the proposed SOMP method improves the performance and robustness of recognition. The proposed method is applied to a variety of datasets and the results are shown.

  15. Vibration measurement-based simple technique for damage detection of truss bridges: A case study

    Directory of Open Access Journals (Sweden)

    Sudath C. Siriwardane

    2015-10-01

    Full Text Available Bridges experience increasing traffic volume and weight, deterioration of components and a large number of stress cycles. Therefore, assessment of the current condition of steel railway bridges becomes necessary. Most of the commonly available approaches for structural health monitoring are based on visual inspection and non-destructive testing methods. Visual inspection is unreliable as it depends on the judgment and experience of the inspectors. In addition, the non-destructive testing methods are found to be expensive. Recent research has therefore noted that dynamic modal parameters or vibration measurement-based structural health monitoring methods are economical and may also provide more realistic predictions of the damage state of civil infrastructure. This paper therefore proposes a simple technique to locate the damage region of railway truss bridges based on measured modal parameters. The technique is discussed with a case study. Initially the paper describes the details of the considered railway bridge. Then observations from visual inspection, material testing and in situ load testing are discussed in separate sections. The development of a validated finite element model of the considered bridge is comprehensively discussed. Then, variations of modal parameters versus the position of the damage are plotted. These plots are considered the main reference for locating damage to the railway bridge in future periodic inspections by comparing the corresponding measured modal parameters. Finally the procedure of periodic vibration measurement and the damage locating technique are clearly illustrated.

  16. 3D-TV System with Depth-Image-Based Rendering Architectures, Techniques and Challenges

    CERN Document Server

    Zhao, Yin; Yu, Lu; Tanimoto, Masayuki

    2013-01-01

    Riding on the success of 3D cinema blockbusters and advances in stereoscopic display technology, 3D video applications have gathered momentum in recent years. 3D-TV System with Depth-Image-Based Rendering: Architectures, Techniques and Challenges surveys depth-image-based 3D-TV systems, which are expected to be put into applications in the near future. Depth-image-based rendering (DIBR) significantly enhances the 3D visual experience compared to stereoscopic systems currently in use. DIBR techniques make it possible to generate additional viewpoints using 3D warping techniques to adjust the perceived depth of stereoscopic videos and provide for auto-stereoscopic displays that do not require glasses for viewing the 3D image.   The material includes a technical review and literature survey of components and complete systems, solutions for technical issues, and implementation of prototypes. The book is organized into four sections: System Overview, Content Generation, Data Compression and Transmission, and 3D V...

  17. Advanced Laser-Based Techniques for Gas-Phase Diagnostics in Combustion and Aerospace Engineering.

    Science.gov (United States)

    Ehn, Andreas; Zhu, Jiajian; Li, Xuesong; Kiefer, Johannes

    2017-03-01

    Gaining information of species, temperature, and velocity distributions in turbulent combustion and high-speed reactive flows is challenging, particularly for conducting measurements without influencing the experimental object itself. The use of optical and spectroscopic techniques, and in particular laser-based diagnostics, has shown outstanding abilities for performing non-intrusive in situ diagnostics. The development of instrumentation, such as robust lasers with high pulse energy, ultra-short pulse duration, and high repetition rate along with digitized cameras exhibiting high sensitivity, large dynamic range, and frame rates on the order of MHz, has opened up for temporally and spatially resolved volumetric measurements of extreme dynamics and complexities. The aim of this article is to present selected important laser-based techniques for gas-phase diagnostics focusing on their applications in combustion and aerospace engineering. Applicable laser-based techniques for investigations of turbulent flows and combustion such as planar laser-induced fluorescence, Raman and Rayleigh scattering, coherent anti-Stokes Raman scattering, laser-induced grating scattering, particle image velocimetry, laser Doppler anemometry, and tomographic imaging are reviewed and described with some background physics. In addition, demands on instrumentation are further discussed to give insight in the possibilities that are offered by laser flow diagnostics.

  18. Comparison of acrylamide intake from Western and guideline based diets using probabilistic techniques and linear programming.

    Science.gov (United States)

    Katz, Josh M; Winter, Carl K; Buttrey, Samuel E; Fadel, James G

    2012-03-01

    Western and guideline based diets were compared to determine if dietary improvements resulting from following dietary guidelines reduce acrylamide intake. Acrylamide forms in heat treated foods and is a human neurotoxin and animal carcinogen. Acrylamide intake from the Western diet was estimated with probabilistic techniques using teenage (13-19 years) National Health and Nutrition Examination Survey (NHANES) food consumption estimates combined with FDA data on the levels of acrylamide in a large number of foods. Guideline based diets were derived from NHANES data using linear programming techniques to comport to recommendations from the Dietary Guidelines for Americans, 2005. Whereas the guideline based diets were more properly balanced and rich in consumption of fruits, vegetables, and other dietary components than the Western diets, acrylamide intake (mean±SE) was significantly greater from the guideline based diets than from the Western diets. The results demonstrate that linear programming techniques can be used to model specific diets for the assessment of toxicological and nutritional dietary components. Copyright © 2011 Elsevier Ltd. All rights reserved.
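
    The guideline-based diet derivation relies on linear programming; the toy scipy sketch below chooses food servings meeting minimum nutrient targets and then evaluates the acrylamide intake of the solution. All food, nutrient and acrylamide numbers are invented and are not the NHANES or FDA values used in the study.

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical foods (columns) and nutrients per serving (rows); all numbers invented
        foods = ["potato products", "cereal", "vegetables", "fruit", "dairy"]
        nutrients = np.array([
            [2.0, 3.0, 1.0, 1.0, 0.5],     # fiber (g/serving)
            [20., 25., 10., 15., 12.],     # energy (arbitrary units/serving)
            [0.1, 0.2, 0.8, 0.9, 0.3],     # "produce score" per serving
        ])
        nutrient_min = np.array([25.0, 180.0, 5.0])        # invented daily minimums
        acrylamide = np.array([15.0, 8.0, 0.5, 0.2, 0.0])  # invented ug/serving

        # Minimize total servings subject to nutrient minimums (written as A_ub x <= b_ub)
        res = linprog(c=np.ones(len(foods)), A_ub=-nutrients, b_ub=-nutrient_min,
                      bounds=[(0, 10)] * len(foods), method="highs")

        print(dict(zip(foods, res.x.round(2))))
        print(f"acrylamide intake of the modeled diet: {acrylamide @ res.x:.1f} ug/day")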

  19. Remediation of textile effluents by membrane based treatment techniques: a state of the art review.

    Science.gov (United States)

    Dasgupta, Jhilly; Sikder, Jaya; Chakraborty, Sudip; Curcio, Stefano; Drioli, Enrico

    2015-01-01

    The textile industries hold an important position in the global industrial arena because of their undeniable contributions to basic human needs satisfaction and to the world economy. These industries are however major consumers of water, dyes and other toxic chemicals. The effluents generated from each processing step comprise substantial quantities of unutilized resources. The effluents if discharged without prior treatment become potential sources of pollution due to their several deleterious effects on the environment. The treatment of heterogeneous textile effluents therefore demands the application of environmentally benign technology with appreciable quality water reclamation potential. These features can be observed in various innovative membrane based techniques. The present review paper thus elucidates the contributions of membrane technology towards textile effluent treatment and unexhausted raw materials recovery. The reuse possibilities of water recovered through membrane based techniques, such as ultrafiltration and nanofiltration in primary dye houses or auxiliary rinse vats have also been explored. Advantages and bottlenecks, such as membrane fouling associated with each of these techniques have also been highlighted. Additionally, several pragmatic models simulating transport mechanism across membranes have been documented. Finally, various accounts dealing with techno-economic evaluation of these membrane based textile wastewater treatment processes have been provided. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Development Model of Basic Technique Skills Training Shot-Put Obrien Style Based Biomechanics Review

    Directory of Open Access Journals (Sweden)

    danang rohmat hidayanto

    2018-03-01

    Full Text Available The background of this research is the unavailability of a learning model for the basic technique of the O'Brien style shot put, integrated into a skill-based program grounded in biomechanical study, that can be used as a reference for building students' basic technique skills in the O'Brien style. The purpose of this study is to develop a model for training the basic technique of the O'Brien style shot put, based on biomechanical studies, for beginner levels, including the basic preparatory technique, glide, final stage, put (release), follow-through and overall O'Brien-style performance, all of which are arranged in a medium that is easily accessible at any time, by anyone and anywhere, especially in SMK Negeri 1 Kalijambe Sragen. The research method used is the "Research and Development" approach. Preliminary studies show that 43.0% of respondents considered the O'Brien style very important to develop with a model of skill-based exercise grounded in biomechanics, and 40.0% of respondents stated that it is important to develop with biomechanics-based learning media. Therefore, it was deemed necessary to develop learning media for O'Brien-style training skills based on biomechanical studies. Development of the media starts from the design of the storyboard and script that will be used as media. The design of this model is called the draft model. Draft models that have been prepared were reviewed by a multimedia expert and an O'Brien style expert to establish the product's validity. A total of 78.24% of experts declared the product viable with some input. In the small group test with n = 6, a value of 72.2% was obtained, or valid enough to be tested in large groups. In the large group test with n = 12, a value of 70.83% was obtained, or feasible enough to be tested in the field. In the field test, an experimental group was prepared with treatment according to the media and a control group with free treatment. From result of counting of significance test can be

  1. Linear accelerator-based intensity-modulated total marrow irradiation technique for treatment of hematologic malignancies: a dosimetric feasibility study.

    Science.gov (United States)

    Yeginer, Mete; Roeske, John C; Radosevich, James A; Aydogan, Bulent

    2011-03-15

    To investigate the dosimetric feasibility of linear accelerator-based intensity-modulated total marrow irradiation (IM-TMI) in patients with hematologic malignancies. Linear accelerator-based IM-TMI treatment planning was performed for 9 patients using the Eclipse treatment planning system. The planning target volume (PTV) consisted of all the bones in the body from the head to the mid-femur, except for the forearms and hands. Organs at risk (OAR) to be spared included the lungs, heart, liver, kidneys, brain, eyes, oral cavity, and bowel and were contoured by a physician on the axial computed tomography images. The three-isocenter technique previously developed by our group was used for treatment planning. We developed and used a common dose-volume objective method to reduce the planning time and planner subjectivity in the treatment planning process. Coverage of 95% of the PTV with 99% of the prescribed dose of 12 Gy was achieved for all nine patients. The average dose reduction in OAR ranged from 19% for the lungs to 68% for the lenses. The common dose-volume objective method decreased the planning time by an average of 35% and reduced the inter- and intra-planner subjectivity. The results from the present study suggest that the linear accelerator-based IM-TMI technique is clinically feasible. We have demonstrated that linear accelerator-based IM-TMI plans with good PTV coverage and improved OAR sparing can be obtained within a clinically reasonable time using the common dose-volume objective method proposed in the present study. Copyright © 2011. Published by Elsevier Inc.

  2. Carbon Dioxide Capture and Separation Techniques for Gasification-based Power Generation Point Sources

    Energy Technology Data Exchange (ETDEWEB)

    Pennline, H.W.; Luebke, D.R.; Jones, K.L.; Morsi, B.I. (Univ. of Pittsburgh, PA); Heintz, Y.J. (Univ. of Pittsburgh, PA); Ilconich, J.B. (Parsons)

    2007-06-01

    The capture/separation step for carbon dioxide (CO2) from large-point sources is a critical one with respect to the technical feasibility and cost of the overall carbon sequestration scenario. For large-point sources, such as those found in power generation, the carbon dioxide capture techniques being investigated by the in-house research area of the National Energy Technology Laboratory possess the potential for improved efficiency and reduced costs as compared to more conventional technologies. The investigated techniques can have wide applications, but the research has focused on capture/separation of carbon dioxide from flue gas (post-combustion from fossil fuel-fired combustors) and from fuel gas (precombustion, such as integrated gasification combined cycle or IGCC). With respect to fuel gas applications, novel concepts are being developed in wet scrubbing with physical absorption; chemical absorption with solid sorbents; and separation by membranes. In one concept, a wet scrubbing technique is being investigated that uses a physical solvent process to remove CO2 from fuel gas of an IGCC system at elevated temperature and pressure. The need to define an ideal solvent has led to the study of the solubility and mass transfer properties of various solvents. Pertaining to another separation technology, fabrication techniques and mechanistic studies for membranes separating CO2 from the fuel gas produced by coal gasification are also being performed. Membranes that consist of CO2-philic ionic liquids encapsulated into a polymeric substrate have been investigated for permeability and selectivity. Finally, dry, regenerable processes based on sorbents are additional techniques for CO2 capture from fuel gas. An overview of these novel techniques is presented along with a research progress status of technologies related to membranes and physical solvents.

  3. Secure positioning technique based on encrypted visible light map for smart indoor service

    Science.gov (United States)

    Lee, Yong Up; Jung, Gillyoung

    2018-03-01

    Indoor visible light (VL) positioning systems for smart indoor services are negatively affected by both cochannel interference from adjacent light sources and VL reception position irregularity in the three-dimensional (3-D) VL channel. A secure positioning methodology based on a two-dimensional (2-D) encrypted VL map is proposed, implemented in prototypes of the specific positioning system, and analyzed based on performance tests. The proposed positioning technique enhances the positioning performance by more than 21.7% compared to the conventional method in real VL positioning tests. Further, the pseudonoise code is found to be the optimal encryption key for secure VL positioning for this smart indoor service.

  4. New Strategies for Powder Compaction in Powder-based Rapid Prototyping Techniques

    OpenAIRE

    Budding, A.; Vaneker, T.H.J.

    2013-01-01

    In powder-based rapid prototyping techniques, powder compaction is used to create thin layers of fine powder that are locally bonded. By stacking these layers of locally bonded material, an object is made. The compaction of thin layers of powder materials is of interest for a wide range of applications, but this study solely focuses on the application for powder-based three-dimensional printing (e.g. SLS, 3DP). This research is primarily interested in powder compaction for creating membrane...

  5. Constructing a Soil Class Map of Denmark based on the FAO Legend Using Digital Techniques

    DEFF Research Database (Denmark)

    Adhikari, Kabindra; Minasny, Budiman; Greve, Mette Balslev

    2014-01-01

    Soil mapping in Denmark has a long history and a series of soil maps based on conventional mapping approaches have been produced. In this study, a national soil map of Denmark was constructed based on the FAO–Unesco Revised Legend 1990 using digital soil mapping techniques, existing soil profile......) confirmed that the output is reliable and can be used in various soil and environmental studies without major difficulties. This study also verified the importance of GlobalSoilMap products and a priori pedological information that improved prediction performance and quality of the new FAO soil map...

  6. Multi-disciplinary techniques for understanding time-varying space-based imagery

    Science.gov (United States)

    Casasent, D.; Sanderson, A.; Kanade, T.

    1984-06-01

    A multidisciplinary program for space-based image processing is reported. This project combines optical and digital processing techniques and pattern recognition, image understanding and artificial intelligence methodologies. Time change image processing was recognized as the key issue to be addressed. Three time change scenarios were defined based on the frame rate of the data change. This report details the recent research on: various statistical and deterministic image features, recognition of sub-pixel targets in time varying imagery, and 3-D object modeling and recognition.

  7. The effect of numerical techniques on differential equation based chaotic generators

    KAUST Repository

    Zidan, Mohammed A.

    2012-07-29

    In this paper, we study the effect of the numerical solution accuracy on the digital implementation of differential chaos generators. Four systems are built on a Xilinx Virtex 4 FPGA using Euler, mid-point, and Runge-Kutta fourth order techniques. The twelve implementations are compared based on the FPGA area used, maximum throughput, maximum Lyapunov exponent, and autocorrelation confidence region. Based on circuit performance and the chaotic response of the different implementations, it was found that the less complicated numerical solutions have a better chaotic response and higher throughput.
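
    The sensitivity of a chaotic trajectory to the chosen numerical scheme can be illustrated by integrating the Lorenz system with Euler and classical RK4 from the same initial condition, as below; this is a generic floating-point demo, not the fixed-point FPGA implementation studied in the paper.

        import numpy as np

        def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        def euler_step(f, s, h):
            return s + h * f(s)

        def rk4_step(f, s, h):
            k1 = f(s)
            k2 = f(s + 0.5 * h * k1)
            k3 = f(s + 0.5 * h * k2)
            k4 = f(s + h * k3)
            return s + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        h, steps = 0.01, 5000
        se = sr = np.array([1.0, 1.0, 1.0])
        for _ in range(steps):
            se = euler_step(lorenz, se, h)
            sr = rk4_step(lorenz, sr, h)

        # For a chaotic system the two numerically generated trajectories separate quickly,
        # which is why the integration scheme changes the observed chaotic response.
        print(f"trajectory separation after {steps * h:.0f} time units: {np.linalg.norm(se - sr):.2f}")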

  8. Spectral-based features ranking for gamelan instruments identification using filter techniques

    Directory of Open Access Journals (Sweden)

    Diah P Wulandari

    2013-03-01

    Full Text Available In this paper, we describe an approach of spectral-based feature ranking for Javanese gamelan instrument identification using filter techniques. The model extracted a spectral-based feature set of the signal using the Short Time Fourier Transform (STFT). The rank of the features was determined using five algorithms, namely ReliefF, Chi-Squared, Information Gain, Gain Ratio, and Symmetric Uncertainty. Then, we tested the ranked features by cross validation using a Support Vector Machine (SVM). The experiment showed that the Gain Ratio algorithm gave the best result; it yielded an accuracy of 98.93%.
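
    The pipeline (STFT-based spectral features, filter-based ranking, SVM with cross validation) can be sketched on synthetic tones as below; mutual information stands in for the five filter algorithms listed, and the signals are not gamelan recordings.

        import numpy as np
        from scipy.signal import stft
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(6)
        fs = 8000

        def synth_tone(f0):
            """Synthetic harmonic tone standing in for one instrument note."""
            t = np.arange(0, 0.5, 1 / fs)
            sig = sum(np.sin(2 * np.pi * f0 * k * t) / k for k in range(1, 5))
            return sig + 0.3 * rng.standard_normal(len(t))

        def spectral_features(sig):
            """Mean magnitude of each STFT frequency bin as a simple spectral feature vector."""
            _, _, z = stft(sig, fs=fs, nperseg=256)
            return np.abs(z).mean(axis=1)

        X, y = [], []
        for label, f0 in enumerate([220.0, 330.0, 440.0]):    # three hypothetical instruments
            for _ in range(60):
                X.append(spectral_features(synth_tone(f0 * rng.uniform(0.98, 1.02))))
                y.append(label)
        X, y = np.array(X), np.array(y)

        # Filter-based ranking: keep the 20 bins with the highest mutual information
        top = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1][:20]
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        print("cross-validated accuracy:", cross_val_score(clf, X[:, top], y, cv=5).mean())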

  9. Critical test of isotropic periodic sum techniques with group-based cut-off schemes.

    Science.gov (United States)

    Nozawa, Takuma; Yasuoka, Kenji; Takahashi, Kazuaki Z

    2018-03-08

    Truncation is still chosen for many long-range intermolecular interaction calculations to efficiently compute free-boundary systems, macromolecular systems and net-charge molecular systems, for example. Advanced truncation methods have been developed for long-range intermolecular interactions. Every truncation method can be implemented as one of two basic cut-off schemes, namely either an atom-based or a group-based cut-off scheme. The former computes interactions of "atoms" inside the cut-off radius, whereas the latter computes interactions of "molecules" inside the cut-off radius. In this work, the effect of group-based cut-off is investigated for isotropic periodic sum (IPS) techniques, which are promising cut-off treatments to attain advanced accuracy for many types of molecular system. The effect of group-based cut-off is clearly different from that of atom-based cut-off, and severe artefacts are observed in some cases. However, no severe discrepancy from the Ewald sum is observed with the extended IPS techniques.

  10. TU-CD-304-08: Feasibility of a VMAT-Based Spatially Fractionated Grid Therapy Technique

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, B; Liu, M; Huang, Y; Kim, J; Brown, S; Siddiqui, F; Chetty, I; Wen, N [Henry Ford Health System, Detroit, MI (United States); Jin, J [Georgia Regents University, Augusta, GA (Georgia)

    2015-06-15

    Purpose: Grid therapy (GT) uses spatially modulated radiation doses to treat large tumors without significant toxicities. Incorporating 3D conformal-RT or IMRT improved single-field GT by reducing dose to normal tissues spatially through the use of multiple fields. The feasibility of a MLC-based, inverse-planned multi-field GT technique has been demonstrated. Volumetric modulated arc therapy (VMAT) provides conformal dose distributions with the additional potential advantage of reduced treatment times. In this study, we characterize a new VMAT-based GT (VMAT-GT) technique with respect to its deliverability and dosimetric accuracy. Methods: A lattice of 5mm-diameter spheres was created as the boost volume within a large treatment target. A simultaneous boost VMAT (RapidArc) plan with 8Gy to the target and 20Gy to the boost volume was generated using the Eclipse treatment planning system (AAA-v11). The linac utilized HD120 MLC and 6MV flattening-filter free beam. Four non-coplanar arcs, with couch angles at 0, 45, 90 and 317° were used. Collimator angles were at 45 and 315°. The plan was mapped to a phantom. Calibrated Gafchromic EBT3 films were used to measure the delivered dose. Results: The VMAT plan generated a highly spatially modulated dose distribution in the target. D95%, D50%, D5% for the spheres and the targets in Gy were 18.9, 20.6, 23 and 8.0, 9.6, 14.8, respectively. D50% for a 1cm ring 1cm outside the target was 3.0Gy. The peak-to-valley ratio of this technique is comparable to previously proposed techniques, but the MUs were reduced by almost 50%. Film dosimetry showed good agreement between calculated and delivered dose, with an overall gamma passing rate of >98% (3% and 1mm). The point dose differences at sphere centers varied from 2–8%. Conclusion: The deliverability and dose calculation accuracy of the proposed VMAT-GT technique demonstrates that ablative radiation doses are deliverable to large tumors safely and efficiently.

  11. Virtual Lead Identification of Farnesyltransferase Inhibitors Based on Ligand and Structure-Based Pharmacophore Techniques

    Directory of Open Access Journals (Sweden)

    Nizar M. Mhaidat

    2013-05-01

    Full Text Available Farnesyltransferase enzyme (FTase) is considered an essential enzyme in the Ras signaling pathway associated with cancer. Thus, designing inhibitors for this enzyme might lead to the discovery of compounds with effective anticancer activity. In an attempt to obtain effective FTase inhibitors, pharmacophore hypotheses were generated using structure-based and ligand-based approaches built in Discovery Studio v3.1. Since the zinc feature is known to be essential for inhibitor binding to the active site of the FTase enzyme, further customization was applied to include this feature in the generated pharmacophore hypotheses. These pharmacophore hypotheses were thoroughly validated using various procedures such as ROC analysis and ligand pharmacophore mapping. The validated pharmacophore hypotheses were used to screen 3D databases to identify possible hits. Hits that were highly ranked and showed sufficient ability to bind the zinc feature in the active site were further refined by applying drug-like criteria such as Lipinski's “rule of five” and ADMET filters. Finally, the two candidate compounds (ZINC39323901 and ZINC01034774) were docked into the active site of the FTase enzyme using CDOCKER and GOLD to optimize hit selection.

  12. When are network coding based dynamic multi-homing techniques beneficial?

    DEFF Research Database (Denmark)

    Pereira, Carlos; Aguiar, Ana; Roetter, Daniel Enrique Lucani

    2016-01-01

    Mechanisms that can cope with unreliable wireless channels in an efficient manner are required due to the increasing number of resource constrained devices. Concurrent use of multiple communications technologies can be instrumental towards improving services to mobile devices in heterogeneous...... networks. In our previous work, we developed an optimization framework to generate channel-aware transmission policies for multi-homed devices under different cost criteria. Our formulation considers network coding as a key technique that simplifies load allocation across multiple channels and provides...... high resiliency under time-varying channel conditions. This paper seeks to explore the parameter space and identify the operating regions where dynamic coded policies offer most improvement over static ones in terms of energy consumption and channel utilization. We leverage meta-heuristics to find...

  13. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. The device of an array of linear polarizers is introduced to achieve linear truncation on the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction based on two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method against a known public key attack is also provided.

  14. Novel Machine Learning-Based Techniques for Efficient Resource Allocation in Next Generation Wireless Networks

    KAUST Repository

    AlQuerm, Ismail A.

    2018-02-21

    resources management in diverse wireless networks. The core operation of the proposed architecture is decision-making for resource allocation and adaptation of system parameters. Thus, we develop the decision-making mechanism using different artificial intelligence techniques, evaluate the performance achieved and determine the tradeoff of using one technique over the others. The techniques include decision trees, a genetic algorithm, a hybrid engine based on decision trees and case-based reasoning, and a supervised engine with a machine learning contribution to determine the ultimate technique that suits the current environment conditions. All the proposed techniques are evaluated using a testbed implementation in different topologies and scenarios. LTE networks have been considered as a potential environment for demonstration of our proposed cognitive-based resource allocation techniques, as they lack radio resource management. In addition, we explore the use of enhanced online learning to perform efficient resource allocation in the upcoming 5G networks to maximize energy efficiency and data rate. The considered 5G structures are heterogeneous multi-tier networks with device-to-device communication and heterogeneous cloud radio access networks. We propose power and resource block allocation schemes to maximize energy efficiency and data rate in heterogeneous 5G networks. Moreover, traffic offloading from large cells to small cells in 5G heterogeneous networks is investigated, and an online-learning-based traffic offloading strategy is developed to enhance energy efficiency. The energy efficiency problem in heterogeneous cloud radio access networks is tackled using online learning in both centralized and distributed fashions. The proposed online learning comprises improvement features that reduce the algorithms' complexities and enhance the performance achieved.

  15. SLIM-MAUD - a computer based technique for human reliability assessment

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1985-01-01

    The Success Likelihood Index Methodology (SLIM) is a widely applicable technique which can be used to assess human error probabilities in both proceduralized and cognitive tasks (i.e. those involving decision making, problem solving, etc.). It assumes that expert assessors are able to evaluate the relative importance (or weights) of different factors called Performance Shaping Factors (PSFs), in determining the likelihood of error for the situations being assessed. Typical PSFs are the extent to which good procedures are available, operators are adequately trained, the man-machine interface is well designed, etc. If numerical ratings are made of the PSFs for the specific tasks being evaluated, these can be combined with the weights to give a numerical index, called the Success Likelihood Index (SLI). The SLI represents, in numerical form, the overall assessment of the experts of the likelihood of task success. The SLI can be subsequently transformed to a corresponding human error probability (HEP) estimate. The latest form of the SLIM technique is implemented using a microcomputer based system called MAUD (Multi-Attribute Utility Decomposition), the resulting technique being called SLIM-MAUD. A detailed description of the SLIM-MAUD technique and case studies of applications are available. An illustrative example of the application of SLIM-MAUD in probabilistic risk assessment is given
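
    For illustration, the core SLIM arithmetic (a weighted sum of PSF ratings followed by a log-linear calibration to an error probability) can be sketched in a few lines of Python; the weights, ratings and calibration anchors below are invented numbers, not values from the methodology or from MAUD.

      import math

      def success_likelihood_index(weights, ratings):
          """Combine normalized PSF weights with task ratings into an SLI."""
          total = sum(weights)
          return sum(w / total * r for w, r in zip(weights, ratings))

      def sli_to_hep(sli, anchors):
          """Log-linear calibration log10(HEP) = a*SLI + b, fixed by two
          reference tasks with known SLI and human error probability."""
          (sli1, hep1), (sli2, hep2) = anchors
          a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
          b = math.log10(hep1) - a * sli1
          return 10 ** (a * sli + b)

      # Hypothetical PSFs: procedures, training, interface quality.
      weights = [0.5, 0.3, 0.2]          # relative importance judged by the experts
      ratings = [7, 4, 6]                # ratings of the task on each PSF (1-9 scale)
      sli = success_likelihood_index(weights, ratings)
      hep = sli_to_hep(sli, anchors=[(9.0, 1e-4), (1.0, 1e-1)])
      print(f"SLI = {sli:.2f}, estimated HEP = {hep:.2e}")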

  16. Evaluation of Clipping Based Iterative PAPR Reduction Techniques for FBMC Systems

    Directory of Open Access Journals (Sweden)

    Zsolt Kollár

    2014-01-01

    to conventional orthogonal frequency division multiplexing (OFDM) technique. The low ACLR of the transmitted FBMC signal makes it especially favorable in cognitive radio applications, where strict requirements are posed on out-of-band radiation. A large dynamic range, resulting in a high peak-to-average power ratio (PAPR), is characteristic of all sorts of multicarrier signals. The advantageous spectral properties of the high-PAPR FBMC signal are significantly degraded if nonlinearities are present in the transceiver chain. Spectral regrowth may appear, causing harmful interference in the neighboring frequency bands. This paper presents novel clipping based PAPR reduction techniques, evaluated and compared by simulations and measurements, with an emphasis on spectral aspects. The paper gives an overall comparison of PAPR reduction techniques, focusing on the reduction of the dynamic range of FBMC signals without increasing out-of-band radiation. An overview is presented of transmitter-oriented techniques employing baseband clipping, which can maintain the system performance with a desired bit error rate (BER).
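
    As a baseband illustration of the clipping idea discussed above, the sketch below clips the magnitude of a generic oversampled multicarrier signal (a simple OFDM-like stand-in rather than a full FBMC transmitter) and reports the PAPR before and after; the clipping ratio and signal parameters are arbitrary.

      import numpy as np

      def papr_db(x):
          """Peak-to-average power ratio of a complex baseband signal, in dB."""
          return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

      def clip(x, clipping_ratio_db):
          """Amplitude clipping: magnitudes above the threshold are limited,
          phases are preserved."""
          threshold = 10 ** (clipping_ratio_db / 20) * np.sqrt(np.mean(np.abs(x) ** 2))
          clipped_mag = np.minimum(np.abs(x), threshold)
          return clipped_mag * np.exp(1j * np.angle(x))

      rng = np.random.default_rng(0)
      symbols = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
      x = np.fft.ifft(np.concatenate([symbols, np.zeros(192)]))   # 4x oversampled in time
      print(f"PAPR before clipping: {papr_db(x):.1f} dB")
      print(f"PAPR after  clipping: {papr_db(clip(x, 3.0)):.1f} dB")

    In a transmitter-oriented scheme of the kind evaluated in the paper, such a clipping step would be followed by filtering and possibly iterated, since clipping alone re-introduces some out-of-band radiation.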

  17. Compressed sensing techniques for receiver based post-compensation of transmitter's nonlinear distortions in OFDM systems

    KAUST Repository

    Owodunni, Damilola S.

    2014-04-01

    In this paper, compressed sensing techniques are proposed to linearize commercial power amplifiers driven by orthogonal frequency division multiplexing signals. The nonlinear distortion is considered as a sparse phenomenon in the time-domain, and three compressed sensing based algorithms are presented to estimate and compensate for these distortions at the receiver using a few and, at times, even no frequency-domain free carriers (i.e. pilot carriers). The first technique is a conventional compressed sensing approach, while the second incorporates a priori information about the distortions to enhance the estimation. Finally, the third technique involves an iterative data-aided algorithm that does not require any pilot carriers and hence allows the system to work at maximum bandwidth efficiency. The performances of all the proposed techniques are evaluated on a commercial power amplifier and compared. The error vector magnitude and symbol error rate results show the ability of compressed sensing to compensate for the amplifier's nonlinear distortions. © 2013 Elsevier B.V.
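
    For readers unfamiliar with the sparse-recovery machinery behind such schemes, the following self-contained Python sketch implements plain orthogonal matching pursuit on synthetic data; it is a generic textbook routine, not the paper's pilot-aided or data-aided estimators.

      import numpy as np

      def omp(A, y, sparsity):
          """Basic orthogonal matching pursuit: find a sparse x with y ~= A @ x."""
          residual = y.copy()
          support, coeffs = [], np.zeros(0, dtype=A.dtype)
          for _ in range(sparsity):
              # The column most correlated with the residual joins the support.
              idx = int(np.argmax(np.abs(A.conj().T @ residual)))
              if idx not in support:
                  support.append(idx)
              # Least-squares fit on the support, then update the residual.
              coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
              residual = y - A[:, support] @ coeffs
          x = np.zeros(A.shape[1], dtype=A.dtype)
          x[support] = coeffs
          return x

      rng = np.random.default_rng(1)
      n, m, k = 64, 256, 5                     # measurements, unknowns, sparsity
      A = (rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))) / np.sqrt(2 * n)
      x_true = np.zeros(m, dtype=complex)
      nz = rng.choice(m, k, replace=False)
      x_true[nz] = rng.standard_normal(k) + 1j * rng.standard_normal(k)
      x_hat = omp(A, A @ x_true, k)
      print("support recovered:", set(np.flatnonzero(x_hat)) == set(nz))
      print("reconstruction error:", np.linalg.norm(x_hat - x_true))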

  18. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    Science.gov (United States)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

    The shortcomings of existing methods for determining how technologically interlinked construction and installation processes can be combined are considered under the conditions of modern construction of various facilities. The need to identify common parameters that characterize how all technologically related construction and installation processes interact is shown. The technologies of construction and installation processes for buildings and structures are studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The outcome of this research is a quantitative evaluation of the interaction of construction and installation processes, expressed as the minimum technologically necessary volume of a preceding process that allows a subsequent, technologically interconnected process to be planned and organized. This quantitative evaluation is used as the basis for calculating the optimum range over which processes and activities can be combined. The calculation method is based on graph theory. The authors apply a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that make it suitable for wide use in forming organizational decisions. The article also demonstrates the practical significance of the developed technique.

  19. Digital Control Techniques Based on Voltage Source Inverters in Renewable Energy Applications: A Review

    Directory of Open Access Journals (Sweden)

    Sohaib Tahir

    2018-02-01

    Full Text Available In the modern era, distributed generation is considered an alternative source for power generation. In particular, there is a pressing need to provide three-phase loads with smooth sinusoidal voltages of fixed frequency and amplitude. A common solution is the integration of power electronic converters in the systems for connecting distributed generation systems to stand-alone loads. Thus, suitable control techniques in the power electronic converters, providing robust stability, fast response, optimal tracking ability and error eradication, are indispensable. A comprehensive review of the design, analysis and validation of the most suitable digital control techniques, and of the options available to researchers for improving power quality, is presented in this paper, together with their pros and cons. Comparisons based on cost, schemes, performance, modulation techniques and coordinate systems are also presented. Finally, the paper describes the performance evaluation of the control schemes on a voltage source inverter (VSI) and proposes the different aspects to be considered when selecting a power electronics inverter topology, reference frames, filters, as well as a control strategy.

  20. A Simple Density with Distance Based Initial Seed Selection Technique for K Means Algorithm

    Directory of Open Access Journals (Sweden)

    Sajidha Syed Azimuddin

    2017-01-01

    Full Text Available Open issues with respect to the K-means algorithm include identifying the number of clusters, initial seed concept selection, clustering tendency, handling empty clusters, identifying outliers, etc. In this paper we propose a novel and simple technique that considers both the density and the distance of the concepts in a dataset to identify initial seed concepts for clustering. Many authors have proposed different techniques to identify initial seed concepts, but our method ensures that the initial seed concepts are chosen from different clusters that are to be generated by the clustering solution. The hallmark of our algorithm is that it is a single-pass algorithm that does not require any extra parameters to be estimated. Further, our seed concepts are among the actual concepts and not the mean of representative concepts, as is the case in many other algorithms. We have implemented our proposed algorithm and compared the results with the interval-based technique of Fouad Khan. We see that our method outperforms the interval-based method. We have also compared our method with the original random K-means and K-means++ algorithms.
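
    The record does not spell out the exact scoring rule, so the Python sketch below is only one plausible reading of a combined density-and-distance seed selection: local density is a neighbour count within a radius, the densest concept seeds first, and each further seed maximizes density times distance to the seeds already chosen.

      import numpy as np

      def density_distance_seeds(X, k, radius=None):
          """Pick k initial seeds by combining local density with the distance
          to the seeds already chosen (a sketch of the idea, not the paper's rule)."""
          D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances
          if radius is None:
              radius = np.median(D)
          density = (D < radius).sum(axis=1)           # neighbours within the radius
          seeds = [int(np.argmax(density))]            # densest point becomes the first seed
          for _ in range(k - 1):
              dist_to_seeds = D[:, seeds].min(axis=1)  # distance to the nearest chosen seed
              seeds.append(int(np.argmax(density * dist_to_seeds)))
          return X[seeds]

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(c, 0.3, (50, 2)) for c in ([0, 0], [4, 0], [2, 3])])
      print(density_distance_seeds(X, k=3))   # one seed per underlying cluster

    Because points already selected have zero distance to the seed set, their score vanishes and they cannot be picked twice, so the seeds naturally spread across the clusters.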

  1. Does Angling Technique Selectively Target Fishes Based on Their Behavioural Type?

    Directory of Open Access Journals (Sweden)

    Alexander D M Wilson

    Full Text Available Recently, there has been growing recognition that fish harvesting practices can have important impacts on the phenotypic distributions and diversity of natural populations through a phenomenon known as fisheries-induced evolution. Here we experimentally show that two common recreational angling techniques (active crank baits versus passive soft plastics) differentially target wild largemouth bass (Micropterus salmoides) and rock bass (Ambloplites rupestris) based on variation in their behavioural tendencies. Fish were first angled in the wild using both techniques and then brought back to the laboratory and tested for individual-level differences in common estimates of personality (refuge emergence, flight-initiation-distance, and latency-to-recapture with a net) and general activity in an in-lake experimental arena. We found that the different angling techniques appear to selectively target these species based on their boldness (as characterized by refuge emergence, a standard measure of boldness in fishes) but not on other assays of personality. We also observed that body size was independently a significant predictor of personality in both species, though this varied between traits and species. Our results suggest a context-dependency for vulnerability to capture relative to behaviour in these fish species. Ascertaining the selective pressures angling practices exert on natural populations is an important area of fisheries research with significant implications for ecology, evolution, and resource management.

  2. High-spatial-resolution sub-surface imaging using a laser-based acoustic microscopy technique.

    Science.gov (United States)

    Balogun, Oluwaseyi; Cole, Garrett D; Huber, Robert; Chinn, Diane; Murray, Todd W; Spicer, James B

    2011-01-01

    Scanning acoustic microscopy techniques operating at frequencies in the gigahertz range are suitable for the elastic characterization and interior imaging of solid media with micrometer-scale spatial resolution. Acoustic wave propagation at these frequencies is strongly limited by energy losses, particularly from attenuation in the coupling media used to transmit ultrasound to a specimen, leading to a decrease in the depth in a specimen that can be interrogated. In this work, a laser-based acoustic microscopy technique is presented that uses a pulsed laser source for the generation of broadband acoustic waves and an optical interferometer for detection. The use of a 900-ps microchip pulsed laser facilitates the generation of acoustic waves with frequencies extending up to 1 GHz which allows for the resolution of micrometer-scale features in a specimen. Furthermore, the combination of optical generation and detection approaches eliminates the use of an ultrasonic coupling medium, and allows for elastic characterization and interior imaging at penetration depths on the order of several hundred micrometers. Experimental results illustrating the use of the laser-based acoustic microscopy technique for imaging micrometer-scale subsurface geometrical features in a 70-μm-thick single-crystal silicon wafer with a (100) orientation are presented.

  3. Gravity Matching Aided Inertial Navigation Technique Based on Marginal Robust Unscented Kalman Filter

    Directory of Open Access Journals (Sweden)

    Ming Liu

    2015-01-01

    Full Text Available This paper is concerned with gravity matching aided inertial navigation technology using a Kalman filter. The dynamic state space model for the Kalman filter is constructed as follows: the error equation of the inertial navigation system is employed as the process equation, while the local gravity model based on 9-point surface interpolation is employed as the observation equation. The unscented Kalman filter is employed to address the nonlinearity of the observation equation. The filter is refined in two ways. First, the marginalization technique is employed to exploit the conditionally linear substructure and reduce the computational load; specifically, the number of needed sigma points is reduced from 15 to 5 after this technique is used. Second, a robust technique based on a chi-square test is employed to make the filter insensitive to uncertainties in the constructed observation model. Numerical simulation is carried out, and the efficacy of the proposed method is validated by the simulation results.
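
    The chi-square robustification mentioned above can be pictured as an innovation gate on the measurement update; the Python sketch below shows the idea for a plain linear Kalman update (not the paper's marginalized unscented filter), with a made-up two-state model and scalar measurement.

      import numpy as np
      from scipy.stats import chi2

      def gated_update(x, P, z, H, R, alpha=0.01):
          """Kalman measurement update with a chi-square innovation test: if the
          normalized innovation exceeds the threshold, the measurement is treated
          as unreliable and the prediction is kept unchanged."""
          v = z - H @ x                                   # innovation
          S = H @ P @ H.T + R                             # innovation covariance
          if float(v @ np.linalg.solve(S, v)) > chi2.ppf(1 - alpha, df=len(z)):
              return x, P                                 # reject the measurement
          K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
          return x + K @ v, (np.eye(len(x)) - K @ H) @ P

      x, P = np.zeros(2), np.eye(2)
      H, R = np.array([[1.0, 0.0]]), np.array([[0.1]])
      print(gated_update(x, P, np.array([0.2]), H, R))    # consistent measurement: accepted
      print(gated_update(x, P, np.array([9.0]), H, R))    # gross outlier: rejected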

  4. VIDEO DENOISING USING SWITCHING ADAPTIVE DECISION BASED ALGORITHM WITH ROBUST MOTION ESTIMATION TECHNIQUE

    Directory of Open Access Journals (Sweden)

    V. Jayaraj

    2010-08-01

    Full Text Available A non-linear adaptive decision based algorithm with a robust motion estimation technique is proposed for the removal of impulse noise, Gaussian noise and mixed noise (impulse and Gaussian) with edge and fine detail preservation in images and videos. The algorithm includes detection of corrupted pixels and the estimation of values for replacing the corrupted pixels. The main advantage of the proposed algorithm is that an appropriate filter is used for replacing a corrupted pixel based on the estimation of the noise variance present in the filtering window. This leads to reduced blurring and better fine detail preservation even at high mixed noise densities. It performs both spatial and temporal filtering for removal of the noise in the filter window of the videos. The Improved Cross Diamond Search motion estimation technique uses Least Median Square as a cost function, which shows improved performance over other motion estimation techniques with existing cost functions. The results show that the proposed algorithm outperforms the other algorithms from a visual point of view and in terms of Peak Signal to Noise Ratio, Mean Square Error and Image Enhancement Factor.

  5. Three-Dimensional Inverse Transport Solver Based on Compressive Sensing Technique

    Science.gov (United States)

    Cheng, Yuxiong; Wu, Hongchun; Cao, Liangzhi; Zheng, Youqi

    2013-09-01

    According to direct exposure measurements from a flash radiographic image, a compressive sensing-based method for the three-dimensional inverse transport problem is presented. The linear absorption coefficients and interface locations of objects are reconstructed directly at the same time. It is always very expensive to obtain enough measurements. With limited measurements, the compressive sensing sparse reconstruction technique orthogonal matching pursuit is applied to obtain the sparse coefficients by solving an optimization problem. A three-dimensional inverse transport solver is developed based on this compressive sensing technique. There are three features in this solver: (1) AutoCAD is employed as a geometry preprocessor due to its powerful graphics capabilities. (2) The forward projection matrix, rather than a Gauss matrix, is constructed by the visualization tool generator. (3) Fourier transform and Daubechies wavelet transform are adopted to convert an underdetermined system to a well-posed system in the algorithm. Simulations are performed, and the numerical results for a pseudo-sine absorption problem, a two-cube problem and a two-cylinder problem obtained with the compressive sensing-based solver agree well with the reference values.

  6. Using the critical incident technique in community-based participatory research: a case study.

    Science.gov (United States)

    Belkora, Jeffrey; Stupar, Lauren; O'Donnell, Sara

    2011-01-01

    Successful community-based participatory research involves the community partner in every step of the research process. The primary study for this paper took place in rural, Northern California. Collaborative partners included an academic researcher and two community based resource centers that provide supportive services to people diagnosed with cancer. This paper describes our use of the Critical Incident Technique (CIT) to conduct Community-based Participatory Research. We ask: Did the CIT facilitate or impede the active engagement of the community in all steps of the study process? We identified factors about the Critical Incident Technique that were either barriers or facilitators to involving the community partner in every step of the research process. Facilitators included the CIT's ability to accommodate involvement from a large spectrum of the community, its flexible design, and its personal approach. Barriers to community engagement included training required to conduct interviews, depth of interview probes, and time required. Overall, our academic-community partners felt that our use of the CIT facilitated community involvement in our Community-Based Participatory Research Project, where we used it to formally document the forces promoting and inhibiting successful achievement of community aims.

  7. Post-fire debris flow prediction in Western United States: Advancements based on a nonparametric statistical technique

    Science.gov (United States)

    Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.

    2017-12-01

    Fire disasters affect modern societies at the global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, they have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties, etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that usually follow the occurrence of a wildfire, thus magnifying the overall impact on a region. Post-fire debris flows (PFDF) are one such type of hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and over the last years a number of efforts from the United States Geological Survey (USGS) and the National Weather Service (NWS) have been focused on the development of early warning systems that will help mitigate PFDF risk. This work proposes a prediction framework that is based on a nonparametric statistical technique (random forests) that allows predicting the occurrence of PFDF at regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work presented is based on a recently released database from USGS that reports a total of 1500 storms that triggered or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, max intensity, etc.) and other auxiliary information on land surface properties (soil erodibility index, local slope, etc.). Results show that the proposed model is able to achieve a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
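
    As a toy illustration of the nonparametric approach described above, the sketch below trains a random forest classifier on synthetic storm descriptors; the feature set and the trigger rule are invented stand-ins and bear no relation to the USGS database.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 1500   # same order of magnitude as the storm database mentioned above
      # Synthetic stand-ins for storm and land-surface predictors
      # (duration, accumulation, max intensity, soil erodibility, local slope).
      X = rng.random((n, 5))
      # Hypothetical rule: intense storms on steep, erodible terrain trigger debris flows.
      y = (0.6 * X[:, 2] + 0.2 * X[:, 3] + 0.2 * X[:, 4] + rng.normal(0, 0.1, n) > 0.55).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
      print("hold-out accuracy:", round(model.score(X_te, y_te), 3))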

  8. Mathematical Foundation Based Inter-Connectivity modelling of Thermal Image processing technique for Fire Protection

    Directory of Open Access Journals (Sweden)

    Sayantan Nath

    2015-09-01

    Full Text Available In this paper, the integration of multiple image processing functions and their statistical parameters into an intelligent, alarm-series-based fire detection system is presented. An abstract canonical approach introduces the inter-connectivity mapping between imagery processing elements, based on a classification factor for temperature monitoring and a multilevel intelligent alarm sequence. The flow of image processing components between the core of an intelligent alarming system, temperature-wise area segmentation and boundary detection has not yet been fully explored in thermal imaging. From the analytical perspective of convolutive processing in thermal imaging, an abstract-algebra-based inter-mapping model is discussed that links event-calculus-supported DAGSVM classification, used for the step-by-step generation of an alarm series with a gradual monitoring technique, to the segmentation of regions and their affected boundaries in a thermographic image of coal with respect to temperature differences. The connectedness of these multifunctional image processing operations within a compatible fire protection system with a proper monitoring sequence is investigated here. The core contribution of this study is the set of mathematical models, formulated with partial derivatives, that relate the temperature-affected areas in the obtained thermal image to their boundaries. The thermal image of a coal sample is obtained in a real-life scenario with a self-assembled thermographic camera. The combination of area segmentation, boundary detection and the alarm series is described in abstract algebra. The principal objective of this paper is to understand the dependency pattern and working principles of the image processing components and to structure an interconnected modelling technique for these components on a mathematical foundation.

  9. Comparison of two structured illumination techniques based on different 3D illumination patterns

    Science.gov (United States)

    Shabani, H.; Patwary, N.; Doblas, A.; Saavedra, G.; Preza, C.

    2017-02-01

    Manipulating the excitation pattern in optical microscopy has led to several super-resolution techniques. Among different patterns, the lateral sinusoidal excitation was used for the first demonstration of structured illumination microscopy (SIM), which provides the fastest SIM acquisition (based on the number of raw images required) compared to the multi-spot illumination approach. Moreover, 3D patterns that include lateral and axial variations in the illumination have attracted more attention recently, as they address resolution enhancement in three dimensions. A three-wave (3W) interference technique based on coherent illumination has already been shown to provide super-resolution and optical sectioning in 3D-SIM. In this paper, we investigate a novel tunable technique that creates a 3D pattern from a set of multiple incoherently illuminated parallel slits that act as light sources for a Fresnel biprism. This setup is able to modulate the illumination pattern in the object space both axially and laterally with adjustable modulation frequencies. The 3D forward model for the new system is developed here to consider the effect of the axial modulation due to the 3D patterned illumination. The performance of 3D-SIM based on 3W interference and that of the tunable system are investigated in simulation and compared based on two different criteria. First, restored images obtained for both 3D-SIM systems using a generalized Wiener filter are compared to determine the effect of the illumination pattern on the reconstruction. Second, the effective frequency response of both systems is studied to determine the axial and lateral resolution enhancement that is obtained in each case.

  10. The role of model-based methods in the development of single scan techniques

    International Nuclear Information System (INIS)

    Laruelle, Marc

    2000-01-01

    Single scan techniques are highly desirable for clinical trials involving radiotracers because they increase logistical feasibility, improve patient compliance, and decrease the cost associated with the study. However, the information derived from single scans is usually biased by factors unrelated to the process of interest. Therefore, identification of these factors and evaluation of their impact on the proposed outcome measure is important. In this paper, the impact of confounding factors on single scan measurements is illustrated by discussing the effect of between-subject or between-condition differences in radiotracer plasma clearance on normalized activity ratios (specific to nonspecific ratios) in the tissue of interest. Computer simulations based on kinetic analyses are presented to demonstrate this effect. It is proposed that the presence of this and other confounding factors should not necessarily preclude clinical trials based on single scan techniques. First, knowledge of the distribution of plasma clearance values in a sample of the investigated population allows researchers to assign limits to this potential bias. This information can be integrated in the power analysis. Second, the impact of this problem will vary according to the characteristics of the radiotracer, and this information can be used in the development and selection of the radiotracer. Third, simple modification of the experimental design (such as administration of the radiotracer as a bolus followed by constant infusion, rather than as a single bolus) might remove this potential confounding factor and allow appropriate quantification within the limits of a single scanning session. In conclusion, model-based kinetic characterization of radiotracer distribution and uptake is critical to the design and interpretation of clinical trials based on single scan techniques.

  11. A Novel FCC Catalyst Based on a Porous Composite Material Synthesized via an In Situ Technique

    Directory of Open Access Journals (Sweden)

    Shu-Qin Zheng

    2015-11-01

    Full Text Available To overcome diffusion limitations and improve transport in microporous zeolites, materials with a wide-pore structure have been developed. In this paper, composite microspheres with a hierarchical porous structure were synthesized by an in situ technique using sepiolite, kaolin and pseudoboehmite as raw materials. A novel fluid catalytic cracking (FCC) catalyst for maximizing light oil yield was prepared based on the composite materials. The catalyst was characterized by XRD, FT-IR, SEM and nitrogen adsorption-desorption techniques and tested in a bench FCC unit. The results indicated that the catalyst had more meso- and macropores and more acid sites than the reference catalyst, and thus can increase the light oil yield by 1.31%, while exhibiting better gasoline and coke selectivity.

  12. New imaging technique based on diffraction of a focused x-ray beam

    Energy Technology Data Exchange (ETDEWEB)

    Kazimirov, A [Cornell High Energy Synchrotron Source (CHESS), Cornell University, Ithaca, NY 14853 (United States)]; Kohn, V G [Russian Research Center 'Kurchatov Institute', 123182 Moscow (Russian Federation)]; Cai, Z-H [Advanced Photon Source, 9700 S. Cass Avenue, Argonne, IL 60439 (United States)], E-mail: ayk7@cornell.edu

    2009-01-07

    We present first experimental results from a new diffraction depth-sensitive imaging technique. It is based on the diffraction of a focused x-ray beam from a crystalline sample and recording the intensity pattern on a high-resolution CCD detector positioned at a focal plane. Structural non-uniformity inside the sample results in a region of enhanced intensity in the diffraction pattern. The technique was applied to study silicon-on-insulator thin layers of various thicknesses which revealed a complex strain profile within the layers. A circular Fresnel zone plate was used as a focusing optic. Incoherent diffuse scattering spreads out of the diffraction plane and results in intensity recorded outside of the focal spot providing a new approach to separately register x-rays scattered coherently and incoherently from the sample. (fast track communication)

  13. A Suboptimal PTS Algorithm Based on Particle Swarm Optimization Technique for PAPR Reduction in OFDM Systems

    Directory of Open Access Journals (Sweden)

    Ho-Lung Hung

    2008-08-01

    Full Text Available A suboptimal partial transmit sequence (PTS) technique based on a particle swarm optimization (PSO) algorithm is presented for low computational complexity and for the reduction of the peak-to-average power ratio (PAPR) of an orthogonal frequency division multiplexing (OFDM) system. In general, the PTS technique can improve the PAPR statistics of an OFDM system. However, it requires an exhaustive search over all combinations of allowed phase weighting factors, and the search complexity increases exponentially with the number of subblocks. In this paper, we work around this potential computational intractability; the proposed PSO scheme exploits heuristics to search for the optimal combination of phase factors with low complexity. Simulation results show that the new technique can effectively reduce both the computational complexity and the PAPR.
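
    A compact numerical sketch of the PTS mechanism with a PSO search over the phase factors is given below; it uses continuous phase angles and arbitrary swarm parameters, so it only illustrates the principle and is not the authors' suboptimal scheme.

      import numpy as np

      rng = np.random.default_rng(0)

      def papr_db(x):
          return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

      # PTS setup: 64 QPSK subcarriers split into V = 4 interleaved subblocks.
      N, V = 64, 4
      X = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
      subblocks = np.zeros((V, N), dtype=complex)
      for v in range(V):
          subblocks[v, v::V] = X[v::V]
      partial = np.fft.ifft(subblocks, axis=1)        # per-subblock time-domain signals

      def papr_of(theta):
          """PAPR of the PTS candidate built with phase factors exp(j*theta)."""
          return papr_db(np.exp(1j * theta) @ partial)

      # Plain PSO over the V continuous phase angles.
      particles, iters, w, c1, c2 = 20, 50, 0.7, 1.5, 1.5
      pos = rng.uniform(0, 2 * np.pi, (particles, V))
      vel = np.zeros_like(pos)
      pbest, pbest_val = pos.copy(), np.array([papr_of(p) for p in pos])
      gbest = pbest[np.argmin(pbest_val)].copy()
      for _ in range(iters):
          r1, r2 = rng.random((2, particles, V))
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos = (pos + vel) % (2 * np.pi)
          vals = np.array([papr_of(p) for p in pos])
          better = vals < pbest_val
          pbest[better], pbest_val[better] = pos[better], vals[better]
          gbest = pbest[np.argmin(pbest_val)].copy()

      print(f"original PAPR: {papr_db(np.fft.ifft(X)):.2f} dB")
      print(f"PSO-PTS PAPR:  {pbest_val.min():.2f} dB")

    In practice the phase factors are usually restricted to a small finite set and side information about the chosen factors must be conveyed to the receiver, which this sketch ignores.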

  14. A Suboptimal PTS Algorithm Based on Particle Swarm Optimization Technique for PAPR Reduction in OFDM Systems

    Directory of Open Access Journals (Sweden)

    Lee Shu-Hong

    2008-01-01

    Full Text Available A suboptimal partial transmit sequence (PTS) technique based on a particle swarm optimization (PSO) algorithm is presented for low computational complexity and for the reduction of the peak-to-average power ratio (PAPR) of an orthogonal frequency division multiplexing (OFDM) system. In general, the PTS technique can improve the PAPR statistics of an OFDM system. However, it requires an exhaustive search over all combinations of allowed phase weighting factors, and the search complexity increases exponentially with the number of subblocks. In this paper, we work around this potential computational intractability; the proposed PSO scheme exploits heuristics to search for the optimal combination of phase factors with low complexity. Simulation results show that the new technique can effectively reduce both the computational complexity and the PAPR.

  15. On-line nuclear ash gauge for coal based on gamma-ray transmission techniques

    International Nuclear Information System (INIS)

    Rizk, R.A.M.; El-Kateb, A.H.; Abdul-Kader, A.M.

    1999-01-01

    Developments and applications of on-line nuclear gauges in the coal industry are in high demand. A nuclear ash gauge for coal, based on γ-ray transmission techniques, is developed. Single and dual energy γ-ray beams are used to determine the ash content of coal. The percentage ash content as a function of the γ-ray intensities transmitted through coal samples is measured and sensitivity curves are obtained. An empirical formulation relating the ash content values to the γ-ray intensities is derived. Preliminary results show that both single and dual energy γ-ray transmission techniques can be used to give a rapid on-line estimation of the ash concentration in coal with low cost and reasonable accuracy, but the dual energy technique is much preferable. (author)

  16. Virtual Power Plant and Microgrids controller for Energy Management based on optimization techniques

    Directory of Open Access Journals (Sweden)

    Maher G. M. Abdolrasol

    2017-06-01

    Full Text Available This paper discusses a virtual power plant (VPP) and microgrid controller for an energy management system (EMS) based on two optimization techniques, namely the backtracking search algorithm (BSA) and particle swarm optimization (PSO). The research proposes the use of multiple microgrids in the distribution network to aggregate the power from distributed generation into single microgrids, and to let these microgrids deal directly with the central organizer, called the virtual power plant. The VPP's duties are price forecasting, demand forecasting, weather forecasting, production forecasting, load shedding, intelligent decision making, and aggregating and optimizing the data. This large system has been tested and simulated using MATLAB Simulink. The results show that both optimization methods bring significant improvements, but BSA is better than PSO at searching for parameters that yield greater power savings, as shown in the results and discussion.

  17. A hybrid firefly algorithm and pattern search technique for SSSC based power oscillation damping controller design

    Directory of Open Access Journals (Sweden)

    Srikanta Mahapatra

    2014-12-01

    Full Text Available In this paper, a novel hybrid Firefly Algorithm and Pattern Search (h-FAPS) technique is proposed for the design of a Static Synchronous Series Compensator (SSSC)-based power oscillation damping controller. The proposed h-FAPS technique takes advantage of the global search capability of FA and the local search facility of PS. In order to tackle the drawback of using a remote signal that may impact the reliability of the controller, a modified signal equivalent to the remote speed deviation signal is constructed from local measurements. The performances of the proposed controllers are evaluated in SMIB and multi-machine power systems subjected to various transient disturbances. To show the effectiveness and robustness of the proposed design approach, simulation results are presented and compared with some recently published approaches such as Differential Evolution (DE) and Particle Swarm Optimization (PSO). It is observed that the proposed approach yields superior damping performance compared to some recently reported approaches.

  18. High-efficiency photorealistic computer-generated holograms based on the backward ray-tracing technique

    Science.gov (United States)

    Wang, Yuan; Chen, Zhidong; Sang, Xinzhu; Li, Hui; Zhao, Linmin

    2018-03-01

    Holographic displays can provide the complete optical wave field of a three-dimensional (3D) scene, including the depth perception. However, producing traditional computer-generated holograms (CGHs) often takes a long computation time, even without more complex and photorealistic rendering. The backward ray-tracing technique is able to render photorealistic, high-quality images, and the high degree of parallelism it offers noticeably reduces the computation time. Here, a high-efficiency photorealistic computer-generated hologram method is presented based on the backward ray-tracing technique. Rays are launched and traced in parallel under different illuminations and circumstances. Experimental results demonstrate the effectiveness of the proposed method. Compared with the traditional point cloud CGH, the computation time is decreased to 24 s to reconstruct a 3D object of 100 × 100 rays with continuous depth change.

  19. A Novel Technique for Steganography Method Based on Improved Genetic Algorithm Optimization in Spatial Domain

    Directory of Open Access Journals (Sweden)

    M. Soleimanpour-moghadam

    2013-06-01

    Full Text Available This paper is devoted to the study of secret message delivery using a cover image and introduces a novel steganographic technique based on a genetic algorithm to find a near-optimum structure for the pair-wise least-significant-bit (LSB) matching scheme. A survey of the related literature shows that the LSB matching method developed by Mielikainen employs a binary function to reduce the number of changes of LSB values. This method verifiably reduces the probability of detection and also improves the visual quality of stego images. Our proposal therefore draws on Mielikainen's technique to present an enhanced dual-state scoring model, structured upon a genetic algorithm, which assesses the performance of different orders for LSB matching and searches for a near-optimum solution among all the permutation orders. Experimental results confirm the superiority of the new approach compared to Mielikainen's pair-wise LSB matching scheme.

  20. A novel input-parasitic compensation technique for a nanopore-based CMOS DNA detection sensor

    Science.gov (United States)

    Kim, Jungsuk

    2016-12-01

    This paper presents a novel input-parasitic compensation (IPC) technique for a nanopore-based complementary metal-oxide-semiconductor (CMOS) DNA detection sensor. A resistive-feedback transimpedance amplifier is typically adopted as the headstage of a DNA detection sensor to amplify the minute ionic currents generated from a nanopore and convert them to a readable voltage range for digitization. However, parasitic capacitances arising from the headstage input and the nanopore often cause headstage saturation during nanopore sensing, thereby resulting in significant DNA data loss. To compensate for this unwanted saturation, we propose an area-efficient and automated IPC technique, customized for a low-noise DNA detection sensor fabricated using a 0.35-μm CMOS process; we demonstrated this prototype in a benchtop test using an α-hemolysin (α-HL) protein nanopore.

  1. Wilcoxon signed-rank-based technique for the pulse-shape analysis of HPGe detectors

    Science.gov (United States)

    Martín, S.; Quintana, B.; Barrientos, D.

    2016-07-01

    The characterization of the electric response of segmented-contact high-purity germanium detectors requires scanning systems capable of accurately associating each pulse with the position of the interaction that generated it. This process requires an algorithm sensitive to changes above the electronic noise in the pulse shapes produced at different positions, depending on the resolution of the Ge crystal. In this work, a pulse-shape comparison technique based on the Wilcoxon signed-rank test has been developed. It provides a method to distinguish pulses coming from different interaction points in the germanium crystal. Therefore, this technique is a necessary step for building a reliable pulse-shape database that can be used later for the determination of the position of interaction for γ-ray tracking spectrometry devices such as AGATA, GRETA or GERDA. The method was validated by comparison with a χ2 test using simulated and experimental pulses corresponding to a Broad Energy germanium detector (BEGe).
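
    To show the flavour of the comparison, the Python sketch below applies SciPy's Wilcoxon signed-rank test to pairs of toy detector pulses (simple exponential shapes plus noise standing in for real preamplifier signals); a small p-value flags a pulse shape coming from a different interaction position. The pulse model and noise level are assumptions made only for this illustration.

      import numpy as np
      from scipy.stats import wilcoxon

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 200)

      def pulse(rise_time):
          """Toy charge pulse with electronic noise added."""
          return 1 - np.exp(-t / rise_time) + rng.normal(0, 0.01, t.size)

      reference = pulse(0.10)                 # pulse recorded at the scanned position
      candidates = {"same point": pulse(0.10), "other point": pulse(0.15)}

      for label, candidate in candidates.items():
          stat, p_value = wilcoxon(reference, candidate)
          print(f"{label}: Wilcoxon signed-rank p-value = {p_value:.3g}")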

  2. Wilcoxon signed-rank-based technique for the pulse-shape analysis of HPGe detectors

    International Nuclear Information System (INIS)

    Martín, S.; Quintana, B.; Barrientos, D.

    2016-01-01

    The characterization of the electric response of segmented-contact high-purity germanium detectors requires scanning systems capable of accurately associating each pulse with the position of the interaction that generated it. This process requires an algorithm sensitive to changes above the electronic noise in the pulse shapes produced at different positions, depending on the resolution of the Ge crystal. In this work, a pulse-shape comparison technique based on the Wilcoxon signed-rank test has been developed. It provides a method to distinguish pulses coming from different interaction points in the germanium crystal. Therefore, this technique is a necessary step for building a reliable pulse-shape database that can be used later for the determination of the position of interaction for γ-ray tracking spectrometry devices such as AGATA, GRETA or GERDA. The method was validated by comparison with a χ² test using simulated and experimental pulses corresponding to a Broad Energy germanium detector (BEGe).

  3. Design of on-line steam generator leak monitoring system based on Cherenkov counting technique

    International Nuclear Information System (INIS)

    Dileep, B.N.; D'Cruz, S.J.; Biju, P.; Jashi, K.B.; Prabhakaran, V.; Venkataramana, K.; Managanvi, S.S.

    2006-01-01

    The methodology developed by Nuclear Power Corporation of India Ltd. for identification of a leaky Steam Generator (SG) by monitoring ¹³⁴I activity in the blow-down water is a highly sensitive method. However, this technique cannot be put into use as an on-line system. A new method for on-line detection of an SG leak and identification of the offending SG, based on the Cherenkov counting technique, is explained in this paper. It identifies the leak by detecting the Cherenkov radiation produced by hard beta emitting radionuclides that escape into the feed water during a leak in an operating reactor. A simulation shows that a leak rate of 2 kg/h can be detected by the proposed system when the coolant ¹³⁴I activity is 3.7 MBq/l (100 μCi/l). (author)

  4. Detection of Moving Targets Based on Doppler Spectrum Analysis Technique for Passive Coherent Radar

    Directory of Open Access Journals (Sweden)

    Zhao Yao-dong

    2013-06-01

    Full Text Available A novel method for moving target detection using the Doppler spectrum analysis technique for Passive Coherent Radar (PCR) is provided. After dividing the received signals into segments treated as a pulse series, it utilizes pulse compression and Doppler processing to detect and locate the targets. Based on the algorithm for Pulse-Doppler (PD) radar, the equivalence between continuous and pulsed waves in matched filtering is proved and the details of the method are introduced. To compare it with the traditional method of Cross-Ambiguity Function (CAF) calculation, the relationship between them and their mathematical models are analyzed, with some suggestions on parameter choices. With little influence on the target gain, the method can greatly improve the processing efficiency. The validity of the proposed method is demonstrated by offline processing of real collected data sets and by simulation results.
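
    A bare-bones numerical illustration of the segment-and-process idea is sketched below: the reference broadcast signal is cut into pulse-like segments, each surveillance segment is correlated with its reference segment (the "pulse compression" step), and an FFT across the segments extracts the Doppler. The waveform, delay, Doppler and noise level are all invented for the example.

      import numpy as np

      rng = np.random.default_rng(0)
      n_seg, seg_len = 64, 512          # pulse-like segments and samples per segment
      delay, doppler = 37, 0.02         # target delay (samples) and Doppler (cycles/segment)

      # Reference channel: the directly received broadcast signal, cut into segments.
      ref = (rng.standard_normal(n_seg * seg_len) + 1j * rng.standard_normal(n_seg * seg_len)) / np.sqrt(2)
      ref_seg = ref.reshape(n_seg, seg_len)

      # Surveillance channel: delayed echo with a per-segment Doppler phase ramp plus noise.
      echo = np.zeros_like(ref_seg)
      echo[:, delay:] = ref_seg[:, :seg_len - delay]
      echo *= np.exp(2j * np.pi * doppler * np.arange(n_seg))[:, None]
      echo += 0.5 * (rng.standard_normal(echo.shape) + 1j * rng.standard_normal(echo.shape))

      # "Pulse compression" per segment (circular correlation with the reference) ...
      rd = np.array([np.fft.ifft(np.fft.fft(e) * np.conj(np.fft.fft(r)))
                     for e, r in zip(echo, ref_seg)])
      # ... then Doppler processing: FFT across the slow-time (segment) axis.
      rd_map = np.fft.fft(rd, axis=0)
      dop_bin, rng_bin = np.unravel_index(np.argmax(np.abs(rd_map)), rd_map.shape)
      print("detected range bin:", rng_bin, "Doppler bin:", dop_bin,
            "(expected", delay, "and", round(doppler * n_seg), ")")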

  5. A Novel (DDCC-SFG)-Based Systematic Design Technique of Active Filters

    Directory of Open Access Journals (Sweden)

    M. Fakhfakh

    2013-09-01

    Full Text Available This paper proposes a novel idea for the synthesis of active filters that is based on the use of signal-flow graph (SFG) stamps of differential difference current conveyors (DDCCs). On the basis of an RLC passive network or a filter's symbolic transfer function, an equivalent SFG is constructed. DDCCs' SFGs are identified inside the constructed ‘active’ graph, and thus the equivalent circuit can be easily synthesized. We show that the DDCC and its ‘derivatives’, i.e. differential voltage current conveyors and the conventional current conveyors, are the main basic building blocks in such a design. The practicability of the proposed technique is showcased via three application examples. Spice simulations are given to show the viability of the proposed technique.

  6. A Framework for Automatic Web Service Discovery Based on Semantics and NLP Techniques

    Directory of Open Access Journals (Sweden)

    Asma Adala

    2011-01-01

    Full Text Available As a greater number of Web Services are made available today, automatic discovery is recognized as an important task. To promote the automation of service discovery, different semantic languages have been created that allow describing the functionality of services in a machine interpretable form using Semantic Web technologies. The problem is that users do not have intimate knowledge about semantic Web service languages and related toolkits. In this paper, we propose a discovery framework that enables semantic Web service discovery based on keywords written in natural language. We describe a novel approach for automatic discovery of semantic Web services which employs Natural Language Processing techniques to match a user request, expressed in natural language, with a semantic Web service description. Additionally, we present an efficient semantic matching technique to compute the semantic distance between ontological concepts.

  7. Subsurface Scattering-Based Object Rendering Techniques for Real-Time Smartphone Games

    Directory of Open Access Journals (Sweden)

    Won-Sun Lee

    2014-01-01

    Full Text Available Subsurface scattering, which simulates the path of light through the material in a scene, is one of the advanced rendering techniques in the computer graphics community. Since it requires a number of long operations, it cannot be easily implemented in real-time smartphone games. In this paper, we propose a subsurface scattering-based object rendering technique that is optimized for smartphone games. We employ our subsurface scattering method in a real-time smartphone game, and an example game is designed to validate that the proposed method can operate seamlessly in real time. Finally, we show comparison results between the bidirectional reflectance distribution function, the bidirectional scattering distribution function, and our proposed subsurface scattering method on a smartphone game.

  8. Debugging systems-on-chip communication-centric and abstraction-based techniques

    CERN Document Server

    Vermeulen, Bart

    2014-01-01

    This book describes an approach and supporting infrastructure to facilitate debugging the silicon implementation of a System-on-Chip (SOC), allowing its associated product to be introduced into the market more quickly. Readers learn step-by-step the key requirements for debugging a modern, silicon SOC implementation, nine factors that complicate this debugging task, and a new debug approach that addresses these requirements and complicating factors. The authors' novel communication-centric, scan-based, abstraction-based, run/stop-based (CSAR) debug approach is discussed in detail, showing how it helps to meet debug requirements and address the nine previously identified factors that complicate debugging silicon implementations of SOCs. The authors also derive the debug infrastructure requirements to support debugging of a silicon implementation of an SOC with their CSAR debug approach. This debug infrastructure consists of a generic on-chip debug architecture, a configurable automated design-for-debug ...

  9. A scanning tunneling microscopy based potentiometry technique and its application to the local sensing of the spin Hall effect

    Directory of Open Access Journals (Sweden)

    Ting Xie

    2017-12-01

    Full Text Available A scanning tunneling microscopy based potentiometry technique for the measurement of the local surface electric potential is presented. A voltage compensation circuit based on this potentiometry technique is developed and employed to maintain a desired tunneling voltage independent of the bias current flowing through the film. The application of this potentiometry technique to the local sensing of the spin Hall effect is outlined and some experimental results are reported.

  10. Chatter identification in milling of Inconel 625 based on recurrence plot technique and Hilbert vibration decomposition

    Directory of Open Access Journals (Sweden)

    Lajmert Paweł

    2018-01-01

    Full Text Available In this paper, the cutting stability of the milling process of the nickel-based alloy Inconel 625 is analysed. This problem is often considered theoretically, but the theoretical findings do not always agree with experimental results. For this reason, the paper presents different methods for instability identification during the real machining process. A stability lobe diagram is created based on data obtained in an impact test of an end mill. Next, cutting tests were conducted in which the axial depth of cut was gradually increased in order to find the stability limit. Finally, based on the cutting force measurements, the stability estimation problem is investigated using the recurrence plot technique and the Hilbert vibration decomposition method.
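
    A minimal recurrence-plot computation is sketched below on synthetic force-like signals (a single sinusoid versus a two-frequency signal standing in for chatter); the embedding parameters, threshold and signals are arbitrary and are not machining data.

      import numpy as np

      def recurrence_plot(signal, dim=3, delay=2, eps=None):
          """Binary recurrence matrix of a time-delay-embedded signal:
          R[i, j] = 1 when embedded states i and j are closer than eps."""
          n = len(signal) - (dim - 1) * delay
          states = np.column_stack([signal[i * delay: i * delay + n] for i in range(dim)])
          dist = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
          if eps is None:
              eps = 0.1 * dist.max()
          return (dist < eps).astype(int)

      t = np.linspace(0, 10, 500)
      signals = {
          "stable-like ": np.sin(2 * np.pi * 1.0 * t),
          "chatter-like": np.sin(2 * np.pi * 1.0 * t) + 0.8 * np.sin(2 * np.pi * 3.7 * t),
      }
      for name, sig in signals.items():
          print(name, "recurrence rate:", round(recurrence_plot(sig).mean(), 3))

    Quantities derived from such a matrix (recurrence rate, determinism and so on) are what recurrence-based analyses track when the cutting process loses stability.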

  11. Nuclear power plant monitoring and fault diagnosis methods based on the artificial intelligence technique

    International Nuclear Information System (INIS)

    Yoshikawa, S.; Saiki, A.; Ugolini, D.; Ozawa, K.

    1996-01-01

    The main objective of this paper is to develop an advanced diagnosis system based on artificial intelligence techniques to monitor the operation and to improve the operational safety of nuclear power plants. Three different methods have been elaborated in this study: an artificial neural network local diagnosis (NNds) scheme that, acting at the component level, discriminates between normal and abnormal transients; a model-based diagnostic reasoning mechanism that exploits a physical causal network; and a model-based knowledge compiler (KC) that generates applicable diagnostic rules from widely accepted physical knowledge. Although the three methods have been developed and verified independently, they are highly correlated and, when connected together, form an effective and robust diagnosis and monitoring tool. (authors)

  12. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines.

    Science.gov (United States)

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J; Raboso, Mariano

    2015-06-17

    Drawing on the results of an acoustic biometric system based on an MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them based on a Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering; segmentation, based on a Gaussian Mixture Model (GMM), to separate the person from the background; masking, to reduce the dimensions of the images; and binarization, to reduce the size of each image. An analysis of the classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.
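
    A toy end-to-end sketch of the classify-after-preprocessing idea is given below, with binarization standing in for the full preprocessing chain and synthetic "acoustic images" for two enrolled persons; none of the dimensions, thresholds or data come from the paper.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import Binarizer
      from sklearn.svm import LinearSVC

      rng = np.random.default_rng(0)
      # Synthetic stand-ins for 16x16 acoustic images of two people: each person
      # has a fixed spatial template plus acquisition noise.
      templates = rng.random((2, 256))
      X = np.vstack([t + rng.normal(0, 0.3, (100, 256)) for t in templates])
      y = np.repeat([0, 1], 100)

      # Binarization as a stand-in for the preprocessing chain, then a linear SVM.
      model = make_pipeline(Binarizer(threshold=0.5), LinearSVC())
      print("cross-validated accuracy:", round(cross_val_score(model, X, y, cv=5).mean(), 3))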

  13. THE APPLICATION OF GRAPHOLOGY AND ENNEAGRAM TECHNIQUES IN DETERMINING PERSONALITY TYPE BASED ON HANDWRITING FEATURES

    Directory of Open Access Journals (Sweden)

    Dian Pratiwi

    2017-02-01

    Full Text Available This research was conducted with the aim of extending previous studies that successfully applied the science of graphology to analyze digital handwriting and the writer's personality characteristics through shape-based feature extraction. In the present study, this is combined with the Enneagram, a psychological test commonly used by psychologists to recognize human personality. The Enneagram method, in principle, classifies a person's personality traits into nine types through a series of questions, from which the overall weight of the answers is calculated. This weight indicates a personality type, which is then matched against the personality type resulting from the graphology analysis of the handwriting. The personality type from the handwriting analysis is derived from the personality traits identified from a combination of four dominant handwriting features extracted by the software output of the previous studies: slant (writing tilt), size (letter size), baseline, and breaks (the spacing between words). The results of this research show a correlation between the psychology-based personality analysis and the graphology-based analysis, with personality types matching for 81.6% of the 49 respondents who were successfully tested.

  14. The influence of polishing techniques on pre-polymerized CAD\\CAM acrylic resin denture bases.

    Science.gov (United States)

    Alammari, Manal Rahma

    2017-10-01

    Lately, computer-aided design and computer-aided manufacturing (CAD/CAM) has been broadly and successfully employed in dentistry. CAD/CAM systems have recently become commercially available for the fabrication of complete dentures and are considered an alternative to conventionally processed acrylic resin bases. However, they have not yet been fully investigated. The purpose of this study was to inspect the effects of mechanical polishing and chemical polishing on the surface roughness (Ra) and contact angle (wettability) of heat-cured, auto-cured and CAD/CAM denture base acrylic resins. This study was conducted at the Advanced Dental Research Laboratory Center of King Abdulaziz University from March to June 2017. Three denture base materials were selected: heat-cured polymethylmethacrylate resin, thermoplastic (polyamide) resin and CAD\CAM denture base resin. Sixty specimens were prepared and divided into three groups of twenty. Each group was divided according to the polishing technique into mechanical polishing (Mech P) and chemical polishing (Chem P), ten specimens in each; surface roughness and wettability were investigated. Data were analyzed with SPSS version 22, using one-way ANOVA and the Pearson coefficient. One-way analysis of variance (ANOVA) and post hoc tests used for comparing the surface roughness values between the three groups revealed a statistically significant difference between them. The CAD\CAM denture base material (group III) showed the lowest mean values (1.08±0.23 and 1.39±0.31 for Mech P and Chem P, respectively). CAD/CAM also showed the lowest contact angle with both polishing methods, statistically significant at the 5% level (p=0.034). Mechanical polishing produced a CAD\CAM denture base resin with a smoother surface than chemical polishing and is considered the most effective polishing technique. CAD/CAM denture base material should be considered as the material of choice for complete denture construction in the near future, especially for older dental patients.

  15. Tsunami Hazard Preventing Based Land Use Planning Model Using GIS Techniques in Muang Krabi, Thailand

    Directory of Open Access Journals (Sweden)

    Abdul Salam Soomro

    2012-10-01

    Full Text Available The terrible tsunami disaster of 26 December 2004 hit Krabi, one of the ecotourist and most fascinating provinces of southern Thailand, as well as other regions such as Phang Nga and Phuket, devastating human lives, coastal communities and economically viable activities. This research study aimed to generate a tsunami-hazard-prevention-based land use planning model using GIS (Geographical Information Systems), based on a hazard suitability analysis approach. Different triggering factors, e.g. elevation, proximity to the shoreline, population density, mangrove, forest, stream and road, have been used according to the land use zoning criteria. These criteria have been weighted using the Saaty scale of importance, one of the mathematical techniques. The model has been classified according to the land suitability classification. Various GIS techniques, namely subsetting, spatial analysis, map difference and data conversion, have been used. The model has been generated with five categories, namely high, moderate, low, very low and not suitable regions, illustrated with their appropriate definitions for decision makers to redevelop the region.
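
    The weighting and overlay step described above can be sketched as follows, assuming a Saaty-style pairwise comparison matrix and factor rasters already scaled to [0, 1]; the comparison values, class breaks and class ordering are placeholders, not those of the study.

```python
# Hedged sketch: Saaty pairwise comparisons -> factor weights (principal
# eigenvector), then a weighted overlay binned into five suitability classes.
import numpy as np

def saaty_weights(pairwise):
    """pairwise: (n, n) reciprocal comparison matrix on the 1-9 Saaty scale."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])   # principal eigenvector
    return w / w.sum()

def suitability(factors, weights, breaks=(0.2, 0.4, 0.6, 0.8)):
    """factors: list of rasters scaled to [0, 1]; returns a class map 1..5
    (1 = not suitable ... 5 = high), an assumed ordering for illustration."""
    score = sum(w * f for w, f in zip(weights, factors))
    return np.digitize(score, breaks) + 1
```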

  16. Satellite Angular Velocity Estimation Based on Star Images and Optical Flow Techniques

    Directory of Open Access Journals (Sweden)

    Giancarmine Fasano

    2013-09-01

    Full Text Available An optical flow-based technique is proposed to estimate spacecraft angular velocity based on sequences of star-field images. It does not require star identification and can thus also be used to deliver angular rate information when attitude determination is not possible, as during platform de-tumbling or slewing. Region-based optical flow calculation is carried out on successive star images preprocessed to remove the background. Sensor calibration parameters, the Poisson equation, and a least-squares method are then used to estimate the angular velocity vector components in the sensor rotating frame. A theoretical error budget is developed to estimate the expected angular rate accuracy as a function of camera parameters and star distribution in the field of view. The effectiveness of the proposed technique is tested by using star-field scenes generated by a hardware-in-the-loop testing facility and acquired by a commercial-off-the-shelf camera sensor. Simulated cases comprise rotations at different rates. Experimental results are presented which are consistent with theoretical estimates. In particular, very accurate angular velocity estimates are generated at lower slew rates, while in all cases the achievable accuracy in the estimation of the angular velocity component along the boresight is about one order of magnitude worse than that of the other two components.
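
    The least-squares step can be illustrated as below, assuming a standard pinhole camera model and a rotation-only optical flow field; the focal length, sign conventions and unit frame interval are assumptions rather than the paper's calibration.

```python
# Minimal sketch of the least-squares angular-rate estimate from star positions
# and their optical-flow displacements (pinhole model, rotation-only motion).
import numpy as np

def estimate_angular_velocity(points, flow, f):
    """points: (N, 2) star positions (x, y) in pixels; flow: (N, 2) displacements
    over one frame interval; f: focal length in pixels. Returns omega (3,)."""
    A_rows, b = [], []
    for (x, y), (u, v) in zip(points, flow):
        # Rotation-only image velocity of a distant point:
        #   u = (x*y/f)*wx - (f + x*x/f)*wy + y*wz
        #   v = (f + y*y/f)*wx - (x*y/f)*wy - x*wz
        A_rows.append([x * y / f, -(f + x * x / f),  y])
        A_rows.append([f + y * y / f, -(x * y / f), -x])
        b.extend([u, v])
    A, b = np.asarray(A_rows), np.asarray(b)
    omega, *_ = np.linalg.lstsq(A, b, rcond=None)
    return omega  # rad per frame interval, in the sensor frame
```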

  17. Atlas-based segmentation technique incorporating inter-observer delineation uncertainty for whole breast

    International Nuclear Information System (INIS)

    Bell, L R; Pogson, E M; Metcalfe, P; Holloway, L; Dowling, J A

    2017-01-01

    Accurate, efficient auto-segmentation methods are essential for the clinical efficacy of adaptive radiotherapy delivered with highly conformal techniques. Current atlas-based auto-segmentation techniques are adequate in this respect; however, they fail to account for inter-observer variation. An atlas-based segmentation method that incorporates inter-observer variation is proposed. This method is validated for a whole breast radiotherapy cohort containing 28 CT datasets with CTVs delineated by eight observers. To optimise atlas accuracy, the cohort was divided into categories by mean body mass index and laterality, with atlases generated for each in a leave-one-out approach. Observer CTVs were merged and thresholded to generate an auto-segmentation model representing both inter-observer and inter-patient differences. For each category, the atlas was registered to the left-out dataset to enable propagation of the auto-segmentation from atlas space. Auto-segmentation time was recorded. The segmentation was compared to the gold-standard contour using the dice similarity coefficient (DSC) and mean absolute surface distance (MASD). Comparison with the smallest and largest CTV was also made. This atlas-based auto-segmentation method incorporating inter-observer variation was shown to be efficient (<4 min) and accurate for whole breast radiotherapy, with good agreement (DSC >0.7, MASD <9.3 mm) between the auto-segmented contours and CTV volumes. (paper)
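
    For reference, the two agreement metrics quoted above can be computed as in the following sketch, which assumes binary masks on a common grid and uses a simple symmetric surface-distance definition for MASD; the exact definition used in the study may differ.

```python
# Illustrative DSC and MASD between two binary masks (2-D or 3-D arrays).
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice(a, b):
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def masd(a, b, spacing=1.0):
    """Mean absolute (symmetric) surface distance, in the units of `spacing`."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    surf_a = a & ~binary_erosion(a)
    surf_b = b & ~binary_erosion(b)
    d_to_b = distance_transform_edt(~surf_b, sampling=spacing)  # distance to b's surface
    d_to_a = distance_transform_edt(~surf_a, sampling=spacing)  # distance to a's surface
    return 0.5 * (d_to_b[surf_a].mean() + d_to_a[surf_b].mean())
```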

  18. IMAGE-BASED MODELING TECHNIQUES FOR ARCHITECTURAL HERITAGE 3D DIGITALIZATION: LIMITS AND POTENTIALITIES

    Directory of Open Access Journals (Sweden)

    C. Santagati

    2013-07-01

    Full Text Available 3D reconstruction from images has undergone a revolution in the last few years. Computer vision techniques use photographs from data set collections to rapidly build detailed 3D models. The simultaneous application of different algorithms (MVS), the different techniques of image matching, feature extraction and mesh optimization is an active field of research in computer vision. The results are promising: the obtained models are beginning to challenge the precision of laser-based reconstructions. Among all the possibilities we can mainly distinguish desktop and web-based packages. The latter offer the opportunity to exploit the power of cloud computing in order to carry out semi-automatic data processing, thus allowing users to fulfill other tasks on their computers, whereas desktop systems require long processing times and heavier computational approaches. Computer vision researchers have explored many applications to verify the visual accuracy of 3D models, but the approaches to verify metric accuracy are few, and none addresses Autodesk 123D Catch applied to Architectural Heritage documentation. Our approach to this challenging problem is to compare 3D models by Autodesk 123D Catch and 3D models by terrestrial LIDAR, considering different object sizes, from details (capitals, moldings, bases) to large-scale buildings, for practitioner purposes.

  19. Tsunami hazard preventing based land use planning model using GIS technique in Muang Krabi, Thailand

    International Nuclear Information System (INIS)

    Soormo, A.S.

    2012-01-01

    The terrible tsunami disaster of 26 December 2004 hit Krabi, one of the ecotourist and most fascinating provinces of southern Thailand, as well as other regions such as Phang Nga and Phuket, devastating human lives, coastal communities and economically viable activities. This research study aimed to generate a tsunami-hazard-prevention-based land use planning model using GIS (Geographical Information Systems), based on a hazard suitability analysis approach. Different triggering factors, e.g. elevation, proximity to the shoreline, population density, mangrove, forest, stream and road, have been used according to the land use zoning criteria. These criteria have been weighted using the Saaty scale of importance, one of the mathematical techniques. The model has been classified according to the land suitability classification. Various GIS techniques, namely subsetting, spatial analysis, map difference and data conversion, have been used. The model has been generated with five categories, namely high, moderate, low, very low and not suitable regions, illustrated with their appropriate definitions for decision makers to redevelop the region. (author)

  20. Fabrication of superconducting MgB2 nanostructures by an electron beam lithography-based technique

    Science.gov (United States)

    Portesi, C.; Borini, S.; Amato, G.; Monticone, E.

    2006-03-01

    In this work, we present the results obtained in the fabrication and characterization of magnesium diboride nanowires realized by an electron beam lithography (EBL)-based method. For fabricating the MgB2 thin films, an all-in-situ technique has been used, based on the coevaporation of B and Mg by means of an e-gun and a resistive heater, respectively. Since the high temperatures required for the fabrication of good-quality MgB2 thin films do not allow a nanostructuring approach based on the lift-off technique, we structured the samples by combining EBL, optical lithography, and Ar milling. In this way, reproducible nanowires 1 μm long have been obtained. To illustrate the impact of the MgB2 film processing on its superconducting properties, we measured the temperature dependence of the resistance of a nanowire and compared it with that of the original magnesium diboride film. The electrical properties of the films are not degraded as a consequence of the nanostructuring process, so superconducting nanodevices may be obtained by this method.

  1. A Review and Performance Investigation of NPCC Based UPQC by Using Various Artificial Intelligence Techniques

    Directory of Open Access Journals (Sweden)

    Venkata Rami Reddy K

    2017-03-01

    Full Text Available This paper presents a comprehensive review and performance investigation of a Neutral Point Clamped Converter (NPCC) based Unified Power Quality Conditioner (UPQC) using Artificial Intelligence (AI) techniques. A novel application of various levels of Diode Clamped Multi-Level Inverters (DCMLI) with the Anti Phase Opposition and Disposition (APOD) Pulse Width Modulation (PWM) scheme to the Unified Power Quality Conditioner (UPQC) is presented. The power quality problem has been a burning issue since the beginning of high-voltage AC transmission systems. Hence, this article discusses the mitigation of PQ issues in high-voltage AC systems through a three-phase four-wire Unified Power Quality Conditioner (UPQC) under non-linear loads. The emphasized PQ problems, such as voltage and current harmonics along with voltage sags and swells, are also discussed with improved performance. The paper also proposes to control the DCMLI-based UPQC through conventional control schemes. The application of these control techniques brings the system performance on par with the standards; it is also compared with the existing system. Simulation results based on MATLAB/Simulink are discussed in detail to support the concept developed in the paper.

  2. Critical assessment of the deposition based dosimetric technique for radon/thoron decay products

    International Nuclear Information System (INIS)

    Mayya, Y.S.

    2010-01-01

    Inhalation doses due to radon (222Rn) and thoron (220Rn) are predominantly contributed by their decay products and not by the gases themselves. Decay product measurements are carried out essentially either by short-term active measurements, such as air sampling on a substrate followed by alpha or beta counting, or by continuous active monitoring techniques based on silicon barrier detectors. However, due to the non-availability of satisfactory passive measurement techniques for the progeny species, it has been usual practice to estimate the long-time-averaged progeny concentration from the measured gas concentration using an assumed equilibrium factor. To be accurate, one is required to measure the equilibrium factor in situ along with the gas concentration. This being impractical, the assigned equilibrium factor approach (0.4 for indoor and 0.8 for outdoor 222Rn) has been an inevitable, though uncertain, part of dosimetric strategies in both occupational and public domains. In the case of thoron decay products, however, the equilibrium factor is of far more questionable validity. Thus, there is a need to shift from the gas-based dosimetric paradigm to one based on direct detection of the progeny species.

  3. Image-Based Modeling Techniques for Architectural Heritage 3d Digitalization: Limits and Potentialities

    Science.gov (United States)

    Santagati, C.; Inzerillo, L.; Di Paola, F.

    2013-07-01

    3D reconstruction from images has undergone a revolution in the last few years. Computer vision techniques use photographs from data set collections to rapidly build detailed 3D models. The simultaneous application of different algorithms (MVS), the different techniques of image matching, feature extraction and mesh optimization is an active field of research in computer vision. The results are promising: the obtained models are beginning to challenge the precision of laser-based reconstructions. Among all the possibilities we can mainly distinguish desktop and web-based packages. The latter offer the opportunity to exploit the power of cloud computing in order to carry out semi-automatic data processing, thus allowing users to fulfill other tasks on their computers, whereas desktop systems require long processing times and heavier computational approaches. Computer vision researchers have explored many applications to verify the visual accuracy of 3D models, but the approaches to verify metric accuracy are few, and none addresses Autodesk 123D Catch applied to Architectural Heritage Documentation. Our approach to this challenging problem is to compare 3D models by Autodesk 123D Catch and 3D models by terrestrial LIDAR, considering different object sizes, from details (capitals, moldings, bases) to large-scale buildings, for practitioner purposes.

  4. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    Science.gov (United States)

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical Abstract This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry fragmentation techniques, and data analysis.

  5. Commutative discrete filtering on unstructured grids based on least-squares techniques

    International Nuclear Information System (INIS)

    Haselbacher, Andreas; Vasilyev, Oleg V.

    2003-01-01

    The present work is concerned with the development of commutative discrete filters for unstructured grids and contains two main contributions. First, building on the work of Marsden et al. [J. Comp. Phys. 175 (2002) 584], a new commutative discrete filter based on least-squares techniques is constructed. Second, a new analysis of the discrete commutation error is carried out. The analysis indicates that the discrete commutation error is not only dependent on the number of vanishing moments of the filter weights, but also on the order of accuracy of the discrete gradient operator. The results of the analysis are confirmed by grid-refinement studies
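
    The flavour of such a construction can be illustrated with a small least-squares sketch: filter weights are chosen to satisfy moment constraints (weights summing to one, first moments vanishing) while minimising the weight norm. The constraint set and objective are illustrative assumptions, not the filters derived in the paper.

```python
# Hedged sketch: discrete filter weights on an unstructured neighbourhood,
# solved as an equality-constrained least-squares (KKT) problem.
import numpy as np

def filter_weights(offsets):
    """offsets: (N, d) neighbour coordinates relative to the filter centre.
    Returns weights w (N,) with sum(w) = 1 and sum(w_i * offset_i) = 0."""
    offsets = np.asarray(offsets, dtype=float)
    N, d = offsets.shape
    C = np.vstack([np.ones(N), offsets.T])          # (d+1, N) constraint matrix
    rhs = np.zeros(d + 1); rhs[0] = 1.0
    # Minimise ||w||^2 subject to C w = rhs via the KKT system.
    KKT = np.block([[2 * np.eye(N), C.T],
                    [C, np.zeros((d + 1, d + 1))]])
    sol = np.linalg.solve(KKT, np.concatenate([np.zeros(N), rhs]))
    return sol[:N]
```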

  6. Studies of industrial emissions by accelerator-based techniques: A review of applications at CEDAD

    Science.gov (United States)

    Calcagnile, L.; Quarta, G.

    2012-04-01

    Different research activities are in progress at the Centre for Dating and Diagnostics (CEDAD), University of Salento, in the field of environmental monitoring by exploiting the potentialities given by the different experimental beam lines implemented on the 3 MV Tandetron accelerator and dedicated to AMS (Accelerator Mass Spectrometry) radiocarbon dating and IBA (Ion Beam Analysis). An overview of these activities is presented by showing how accelerator-based analytical techniques can be a powerful tool for monitoring the anthropogenic carbon dioxide emissions from industrial sources and for the assessment of the biogenic content in SRF (Solid Recovered Fuel) burned in WTE (Waste to Energy) plants.

  7. Studies of industrial emissions by accelerator-based techniques: A review of applications at CEDAD

    Directory of Open Access Journals (Sweden)

    Calcagnile L.

    2012-04-01

    Full Text Available Different research activities are in progress at the Centre for Dating and Diagnostics (CEDAD), University of Salento, in the field of environmental monitoring by exploiting the potentialities given by the different experimental beam lines implemented on the 3 MV Tandetron accelerator and dedicated to AMS (Accelerator Mass Spectrometry) radiocarbon dating and IBA (Ion Beam Analysis). An overview of these activities is presented by showing how accelerator-based analytical techniques can be a powerful tool for monitoring the anthropogenic carbon dioxide emissions from industrial sources and for the assessment of the biogenic content in SRF (Solid Recovered Fuel) burned in WTE (Waste to Energy) plants.

  8. “In vitro” Implantation Technique Based on 3D Printed Prosthetic Prototypes

    Science.gov (United States)

    Tarnita, D.; Boborelu, C.; Geonea, I.; Malciu, R.; Grigorie, L.; Tarnita, D. N.

    2018-06-01

    In this paper, the Rapid Prototyping ZCorp 310 system, based on a high-performance composite powder and a high-strength resin infiltration system, and three-dimensional printing as a manufacturing method are used to obtain physical prototypes of orthopaedic implants and prototypes of complex functional prosthetic systems directly from 3D CAD data. These prototypes are useful for in vitro experimental tests and measurements to optimize and obtain final physical prototypes. Using a new elbow prosthesis model prototype obtained by 3D printing, the surgical implantation technique was established. Surgical implantation was performed on a male cadaver elbow joint.

  9. A three-stage strategy for optimal price offering by a retailer based on clustering techniques

    International Nuclear Information System (INIS)

    Mahmoudi-Kohan, N.; Shayesteh, E.; Moghaddam, M. Parsa; Sheikh-El-Eslami, M.K.

    2010-01-01

    In this paper, an innovative strategy for optimal price offering to customers for maximizing the profit of a retailer is proposed. This strategy is based on load profile clustering techniques and includes three stages. For the purpose of clustering, an improved weighted fuzzy average K-means is proposed. Also, in this paper a new acceptance function for increasing the profit of the retailer is proposed. The new method is evaluated by implementation on a group of 300 customers of a 20 kV distribution network. (author)
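
    The clustering stage can be sketched as below with a plain k-means on normalised daily load shapes; this is a stand-in for the paper's improved weighted fuzzy average K-means, whose details are not reproduced here, and the cluster count is arbitrary.

```python
# Hedged sketch of the load-profile clustering stage only.
import numpy as np
from sklearn.cluster import KMeans

def cluster_load_profiles(profiles, n_clusters=5, random_state=0):
    """profiles: (n_customers, 24) array of hourly demand. Returns labels and
    the cluster-centre load shapes used to target price offers per group."""
    profiles = np.asarray(profiles, dtype=float)
    # Normalise each customer's profile so clustering reflects shape, not size.
    shapes = profiles / profiles.sum(axis=1, keepdims=True)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state)
    labels = km.fit_predict(shapes)
    return labels, km.cluster_centers_
```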

  10. SEU mitigation technique by Dynamic Reconfiguration method in FPGA based DSP application

    International Nuclear Information System (INIS)

    Dey, Madhusudan; Singh, Abhishek; Roy, Amitava

    2012-01-01

    The Field Programmable Gate Array (FPGA), an SRAM-based configurable device meant for the implementation of any digital circuit, is susceptible to malfunction in harsh radiation environments. Radiation corrupts the configuration memory of the FPGA and the digital circuits start malfunctioning, so there is a need to restore the system as early as possible. This paper discusses one such technique, named the dynamic partial reconfiguration (DPR) method, and also touches upon signal processing by the DPR method. A framework consisting of ADC, DAC and ICAP controllers designed using dedicated state machines is used to study the best possible downtime and to verify the performance of digital filters for signal processing.

  11. A novel wavelet neural network based pathological stage detection technique for an oral precancerous condition

    Science.gov (United States)

    Paul, R R; Mukherjee, A; Dutta, P K; Banerjee, S; Pal, M; Chatterjee, J; Chaudhuri, K; Mukkerjee, K

    2005-01-01

    Aim: To describe a novel neural network based oral precancer (oral submucous fibrosis; OSF) stage detection method. Method: The wavelet coefficients of transmission electron microscopy images of collagen fibres from normal oral submucosa and OSF tissues were used to choose the feature vector which, in turn, was used to train the artificial neural network. Results: The trained network was able to classify normal and oral precancer stages (less advanced and advanced) after obtaining the image as an input. Conclusions: The results obtained from this proposed technique were promising and suggest that with further optimisation this method could be used to detect and stage OSF, and could be adapted for other conditions. PMID:16126873

  12. Nanomaterials-Based Optical Techniques for the Detection of Acetylcholinesterase and Pesticides

    Directory of Open Access Journals (Sweden)

    Ning Xia

    2014-12-01

    Full Text Available The large amount of pesticide residues in the environment is a threat to global health through inhibition of acetylcholinesterase (AChE). Biosensors based on the inhibition of AChE have thus been developed for the detection of pesticides. In line with the rapid development of nanotechnology, nanomaterials have attracted great attention and have been intensively studied in biological analysis due to their unique chemical, physical and size properties. The aim of this review is to provide insight into nanomaterial-based optical techniques for the determination of AChE and pesticides, including colorimetric and fluorescent assays and surface plasmon resonance.

  13. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    Directory of Open Access Journals (Sweden)

    Hong Zhang

    2013-01-01

    Full Text Available With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activity, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation towards the performance of human activity recognition.

  14. Analysis on the Metrics used in Optimizing Electronic Business based on Learning Techniques

    Directory of Open Access Journals (Sweden)

    Irina-Steliana STAN

    2014-09-01

    Full Text Available The present paper proposes a methodology for analyzing the metrics related to electronic business. Drafts of the optimizing models include KPIs that can highlight the business specifics, provided they are integrated using learning-based techniques. Once the most important, high-impact elements of the business have been set, the models should ultimately capture the links between them by automating business flows. Human resources will increasingly collaborate with the optimizing models, which translates into high-quality decisions followed by an increase in profitability.

  15. Electroluminescence Analysis by Tilt Polish Technique of InP-Based Semiconductor Lasers

    Science.gov (United States)

    Ichikawa, Hiroyuki; Sasaki, Kouichi; Hamada, Kotaro; Yamaguchi, Akira

    2010-03-01

    We developed an effective electroluminescence (EL) analysis method to specify the degraded region of InP-based semiconductor lasers. The EL analysis method is one of the most important methods for failure analysis. However, EL observation was difficult because opaque electrodes surround an active layer. A portion of each electrode had to be left intact for wiring to inject the current. Thus, we developed a partial polish technique for the bottom electrode. Tilt polish equipment with a rotating table was introduced; a flat polished surface and a sufficiently wide remaining portion of the bottom electrode were obtained. As a result, clear EL from the back surface of the laser was observed.

  16. A three-stage strategy for optimal price offering by a retailer based on clustering techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mahmoudi-Kohan, N.; Shayesteh, E. [Islamic Azad University (Garmsar Branch), Garmsar (Iran); Moghaddam, M. Parsa; Sheikh-El-Eslami, M.K. [Tarbiat Modares University, Tehran (Iran)

    2010-12-15

    In this paper, an innovative strategy for optimal price offering to customers for maximizing the profit of a retailer is proposed. This strategy is based on load profile clustering techniques and includes three stages. For the purpose of clustering, an improved weighted fuzzy average K-means is proposed. Also, in this paper a new acceptance function for increasing the profit of the retailer is proposed. The new method is evaluated by implementation on a group of 300 customers of a 20 kV distribution network. (author)

  17. Graph-based Techniques for Topic Classification of Tweets in Spanish

    Directory of Open Access Journals (Sweden)

    Hector Cordobés

    2014-03-01

    Full Text Available Topic classification of texts is one of the most interesting challenges in Natural Language Processing (NLP). Topic classifiers commonly use a bag-of-words approach, in which the classifier uses (and is trained with) selected terms from the input texts. In this work we present techniques based on graph similarity to classify short texts by topic. In our classifier we build graphs from the input texts, and then use properties of these graphs to classify them. We have tested the resulting algorithm by classifying Twitter messages in Spanish among a predefined set of topics, achieving more than 70% accuracy.
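
    A minimal sketch of the graph-similarity idea follows: each text becomes a graph of consecutive-word edges, and a message is assigned the topic whose training graph is most similar. The graph construction and the Jaccard similarity used here are illustrative assumptions, not the authors' exact graph properties.

```python
# Hedged sketch of graph-based topic assignment for short texts.
def word_graph(text):
    """Build a set of directed consecutive-word edges from a text."""
    words = text.lower().split()
    return {(a, b) for a, b in zip(words, words[1:])}

def classify(text, topic_graphs):
    """topic_graphs: dict mapping topic -> set of edges built from training texts."""
    g = word_graph(text)
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0
    return max(topic_graphs, key=lambda t: jaccard(g, topic_graphs[t]))
```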

  18. A microcontroller-based compensated optical proximity detector employing the switching-mode synchronous detection technique

    International Nuclear Information System (INIS)

    Rakshit, Anjan; Chatterjee, Amitava

    2012-01-01

    This paper describes the development of a microcontroller-based optical proximity detector that can provide a low-cost yet powerful obstacle-sensing mechanism for mobile robots. The system is developed with the switching-mode synchronous detection technique to provide satisfactory performance over a wide range of operating conditions, with the facility to externally set a threshold for reliable operation. The system is dynamically compensated against ambient illumination variations. Experimental studies demonstrate how the minimum distance of activation can be varied with different choices of thresholds. (paper)

  19. Structure and properties of composite iron-based coatings obtained by the electromechanical technique

    Science.gov (United States)

    Dubinskii, N. A.

    2007-09-01

    The influence of the electrolyte temperature and current density on the content of powder particle inclusions in composite coatings obtained by the electrochemical technique has been investigated. It has been found that the wear resistance of iron coatings with inclusions of powder particles of aluminum, kaolin, and calcium silicate increases 5 to 10 times compared to coatings without inclusions of disperse particles, while the friction coefficient decreases from 0.097 to 0.026. It has been shown that the mechanical properties of iron obtained by electrochemical deposition depend on its fine structure. The regimes of deposition of iron-based coatings have been optimized.

  20. Optimization of a Fuzzy-Logic-Control-Based MPPT Algorithm Using the Particle Swarm Optimization Technique

    Directory of Open Access Journals (Sweden)

    Po-Chen Cheng

    2015-06-01

    Full Text Available In this paper, an asymmetrical fuzzy-logic-control (FLC) based maximum power point tracking (MPPT) algorithm for photovoltaic (PV) systems is presented. Two membership function (MF) design methodologies that can improve the effectiveness of the proposed asymmetrical FLC-based MPPT methods are then proposed. The first method can quickly determine the input MF setting values via the power–voltage (P–V) curve of solar cells under standard test conditions (STC). The second method uses the particle swarm optimization (PSO) technique to optimize the input MF setting values. Because the PSO approach must target and optimize a cost function, a cost function design methodology that meets the performance requirements of practical photovoltaic generation systems (PGSs) is also proposed. According to the simulated and experimental results, the proposed asymmetrical FLC-based MPPT method has the highest fitness value; therefore, it can successfully address the tracking speed/tracking accuracy dilemma compared with the traditional perturb and observe (P&O) and symmetrical FLC-based MPPT algorithms. Compared to the conventional FLC-based MPPT method, the obtained optimal asymmetrical FLC-based MPPT can improve the transient time and the MPPT tracking accuracy by 25.8% and 0.98% under STC, respectively.
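
    A compact particle swarm optimisation loop of the kind used to tune the membership-function setting values is sketched below; the cost function, bounds and PSO hyper-parameters are placeholders supplied by the user, not the values from the paper.

```python
# Hedged PSO sketch: minimise a user-supplied cost over bounded parameters.
import numpy as np

def pso(cost, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T        # bounds: list of (lo, hi)
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))  # positions (e.g. MF settings)
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()
```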

  1. Performance Evaluation of Frequency Transform Based Block Classification of Compound Image Segmentation Techniques

    Science.gov (United States)

    Selwyn, Ebenezer Juliet; Florinabel, D. Jemi

    2018-04-01

    Compound image segmentation plays a vital role in the compression of computer screen images. Computer screen images are images mixed with textual, graphical, or pictorial content. In this paper, we present a comparison of two transform-based block classification approaches for compound images, using metrics such as classification speed, precision and recall rate. Block-based classification approaches normally divide the compound image into fixed-size, non-overlapping blocks. A frequency transform such as the Discrete Cosine Transform (DCT) or the Discrete Wavelet Transform (DWT) is then applied over each block. The mean and standard deviation are computed for each 8 × 8 block and used as the feature set to classify the compound image into text/graphics and picture/background blocks. The classification accuracy of the block-classification-based segmentation techniques is measured by evaluation metrics such as precision and recall rate. Compound images with smooth and complex backgrounds containing text of varying size, colour and orientation are considered for testing. Experimental evidence shows that DWT-based segmentation improves recall rate and precision by approximately 2.3% over DCT-based segmentation, at the cost of an increase in block classification time, for both smooth and complex background images.
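
    The block-classification step can be sketched as follows, applying a 2-D DCT to each 8 × 8 block and thresholding the spread of the coefficients; the threshold rule and value are assumptions for illustration.

```python
# Hedged sketch of transform-based block classification for a compound image.
import numpy as np
from scipy.fft import dctn

def classify_blocks(image, block=8, std_threshold=20.0):
    """Return a boolean map: True where an 8x8 block is classified as
    text/graphics (large coefficient spread), False for picture/background."""
    h, w = image.shape
    h, w = h - h % block, w - w % block
    text_map = np.zeros((h // block, w // block), dtype=bool)
    for i in range(0, h, block):
        for j in range(0, w, block):
            coeffs = dctn(image[i:i + block, j:j + block].astype(float), norm='ortho')
            # Mean and standard deviation of the coefficients form the feature
            # pair; here only the spread drives the (assumed) threshold rule.
            text_map[i // block, j // block] = coeffs.std() > std_threshold
    return text_map
```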

  2. Coronary artery plaques: Cardiac CT with model-based and adaptive-statistical iterative reconstruction technique

    International Nuclear Information System (INIS)

    Scheffel, Hans; Stolzmann, Paul; Schlett, Christopher L.; Engel, Leif-Christopher; Major, Gyöngi Petra; Károlyi, Mihály; Do, Synho; Maurovich-Horvat, Pál; Hoffmann, Udo

    2012-01-01

    Objectives: To compare image quality of coronary artery plaque visualization at CT angiography with images reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR), and model based iterative reconstruction (MBIR) techniques. Methods: The coronary arteries of three ex vivo human hearts were imaged by CT and reconstructed with FBP, ASIR and MBIR. Coronary cross-sectional images were co-registered between the different reconstruction techniques and assessed for qualitative and quantitative image quality parameters. Readers were blinded to the reconstruction algorithm. Results: A total of 375 triplets of coronary cross-sectional images were co-registered. Using MBIR, 26% of the images were rated as having excellent overall image quality, which was significantly better as compared to ASIR and FBP (4% and 13%, respectively, all p < 0.001). Qualitative assessment of image noise demonstrated a noise reduction by using ASIR as compared to FBP (p < 0.01) and further noise reduction by using MBIR (p < 0.001). The contrast-to-noise-ratio (CNR) using MBIR was better as compared to ASIR and FBP (44 ± 19, 29 ± 15, 26 ± 9, respectively; all p < 0.001). Conclusions: Using MBIR improved image quality, reduced image noise and increased CNR as compared to the other available reconstruction techniques. This may further improve the visualization of coronary artery plaque and allow radiation reduction.

  3. Chronology of DIC technique based on the fundamental mathematical modeling and dehydration impact.

    Science.gov (United States)

    Alias, Norma; Saipol, Hafizah Farhah Saipan; Ghani, Asnida Che Abd

    2014-12-01

    A chronology of mathematical models of the heat and mass transfer equations is proposed for the prediction of moisture and temperature behavior during drying using the DIC (Détente Instantanée Contrôlée), or instant controlled pressure drop, technique. The DIC technique has potential as a widely used dehydration method for high-value foods, maintaining nutrition and the best possible quality for food storage. The model is governed by a regression model, followed by 2D Fick's and Fourier's parabolic equations and a 2D elliptic-parabolic equation in a rectangular slice. The models neglect shrinkage and radiation effects. Simulations of the heat and mass transfer equations of parabolic and elliptic-parabolic type using several numerical methods based on the finite difference method (FDM) are illustrated. Intel® Core™2 Duo processors with the Linux operating system and the C programming language were used as the computational platform for the simulation. Qualitative and quantitative differences between the DIC technique and conventional drying methods are shown as a comparison.
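
    As an illustration of the finite-difference simulations mentioned above, a minimal explicit scheme for the 2-D Fourier heat equation on a rectangular slice is given below; the diffusivity, grid spacing and boundary temperatures are placeholder values, not those of the DIC model.

```python
# Hedged sketch: explicit FTCS finite-difference scheme for T_t = alpha*(T_xx + T_yy).
import numpy as np

def heat_2d(nx=50, ny=50, dx=1e-3, dy=1e-3, alpha=1.4e-7,
            t_init=20.0, t_boundary=60.0, steps=2000):
    """Return the temperature field after `steps` time steps (Dirichlet boundary)."""
    dt = 0.2 * min(dx, dy) ** 2 / alpha          # stable explicit time step
    T = np.full((ny, nx), t_init, dtype=float)
    T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = t_boundary
    for _ in range(steps):
        lap = ((T[1:-1, 2:] - 2 * T[1:-1, 1:-1] + T[1:-1, :-2]) / dx**2 +
               (T[2:, 1:-1] - 2 * T[1:-1, 1:-1] + T[:-2, 1:-1]) / dy**2)
        T[1:-1, 1:-1] += alpha * dt * lap        # update interior nodes only
    return T
```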

  4. Efficient Disk-Based Techniques for Manipulating Very Large String Databases

    KAUST Repository

    Allam, Amin

    2017-05-18

    Indexing and processing strings are very important topics in database management. Strings can be database records, DNA sequences, protein sequences, or plain text. Various string operations are required for several application categories, such as bioinformatics and entity resolution. When the string count or sizes become very large, several state-of-the-art techniques for indexing and processing such strings may fail or behave very inefficiently. Modifying an existing technique to overcome these issues is not usually straightforward or even possible. A category of string operations can be facilitated by the suffix tree data structure, which basically indexes a long string to enable efficient finding of any substring of the indexed string, and can be used in other operations as well, such as approximate string matching. In this document, we introduce a novel efficient method to construct the suffix tree index for very long strings using parallel architectures, which is a major challenge in this category. Another category of string operations require clustering similar strings in order to perform application-specific processing on the resulting possibly-overlapping clusters. In this document, based on clustering similar strings, we introduce a novel efficient technique for record linkage and entity resolution, and a novel method for correcting errors in a large number of small strings (read sequences) generated by the DNA sequencing machines.
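
    As a toy illustration of the kind of substring query a suffix index supports, the sketch below builds a naive in-memory suffix array and answers containment queries by binary search; the paper's parallel, disk-based suffix tree construction is not reproduced here.

```python
# Hedged, in-memory illustration only (quadratic construction, demo scale).
def build_suffix_array(s):
    """Return suffix start positions of s sorted lexicographically."""
    return sorted(range(len(s)), key=lambda i: s[i:])

def contains(s, sa, pattern):
    """Binary-search the suffix array for any occurrence of `pattern` in s."""
    lo, hi = 0, len(sa)
    while lo < hi:
        mid = (lo + hi) // 2
        if s[sa[mid]:sa[mid] + len(pattern)] < pattern:
            lo = mid + 1
        else:
            hi = mid
    return lo < len(sa) and s[sa[lo]:].startswith(pattern)
```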

  5. A noise reduction technique based on nonlinear kernel function for heart sound analysis.

    Science.gov (United States)

    Mondal, Ashok; Saxena, Ishan; Tang, Hong; Banerjee, Poulami

    2017-02-13

    The main difficulty encountered in the interpretation of cardiac sounds is interference from noise. The contaminating noise obscures the relevant information that is useful for the recognition of heart diseases. The unwanted signals are produced mainly by the lungs and the surrounding environment. In this paper, a novel heart sound de-noising technique is introduced based on a combined framework of the wavelet packet transform (WPT) and singular value decomposition (SVD). The most informative node of the wavelet tree is selected on the criterion of a mutual information measurement. Next, the coefficients corresponding to the selected node are processed by the SVD technique to suppress the noisy component of the heart sound signal. To justify the efficacy of the proposed technique, several experiments have been conducted with a heart sound dataset, including normal and pathological cases at different signal-to-noise ratios. The significance of the method is validated by statistical analysis of the results. The biological information preserved in the de-noised heart sound (HS) signal is evaluated by a k-means clustering algorithm and Fit Factor calculation. The overall results show that the proposed method is superior to the baseline methods.
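
    The SVD suppression step can be illustrated as below on the coefficients of the selected node, using a Hankel embedding and rank truncation; the embedding length and rank are assumptions, and the mutual-information node selection is not reproduced here.

```python
# Hedged sketch of SVD-based suppression on a 1-D coefficient sequence.
import numpy as np

def svd_denoise(coeffs, embed=64, rank=4):
    coeffs = np.asarray(coeffs, dtype=float)
    n = coeffs.size
    cols = n - embed + 1
    H = np.lib.stride_tricks.sliding_window_view(coeffs, embed).T  # Hankel matrix
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]     # keep dominant components
    # Average along anti-diagonals to map the low-rank matrix back to a sequence.
    out = np.zeros(n)
    counts = np.zeros(n)
    for i in range(embed):
        out[i:i + cols] += H_low[i]
        counts[i:i + cols] += 1
    return out / counts
```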

  6. Visualization of ultrasound induced cavitation bubbles using the synchrotron x-ray Analyzer Based Imaging technique

    International Nuclear Information System (INIS)

    Izadifar, Zahra; Izadifar, Mohammad; Izadifar, Zohreh; Chapman, Dean; Belev, George

    2014-01-01

    Observing cavitation bubbles deep within tissue is very difficult. The development of a method for probing cavitation, irrespective of its location in tissues, would improve the efficiency and application of ultrasound in the clinic. A synchrotron x-ray imaging technique, which is capable of detecting cavitation bubbles induced in water by a sonochemistry system, is reported here; this could possibly be extended to the study of therapeutic ultrasound in tissues. The two different x-ray imaging techniques of Analyzer Based Imaging (ABI) and phase contrast imaging (PCI) were examined in order to detect ultrasound induced cavitation bubbles. Cavitation was not observed by PCI, however it was detectable with ABI. Acoustic cavitation was imaged at six different acoustic power levels and six different locations through the acoustic beam in water at a fixed power level. The results indicate the potential utility of this technique for cavitation studies in tissues, but it is time consuming. This may be improved by optimizing the imaging method. (paper)

  7. A Novel Kernel-Based Regularization Technique for PET Image Reconstruction

    Directory of Open Access Journals (Sweden)

    Abdelwahhab Boudjelal

    2017-06-01

    Full Text Available Positron emission tomography (PET) is an imaging technique that generates 3D detail of physiological processes at the cellular level. The technique requires a radioactive tracer, which decays and releases a positron that collides with an electron; consequently, annihilation photons are emitted, which can be measured. The purpose of PET is to use the measurement of photons to reconstruct the distribution of radioisotopes in the body. Currently, PET is undergoing a revamp, with advancements in data measurement instruments and in the computing methods used to create the images. These computational methods are required to solve the inverse problem of “image reconstruction from projection”. This paper proposes a novel kernel-based regularization technique for maximum-likelihood expectation-maximization (κ-MLEM) to reconstruct the image. Compared to standard MLEM, the proposed algorithm is more robust and more effective in removing background noise while preserving the edges; this suppresses image artifacts, such as out-of-focus slice blur.
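
    For orientation, the baseline MLEM update that the proposed κ-MLEM builds on can be written as the following sketch with a dense toy system matrix; the kernel-based regularisation itself is not reproduced here.

```python
# Hedged sketch of the standard MLEM iteration for emission tomography.
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """A: (n_bins, n_voxels) system matrix; y: (n_bins,) measured counts.
    Returns the reconstructed non-negative image estimate."""
    x = np.ones(A.shape[1])                 # uniform, non-negative initial image
    sens = A.sum(axis=0) + eps              # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x + eps                  # forward projection of current estimate
        x *= (A.T @ (y / proj)) / sens      # multiplicative EM update
    return x
```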

  8. TL and ESR based identification of gamma-irradiated frozen fish using different hydrolysis techniques

    International Nuclear Information System (INIS)

    Ahn, Jae-Jun; Akram, Kashif; Shahbaz, Hafiz Muhammad; Kwon, Joong-Ho

    2014-01-01

    Frozen fish fillets (walleye pollock and Japanese Spanish mackerel) were selected as samples for irradiation (0–10 kGy) detection trials using different hydrolysis methods. Photostimulated luminescence (PSL)-based screening analysis of the gamma-irradiated frozen fillets showed low sensitivity due to the limited silicate mineral content of the samples. The same limitations were found in thermoluminescence (TL) analysis of mineral samples isolated by the density separation method. However, acid (HCl) and alkali (KOH) hydrolysis methods were effective in obtaining enough minerals to carry out TL analysis, which was reconfirmed through a normalization step by calculating the TL ratios (TL1/TL2). For improved electron spin resonance (ESR) analysis, alkali and enzyme (alcalase) hydrolysis methods were compared for separating minute bone fractions. The enzymatic method provided clearer radiation-specific hydroxyapatite radicals than the alkaline method. Different hydrolysis methods could extend the application of TL and ESR techniques in identifying the irradiation history of frozen fish fillets. - Highlights: • Irradiation has the potential to improve the hygienic quality of raw and processed seafood. • Detection of irradiated food is important to enforce the applied regulations. • Different techniques were compared to separate silicate minerals from frozen fish. • Limitations were observed in TL analysis of minerals isolated by density separation. • Hydrolysis methods provided clearer identification using TL and ESR techniques

  9. Applications of synchrotron-based micro-imaging techniques for the analysis of Cultural Heritage materials

    International Nuclear Information System (INIS)

    Cotte, Marine; Chilida, Javier; Walter, Philippe; Taniguchi, Yoko; Susini, Jean

    2009-01-01

    The analysis of Cultural Heritage objects is often technically challenging. When analyzing micro-fragments, the amount of matter is usually very tiny, hence requiring sensitive techniques. These samples, in particular painting fragments, may present multi-layered structures with layer thicknesses of ∼10 μm. This favors micro-imaging techniques with a good lateral resolution (about one micrometer) that allow the discriminative study of each layer. Besides, samples are usually very complex in terms of chemistry, as they are made of mineral and organic matter, amorphous and crystallized phases, and major and minor elements. Accordingly, a multi-modal approach is generally essential to resolve the chemical complexity of such hybrid materials. Different examples will be given to illustrate the various possibilities of synchrotron-based micro-imaging techniques, such as micro X-ray diffraction, micro X-ray fluorescence, micro X-ray absorption spectroscopy and micro FTIR spectroscopy. Focus will be on paintings, but the whole range of museum objects (from soft matter such as paper or wood to hard matter such as metal and glass) will also be considered.

  10. Visualization of ultrasound induced cavitation bubbles using the synchrotron x-ray Analyzer Based Imaging technique.

    Science.gov (United States)

    Izadifar, Zahra; Belev, George; Izadifar, Mohammad; Izadifar, Zohreh; Chapman, Dean

    2014-12-07

    Observing cavitation bubbles deep within tissue is very difficult. The development of a method for probing cavitation, irrespective of its location in tissues, would improve the efficiency and application of ultrasound in the clinic. A synchrotron x-ray imaging technique, which is capable of detecting cavitation bubbles induced in water by a sonochemistry system, is reported here; this could possibly be extended to the study of therapeutic ultrasound in tissues. The two different x-ray imaging techniques of Analyzer Based Imaging (ABI) and phase contrast imaging (PCI) were examined in order to detect ultrasound induced cavitation bubbles. Cavitation was not observed by PCI, however it was detectable with ABI. Acoustic cavitation was imaged at six different acoustic power levels and six different locations through the acoustic beam in water at a fixed power level. The results indicate the potential utility of this technique for cavitation studies in tissues, but it is time consuming. This may be improved by optimizing the imaging method.

  11. Preparation of Natural Rubber (NR) Based Nano-Sized Materials Using Sol-Gel Technique

    International Nuclear Information System (INIS)

    Dahlan Mohd; Mahathir Mohamed

    2011-01-01

    The objectives of this project are to prepare nano-sized natural rubber-based hybrid coating materials by the sol-gel technique; to explore the possibility of producing an ENR-Si (epoxidized natural rubber-silica) ceramer with toughening effects; and to use it in radiation curing of surface coatings. Since the early 1960s, Malaysia has introduced various forms of value-added natural rubber such as Standard Malaysian Rubber (SMR) and methylmethacrylate-grafted natural rubber (MG rubber), followed by liquid natural rubber and epoxidized natural rubber (ENR). Products such as liquid epoxidized natural rubber acrylate (LENRA) and thermoplastic natural rubber (TPNR) are still ongoing research projects at Nuclear Malaysia. The former has a strong possibility of being used as a radiation-sensitive compatibilizer in TPNR blends, besides its original purpose, for example in radiation curing of surface coatings. However, earlier findings indicated that, to make it more effective for surface coating, a reinforcement system needs to be introduced. A strong candidate is silica introduced by the sol-gel technique, since common reinforcement fillers, for example carbon black, have drawbacks in this particular case. This technique was introduced in the late 1960s to produce metal oxides such as silica and titanium oxide in solution. (author)

  12. The Role of Liquid Based Cytology and Ancillary Techniques in the Peritoneal Washing Analysis: Our Institutional Experience

    Science.gov (United States)

    Rossi, Esther; Bizzarro, Tommaso; Martini, Maurizio; Longatto-Filho, Adhemar; Schmitt, Fernando; Fagotti, Anna; Scambia, Giovanni; Zannoni, Gian Franco

    2017-01-01

    Background The cytological analysis of peritoneal effusions serves as a diagnostic and prognostic aid for both primary and metastatic diseases. Among the different cytological preparations, liquid based cytology (LBC) represents a feasible and reliable method that also enables the application of ancillary techniques (i.e. immunocytochemistry (ICC) and molecular testing). Methods We recorded 10348 LBC peritoneal effusions between January 2000 and December 2014. They were classified as non-diagnostic (ND), negative for malignancy (NM), atypical/suspicious for malignancy (SM) and positive for malignancy (PM). Results The cytological diagnoses included 218 ND, 9035 NM, 213 SM and 882 PM. A total of 8048 cases (7228 NM, 115 SM, 705 PM) with histological follow-up were included. Our NM cases included 21 malignant and 7207 benign histological diagnoses. Our 820 SM+PM cases were diagnosed as 107 malignancies of unknown origin (30 SM and 77 PM), 691 metastatic lesions (81 SM and 610 PM), 9 lymphomas (2 SM and 7 PM), 9 mesotheliomas (1 SM and 8 PM), and 4 sarcomas (1 SM and 3 PM). Primary gynecological cancers contributed 64% of the cases. We documented 97.4% sensitivity, 99.9% specificity, 98% diagnostic accuracy, 99.7% negative predictive value (NPV) and 99.7% positive predictive value (PPV). Furthermore, the morphological diagnoses were supported by either 173 conclusive ICC results or 50 molecular analyses. Specifically, molecular testing was performed for EGFR and KRAS mutational analysis based on previous or contemporary diagnoses of non-small cell lung cancer (NSCLC) and colon carcinoma. We identified 10 EGFR mutations in NSCLC and 7 KRAS mutations on LBC stored material. Conclusions Peritoneal cytology is an adjunctive tool in the surgical management of tumors, mostly gynecological cancers. LBC maximizes the application of ancillary techniques such as ICC and molecular analysis, with feasible diagnostic and predictive yields also in controversial cases. PMID:28099523

  13. Fabrication of Ultrasensitive Field-Effect Transistor DNA Biosensors by a Directional Transfer Technique Based on CVD-Grown Graphene.

    Science.gov (United States)

    Zheng, Chao; Huang, Le; Zhang, Hong; Sun, Zhongyue; Zhang, Zhiyong; Zhang, Guo-Jun

    2015-08-12

    Most graphene field-effect transistor (G-FET) biosensors are fabricated through a routine process, in which graphene is transferred onto a Si/SiO2 substrate and then devices are subsequently produced by micromanufacture processes. However, such a fabrication approach can introduce contamination onto the graphene surface during the lithographic process, resulting in interference for the subsequent biosensing. In this work, we have developed a novel directional transfer technique to fabricate G-FET biosensors based on chemical-vapor-deposition- (CVD-) grown single-layer graphene (SLG) and applied this biosensor for the sensitive detection of DNA. A FET device with six individual array sensors was first fabricated, and SLG obtained by the CVD-growth method was transferred onto the sensor surface in a directional manner. Afterward, peptide nucleic acid (PNA) was covalently immobilized on the graphene surface, and DNA detection was realized by applying specific target DNA to the PNA-functionalized G-FET biosensor. The developed G-FET biosensor was able to detect target DNA at concentrations as low as 10 fM, which is 1 order of magnitude lower than those reported in a previous work. In addition, the biosensor was capable of distinguishing the complementary DNA from one-base-mismatched DNA and noncomplementary DNA. The directional transfer technique for the fabrication of G-FET biosensors is simple, and the as-constructed G-FET DNA biosensor shows ultrasensitivity and high specificity, indicating its potential application in disease diagnostics as a point-of-care tool.

  14. Robust breathing signal extraction from cone beam CT projections based on adaptive and global optimization techniques

    International Nuclear Information System (INIS)

    Chao, Ming; Yuan, Yading; Rosenzweig, Kenneth E; Lo, Yeh-Chi; Wei, Jie; Li, Tianfang

    2016-01-01

    We present a study of extracting respiratory signals from cone beam computed tomography (CBCT) projections within the framework of the Amsterdam Shroud (AS) technique. Acquired prior to the radiotherapy treatment, the CBCT projections were preprocessed for contrast enhancement by converting the original intensity images to attenuation images, from which the AS image was created. An adaptive robust z-normalization filtering was applied to further augment the weak oscillating structures locally. From the enhanced AS image, the respiratory signal was extracted using a two-step optimization approach to effectively reveal the large-scale regularity of the breathing signals. CBCT projection images from five patients acquired with the Varian On-Board Imager on the Clinac iX System Linear Accelerator (Varian Medical Systems, Palo Alto, CA) were employed to assess the proposed technique. Stable breathing signals could be reliably extracted using the proposed algorithm. Reference waveforms obtained using an air bellows belt (Philips Medical Systems, Cleveland, OH) were exported and compared with the AS-based signals. For the enrolled patients, the average error between the estimated breaths per minute (bpm) and the reference waveform bpm was as low as −0.07, with a standard deviation of 1.58. The new algorithm outperformed the original AS technique for all patients by 8.5% to 30%. The impact of gantry rotation on the breathing signal was assessed with data acquired with a Quasar phantom (Modus Medical Devices Inc., London, Canada) and found to be minimal with respect to the signal frequency. The new technique developed in this work provides a practical solution for rendering a markerless breathing signal using CBCT projections for thoracic and abdominal patients. (paper)
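
    Two of the preprocessing steps described above can be sketched as follows: collapsing each attenuation projection into one column of the AS image, then applying a sliding robust z-normalisation (median/MAD) along each row to enhance the weak oscillations; the window size and constants are assumptions, not the paper's settings.

```python
# Hedged sketch of AS-image construction and robust z-normalisation.
import numpy as np

def amsterdam_shroud(projections):
    """projections: (n_proj, rows, cols) attenuation images. Each projection is
    summed along the lateral axis, giving one AS column per gantry angle."""
    return np.stack([p.sum(axis=1) for p in projections], axis=1)  # (rows, n_proj)

def robust_z_normalize(as_image, window=31):
    """Sliding median/MAD normalisation along the projection (time) axis."""
    out = np.zeros_like(as_image, dtype=float)
    half = window // 2
    for t in range(as_image.shape[1]):
        lo, hi = max(0, t - half), min(as_image.shape[1], t + half + 1)
        seg = as_image[:, lo:hi]
        med = np.median(seg, axis=1, keepdims=True)
        mad = np.median(np.abs(seg - med), axis=1, keepdims=True) + 1e-9
        out[:, t] = ((as_image[:, t:t + 1] - med) / mad).ravel()
    return out
```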

  15. An offset tone based gain stabilization technique for mixed-signal RF measurement systems

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Gopal, E-mail: gjos@barc.gov.in [BARC, Mumbai 400085 (India); Motiwala, Paresh D.; Randale, G.D.; Singh, Pitamber [BARC, Mumbai 400085 (India); Agarwal, Vivek; Kumar, Girish [IIT Bombay, Powai, Mumbai 400076 (India)

    2015-09-21

    This paper describes a gain stabilization technique for an RF signal measurement system. A sinusoidal signal of known amplitude and phase, close in frequency to the main RF signal to be measured, is added to it at the input of the analog section. The system stabilizes this offset tone in the digital domain, where it is sampled at the output of the analog section. This process generates the correction factor needed to stabilize the magnitude of the gain of the analog section for the main RF signal. With the help of a simple calibration procedure, the absolute amplitude of the main RF signal can be measured. The technique is especially suited to a system that processes signals around a single frequency, employs direct signal conversion into the digital domain, and performs the subsequent steps in an FPGA. The inherent parallel signal processing in an FPGA-based implementation allows real-time stabilization of the gain. The effectiveness of the technique derives from the fact that the gain stabilization stamped onto the main RF signal measurement branch requires only a few components in the system to be inherently stable. A test setup along with experimental results is presented from the field of RF instrumentation for particle accelerators. Due to the availability of a phase-synchronized RF reference signal in these systems, the measured phase difference between the main RF signal and the RF reference is also stabilized using this technique. A signal processing scheme is presented in which a moving average filter is used not only to filter out unwanted frequencies but also to separate the main RF signal from the offset tone signal. This is achieved by a suitable choice of sampling and offset tone frequencies. The presented signal processing scheme is suitable for a variety of RF measurement applications.
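
    The separation idea can be illustrated with a short sketch: after IQ demodulation at a chosen frequency, a moving-average filter of length N has spectral nulls at multiples of fs/N, so placing the offset tone an exact multiple of fs/N away from the main RF lets the same filter pass one component and reject the other. All frequencies and lengths below are illustrative assumptions, not the instrument's actual parameters.

```python
# Hedged sketch: IQ demodulation followed by a moving-average filter whose
# nulls reject the tone that is an exact multiple of fs/N away.
import numpy as np

def demodulate_and_average(samples, f_demod, fs, n_avg):
    """Return the complex amplitude near f_demod, averaged over n_avg samples."""
    n = np.arange(samples.size)
    iq = samples * np.exp(-2j * np.pi * f_demod / fs * n)   # shift f_demod to DC
    kernel = np.ones(n_avg) / n_avg                          # moving-average filter
    return np.convolve(iq, kernel, mode='valid')
```

    For example, with fs = 100 MHz and n_avg = 1000 the filter nulls fall every 100 kHz, so an offset tone placed 100 kHz away from the main RF is rejected in the main branch and vice versa (illustrative numbers only).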

  16. Lab-Based Measurement of Remediation Techniques for Radiation Portal Monitors (Initial Report)

    International Nuclear Information System (INIS)

    Livesay, Jake

    2012-01-01

    Radiation Portal Monitors (RPM) deployed by the Second Line of Defense (SLD) are known to be sensitive to the natural environmental radioactive background. Several techniques are used to mitigate the effects of background on the monitors, but since installation environments can vary significantly from one another, a standardized, systematic study of remediation techniques was proposed and carried out. This study is not meant to serve as the absolute last word on the subject. The data collected are, however, intelligible and useful. Some compromises were made, each of which will be described in detail. The hope of this initial report is to familiarize the SLD science teams with ORNL's effort to model the effect of various remediation techniques on simple, static backgrounds. This study provides a good start toward benchmarking the model, and each additional increment of data will serve to make the model more robust. The scope of this initial study is limited to a few basic cases. Its purpose is to prove the utility of lab-based studies of remediation techniques and to serve as a standard data set for future use. The importance of this first step of standardization will become obvious when science teams are working in parallel on issues of remediation; having a common starting point will eliminate one category of difference, thereby making it easier to determine the sources of disagreement. Further measurements will augment this data set, allowing for further constraint of the universe of possible situations. As will be discussed in the 'Going Forward' section, more data will be included in the final report of this work. Of particular interest will be the data taken with the official TSA lead collimators, which will provide more direct results for comparison with installation data.

  17. Legal, ethical, and procedural bases for the use of aseptic techniques to implant electronic devices

    Science.gov (United States)

    Mulcahy, Daniel M.

    2013-01-01

    The popularity of implanting electronic devices such as transmitters and data loggers into captive and free-ranging animals has increased greatly in the past two decades. The devices have become smaller, more reliable, and more capable (Printz 2004; Wilson and Gifford 2005; Metcalfe et al. 2012). Compared with externally mounted devices, implanted devices are largely invisible to external viewers such as tourists and predators; exist in a physically protected, thermally stable environment in mammals and birds; and greatly reduce drag and risk of entanglement. An implanted animal does not outgrow its device or attachment method as can happen with collars and harnesses, which allows young animals to be more safely equipped. However, compared with mounting external devices, implantation requires greater technical ability to perform the necessary anesthesia, analgesia, and surgery. More than 83% of publications in the 1990s that used radiotelemetry on animals assumed that there were no adverse effects on the animal (Godfrey and Bryant 2003). It is likely that some studies using implanted electronic devices have not been published due to a high level of unexpected mortality or to aberrant behavior or disappearance of the implanted animals, a phenomenon known as the “file drawer” problem (Rosenthal 1979; Scargle 2000). The near absence of such studies from the published record may be providing a false sense of security that procedures being used are more innocuous than they actually are. Similarly, authors sometimes state that it was unlikely that device implantation was problematic because study animals appeared to behave normally, or authors state that previous investigators used the same technique and saw no problems. Such statements are suppositions if no supporting data are provided or if the animals were equipped because there was no other way to follow their activity. Moreover, such suppositions ignore other adverse effects that affect behavior indirectly, and

  18. Testing photogrammetry-based techniques for three-dimensional surface documentation in forensic pathology.

    Science.gov (United States)

    Urbanová, Petra; Hejna, Petr; Jurda, Mikoláš

    2015-05-01

    Three-dimensional surface technologies, particularly close-range photogrammetry and optical surface scanning, have recently advanced into affordable, flexible and accurate techniques. Forensic postmortem investigation as performed on a daily basis, however, has not yet fully benefited from their potential. In the present paper, we tested two approaches to 3D external body documentation: digital camera-based photogrammetry combined with commercial Agisoft PhotoScan® software, and the stereophotogrammetry-based Vectra H1®, a portable handheld surface scanner. Three human subjects were selected for the study: a living person (a 25-year-old female) and two forensic cases admitted for postmortem examination at the Department of Forensic Medicine, Hradec Králové, Czech Republic (both 63-year-old males), one dead of traumatic, self-inflicted injuries (suicide by hanging), the other diagnosed with heart failure. All three cases were photographed in a 360° manner with a Nikon 7000 digital camera and simultaneously documented with the handheld scanner. In addition to recording the pre-autopsy phase of the forensic cases, both techniques were employed in various stages of autopsy. The sets of collected digital images (approximately 100 per case) were further processed to generate point clouds and 3D meshes. The final 3D models (a pair per individual) were evaluated for numbers of points and polygons, then assessed visually and compared quantitatively using an ICP alignment algorithm and a point cloud comparison technique based on closest point-to-point distances. Both techniques proved easy to handle and equally laborious. While collecting the images at autopsy took around 20 min, the post-processing was much more time-demanding and required up to 10 h of computation time. Moreover, for full-body scanning the post-processing of the handheld scanner data required rather time-consuming manual image alignment. In all instances the applied approaches
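    The quantitative comparison step described above (closest point-to-point distances between two already aligned point clouds) can be sketched in a few lines. This is an illustrative sketch rather than the authors' pipeline; the KD-tree nearest-neighbour approach and the synthetic clouds are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def closest_point_distances(cloud_a, cloud_b):
    """For every point in cloud_a, the distance to its nearest neighbour
    in cloud_b (both arrays of shape (n_points, 3), already aligned,
    e.g. by a prior ICP step)."""
    tree = cKDTree(cloud_b)
    dists, _ = tree.query(cloud_a, k=1)
    return dists

# Synthetic example: two noisy samplings of the same surface.
rng = np.random.default_rng(0)
surface = rng.uniform(-1, 1, size=(50_000, 3))
photogrammetry_cloud = surface + rng.normal(0, 0.001, surface.shape)
handheld_scan_cloud  = surface + rng.normal(0, 0.001, surface.shape)

d = closest_point_distances(photogrammetry_cloud, handheld_scan_cloud)
print(f"mean {d.mean():.4f}  95th percentile {np.percentile(d, 95):.4f}")
```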

  19. A singular value decomposition linear programming (SVDLP) optimization technique for circular cone based robotic radiotherapy

    Science.gov (United States)

    Liang, Bin; Li, Yongbao; Wei, Ran; Guo, Bin; Xu, Xuang; Liu, Bo; Li, Jiafeng; Wu, Qiuwen; Zhou, Fugen

    2018-01-01

    With robot-controlled linac positioning, robotic radiotherapy systems such as CyberKnife significantly increase freedom of radiation beam placement, but also impose more challenges on treatment plan optimization. The resampling mechanism in the vendor-supplied treatment planning system (MultiPlan) cannot fully explore the increased beam direction search space. In addition, a sparse treatment plan (using fewer beams) is desired to improve treatment efficiency. This study proposes a singular value decomposition linear programming (SVDLP) optimization technique for circular collimator based robotic radiotherapy. The SVDLP approach initializes the input beams by simulating the process of covering the entire target volume with equivalent beam tapers. The requirements on dosimetry distribution are modeled as hard and soft constraints, and the sparsity of the treatment plan is achieved by compressive sensing. The proposed linear programming (LP) model optimizes beam weights by minimizing the deviation of soft constraints subject to hard constraints, with a constraint on the l1 norm of the beam weights. A singular value decomposition (SVD) based acceleration technique was developed for the LP model. Based on the degeneracy of the influence matrix, the model is first compressed into a lower dimension for optimization, and then back-projected to reconstruct the beam weights. After beam weight optimization, the number of beams is reduced by removing the beams with low weight, and optimizing the weights of the remaining beams using the same model. This beam reduction technique is further validated by a mixed integer programming (MIP) model. The SVDLP approach was tested on a lung case. The results demonstrate that the SVD acceleration technique speeds up the optimization by a factor of 4.8. Furthermore, the beam reduction achieves a similar plan quality to the globally optimal plan obtained by the MIP model, but is one to two orders of magnitude faster. Furthermore, the SVDLP
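    The core of the beam-weight optimization described above can be illustrated with a simplified linear program: minimize the violation of soft target-coverage constraints subject to hard organ-at-risk limits and an l1 budget on the beam weights, then drop low-weight beams. The sketch below uses synthetic influence matrices and omits the SVD-based compression; the matrix sizes, dose limits, and constraint forms are assumptions, not the paper's actual model.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n_beams, n_target, n_oar = 60, 200, 120

# Hypothetical influence matrices: dose per unit beam weight.
D_t = rng.uniform(0.0, 1.0, (n_target, n_beams))   # target voxels
D_o = rng.uniform(0.0, 0.3, (n_oar,    n_beams))   # organ-at-risk voxels

d_pres, d_max, l1_budget = 10.0, 4.0, 40.0

# Variables: x = [w (beam weights), t (under-dose slack per target voxel)].
# Objective: minimize the total violation of the soft coverage constraint.
c = np.concatenate([np.zeros(n_beams), np.ones(n_target)])

# Soft constraint  D_t w + t >= d_pres   ->   -D_t w - t <= -d_pres
A_soft = np.hstack([-D_t, -np.eye(n_target)])
b_soft = -d_pres * np.ones(n_target)
# Hard constraint  D_o w <= d_max
A_hard = np.hstack([D_o, np.zeros((n_oar, n_target))])
b_hard = d_max * np.ones(n_oar)
# Sparsity-promoting l1 budget (w >= 0, so ||w||_1 = sum(w) <= budget).
A_l1 = np.concatenate([np.ones(n_beams), np.zeros(n_target)])[None, :]
b_l1 = np.array([l1_budget])

res = linprog(c,
              A_ub=np.vstack([A_soft, A_hard, A_l1]),
              b_ub=np.concatenate([b_soft, b_hard, b_l1]),
              bounds=(0, None), method='highs')
w = res.x[:n_beams]

# Beam reduction: drop low-weight beams; the same LP could then be
# re-solved on the remaining columns of D_t / D_o.
kept = np.flatnonzero(w > 1e-3 * w.max())
print(f"{len(kept)} of {n_beams} beams kept, objective {res.fun:.2f}")
```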
