WorldWideScience

Sample records for activation analysis technique

  1. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, in particular linear programming. These techniques were then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to better understand the space network. Finally, a small-scale version of the system was modeled, variables were identified, data were gathered, and comparisons were made between actual and theoretical data.
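
Where the report applies linear programming, the flavor of such a model can be sketched with a toy example (all variables and coefficients below are hypothetical, not taken from the TDRSS study): a two-variable resource-allocation LP solved by enumerating the vertices of its feasible region.

```python
from itertools import combinations

# Hypothetical 2-variable LP: maximize supported user contacts
#   maximize  3*x + 2*y        (x = tracking contacts, y = relay contacts)
#   subject   2*x + 1*y <= 10  (antenna-hours available)
#             1*x + 3*y <= 12  (ground-processing hours available)
#             x >= 0, y >= 0
# For a 2-D LP the optimum lies at a vertex of the feasible polygon, so we
# enumerate pairwise constraint intersections and keep the feasible point
# with the best objective value.

constraints = [  # each row: (a, b, c) meaning a*x + b*y <= c
    (2, 1, 10),
    (1, 3, 12),
    (-1, 0, 0),   # x >= 0
    (0, -1, 0),   # y >= 0
]

def intersect(c1, c2):
    """Solve the 2x2 system a1*x + b1*y = r1, a2*x + b2*y = r2."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel constraints, no unique intersection
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in constraints)

vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) is not None and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
print(best)  # optimal (x, y) mix of contacts
```

A production model would have many more variables and would use a real solver; the vertex enumeration above is only viable for tiny illustrative problems.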

  2. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    Science.gov (United States)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of a neutron leaves the nucleus in an excited state, which decays immediately to the ground state with the emission of energetic gamma rays. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.
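
As a rough illustration of how PGAA quantification works in practice, a comparator-style calculation ratios the characteristic prompt gamma count rate of the unknown against that of a standard of known composition. All count rates and masses below are hypothetical, not data from this abstract:

```python
# Illustrative comparator-style PGAA calculation (all numbers hypothetical).
# The net count rate of a characteristic prompt gamma line is proportional
# to the mass of the emitting element in the beam, so an unknown can be
# quantified against a standard:
#
#   m_element(sample) = m_element(standard) * R_sample / R_standard
#
# where R is the net peak count rate for that element's gamma line.

def pgaa_mass_fraction(rate_sample, mass_sample,
                       rate_standard, mass_element_standard):
    """Element mass fraction in the sample, by direct comparison."""
    element_mass = mass_element_standard * rate_sample / rate_standard
    return element_mass / mass_sample

# Hypothetical example: boron via its 478 keV line
frac = pgaa_mass_fraction(rate_sample=120.0,      # counts/s in unknown
                          mass_sample=0.500,      # g of unknown
                          rate_standard=400.0,    # counts/s in standard
                          mass_element_standard=0.0010)  # g of B in standard
print(f"{frac * 1e6:.0f} ppm")  # mass fraction in ppm
```

Real analyses additionally correct for neutron flux variations, counting geometry, and gamma attenuation, which this sketch omits.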

  3. Application of thermal analysis techniques in activated carbon production

    Science.gov (United States)

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  4. Instrumental Neutron Activation Analysis Technique using Subsecond Radionuclides

    DEFF Research Database (Denmark)

    Nielsen, H.K.; Schmidt, J.O.

    1987-01-01

    The fast irradiation facility Mach-1 installed at the Danish DR 3 reactor has been used in boron determinations by means of Instrumental Neutron Activation Analysis using 12B with 20-ms half-life. The performance characteristics of the system are presented and boron determinations of NBS standard...

  5. Chemical weapons detection by fast neutron activation analysis techniques

    Science.gov (United States)

    Bach, P.; Ma, J. L.; Froment, D.; Jaureguy, J. C.

    1993-06-01

    A neutron diagnostic experimental apparatus has been tested for nondestructive verification of sealed munitions. Designed to potentially satisfy a significant number of van-mobile requirements, this equipment is based on an easy to use industrial sealed tube neutron generator that interrogates the munitions of interest with 14 MeV neutrons. Gamma ray spectra are detected with a high purity germanium detector, especially shielded from neutrons and gamma ray background. A mobile shell holder has been used. Possible configurations allow the detection, in continuous or in pulsed modes, of gamma rays from neutron inelastic scattering, from thermal neutron capture, and from fast or thermal neutron activation. Tests on full scale sealed munitions with chemical simulants show that those with chlorine (old generation materials) are detectable in a few minutes, and those including phosphorus (new generation materials) in nearly the same time.

  6. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Highlights: • Quantitative image analysis shows potential to monitor activated sludge systems. • Staining techniques increase the potential for detection of operational problems. • Chemometrics combined with quantitative image analysis is valuable for process monitoring. Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and proliferation of specific microorganisms. In fact, bacterial communities and protozoa identification by microscopy inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used throughout the years for the assessment of aggregates and filamentous bacteria properties. These procedures are able to provide an ever growing amount of data for wastewater treatment processes in which chemometric techniques can be a valuable tool. However, the determination of microbial communities’ properties remains a current challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed highlighting the aggregates structure and filamentous bacteria determination by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  7. Figure analysis: A teaching technique to promote visual literacy and active Learning.

    Science.gov (United States)

    Wiles, Amy M

    2016-07-08

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is no technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016.

  8. FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques

    Science.gov (United States)

    Madavarapu, Sateesh Babu

    Studies on aluminosilicate materials as replacements for traditional construction materials such as ordinary Portland cement (OPC), in order to reduce the associated environmental effects, have been an important research area for the past decades. Many properties, such as strength, have already been studied, and the primary focus is now to understand the reaction mechanism and the effect of process parameters on the formed products. The aim of this research was to explore the structural changes and reaction products of geopolymers (slag and fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at the molecular level, but not all methods are economical and simple. To understand the mechanisms of alkali-activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR has been used, and the effect of the process parameters on the reaction products has been analyzed. To analyze complex systems like geopolymers using FTIR, deconvolution techniques help to obtain the properties of a particular peak attributed to a certain molecular vibration. Time- and temperature-dependent analyses were performed on slag pastes to follow the polymerization of reactive silica in the system as time and temperature vary. For the time-dependent analysis, slag was activated with sodium and potassium silicates using two different 'n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature-dependent analysis was done by curing the samples at 60°C and 80°C. Similarly, fly ash was studied by activating it with alkali hydroxides and alkali silicates. Under the same curing conditions, the fly ash samples were evaluated to analyze the effects of added silicates on alkali activation. The peak shifts in the FTIR spectra reflect changes in the structural nature of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentrations of silicate monomer in the
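
To illustrate the deconvolution idea, resolving a broad FTIR envelope into individual Gaussian bands and comparing their areas, here is a minimal sketch with hypothetical band parameters (positions, widths, and amplitudes are invented, not fitted slag or fly ash data):

```python
import math

# A deconvoluted FTIR envelope is modeled as a sum of Gaussian bands.
# Once band parameters are fitted, each band's integrated area
# (amplitude * sigma * sqrt(2*pi)) gives the relative contribution of
# the corresponding molecular vibration.

def gaussian(x, amplitude, center, sigma):
    return amplitude * math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def band_area(amplitude, sigma):
    return amplitude * sigma * math.sqrt(2 * math.pi)

# Hypothetical bands in the Si-O-T stretching region (wavenumbers in cm-1)
bands = [
    {"amplitude": 0.80, "center": 950.0, "sigma": 35.0},
    {"amplitude": 0.45, "center": 1050.0, "sigma": 50.0},
]

def envelope(x):
    """Reconstructed spectrum at wavenumber x: the sum of all bands."""
    return sum(gaussian(x, b["amplitude"], b["center"], b["sigma"]) for b in bands)

areas = [band_area(b["amplitude"], b["sigma"]) for b in bands]
fractions = [a / sum(areas) for a in areas]
print(f"envelope at 1000 cm-1: {envelope(1000.0):.3f}")
print([round(f, 3) for f in fractions])  # relative area of each band
```

In an actual analysis the band parameters come from a least-squares fit of the measured spectrum; this sketch only shows the area bookkeeping once the fit is done.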

  9. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis; namely Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  10. Figure Analysis: A Teaching Technique to Promote Visual Literacy and Active Learning

    Science.gov (United States)

    Wiles, Amy M.

    2016-01-01

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based…

  11. Investigation of anti-wear performance of automobile lubricants using thin layer activation analysis technique

    Energy Technology Data Exchange (ETDEWEB)

    Biswal, Jayashree [Isotope and Radiation Application Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India); Thakre, G.D. [Tribology and Combustion Division, Indian Institute of Petroleum, Dehradun 248005, Uttarakhand (India); Pant, H.J., E-mail: hjpant@barc.gov.in [Isotope and Radiation Application Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India); Samantray, J.S. [Isotope and Radiation Application Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India); Arya, P.K. [Tribology and Combustion Division, Indian Institute of Petroleum, Dehradun 248005, Uttarakhand (India); Sharma, S.C.; Gupta, A.K. [Nuclear Physics Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India)

    2017-05-15

    An investigation was carried out to examine the anti-wear behavior of automobile lubricants using the thin layer activation analysis technique. For this study, disc gears made of EN 31 steel were labeled with a small amount of radioactivity by irradiation with a 13 MeV proton beam from a particle accelerator. Wear rate measurements of the gear were carried out by mounting the irradiated disc gear on a twin-disc tribometer under lubricated conditions. The activity loss was monitored using a NaI(Tl) scintillation detector integrated with a multichannel analyzer. The relative remnant activity was correlated with thickness loss by generating a calibration curve. The wear measurements were carried out for four different types of lubricants, designated L1, L2, L3 and L4. At lower loads, L1 and L4 were found to exhibit better anti-wear properties than L2 and L3, while L4 exhibited the best anti-wear performance of the four lubricants at all loads and speeds investigated.
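
The calibration step described above, converting relative remnant activity into thickness loss, can be sketched as a simple interpolation (the calibration points below are hypothetical; real curves come from the measured depth profile of the proton-induced activity):

```python
# In thin layer activation, a calibration curve maps the remnant-activity
# fraction measured by the detector to the thickness of material worn away.
# Hypothetical calibration points: fraction of initial activity remaining
# versus worn depth in micrometres.

depth_um = [0.0, 5.0, 10.0, 20.0, 40.0]    # worn depth
remnant  = [1.0, 0.93, 0.85, 0.68, 0.40]   # remaining activity fraction

def wear_depth(fraction):
    """Linearly interpolate worn depth from the measured remnant fraction."""
    if not remnant[-1] <= fraction <= remnant[0]:
        raise ValueError("fraction outside calibrated range")
    # remnant is monotonically decreasing; walk the segments
    for i in range(len(remnant) - 1):
        hi, lo = remnant[i], remnant[i + 1]
        if lo <= fraction <= hi:
            t = (hi - fraction) / (hi - lo)
            return depth_um[i] + t * (depth_um[i + 1] - depth_um[i])

print(wear_depth(0.85))   # exactly on a calibration point
print(wear_depth(0.765))  # midway between two calibration points
```

Monitoring the remnant fraction over time and pushing each reading through such a curve yields the wear-versus-time behavior compared across the four lubricants.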

  12. Activity analysis: measurement of the effectiveness of surgical training and operative technique.

    Science.gov (United States)

    Shepherd, J P; Brickley, M

    1992-11-01

    All surgical procedures are characterised by a sequence of steps and instrument changes. Although surgical efficiency and training in operative technique closely relate to this process, few studies have attempted to analyse it quantitatively. Because efficiency is particularly important in day surgery, and because lower third molar removal is a high-volume procedure whose demand is responsible for particularly long waiting lists in almost all UK health regions, this operation was selected for evaluation. A series of 80 consecutive procedures, carried out for 43 day-stay patients under general anaesthesia by seven junior staff (senior house officers and registrars: 39 procedures) and four senior staff (senior registrars and consultants: 41 procedures), was analysed. Median operating time for procedures which required retraction of periosteum was 9.5 min (range 2.7-23.3 min). Where these steps were necessary, median time for incision was 25 s (range 10-90 s); for retraction of periosteum, 79 s (range 5-340 s); for bone removal, 118 s (range 10-380 s); for tooth excision, 131 s (range 10-900 s); for debridement, 74 s (range 5-270 s); and for suture, 144 s (range 25-320 s). Junior surgeons could be differentiated from senior surgeons on the basis of omission, repetition and duration of these steps. Juniors omitted retraction of periosteum in 10% of procedures (seniors 23%) and suture in 13% (seniors 32%). Juniors repeated steps in 47% of operations; seniors, in 14%. Junior surgeons took significantly more time than senior surgeons for incision, bone removal and tooth excision. No significant differences between junior and senior surgeons were found in relation to the incidence of altered lingual and labial sensation at 7 days. It was concluded that activity analysis may be a useful measure of the effectiveness of surgical training and the efficiency of operative technique.
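
The step-timing summary used in this study, medians and ranges of per-step durations, is straightforward to reproduce; a minimal sketch with invented durations:

```python
from statistics import median

# Hypothetical per-procedure durations (seconds) for two operative steps;
# these are invented values, not the study's data.
step_times = {
    "incision": [25, 18, 40, 30, 22],
    "suture":   [140, 160, 120, 200, 150],
}

for step, times in step_times.items():
    print(f"{step}: median {median(times)} s, "
          f"range {min(times)}-{max(times)} s")
```

The study's junior-versus-senior comparison would then apply the same summary to each surgeon group separately, plus counts of omitted and repeated steps.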

  13. Behavior Change Techniques Implemented in Electronic Lifestyle Activity Monitors: A Systematic Content Analysis

    Science.gov (United States)

    Lewis, Zakkoyya H; Mayrsohn, Brian G; Rowland, Jennifer L

    2014-01-01

    Background Electronic activity monitors (such as those manufactured by Fitbit, Jawbone, and Nike) improve on standard pedometers by providing automated feedback and interactive behavior change tools via mobile device or personal computer. These monitors are commercially popular and show promise for use in public health interventions. However, little is known about the content of their feedback applications and how individual monitors may differ from one another. Objective The purpose of this study was to describe the behavior change techniques implemented in commercially available electronic activity monitors. Methods Electronic activity monitors (N=13) were systematically identified and tested by 3 trained coders for at least 1 week each. All monitors measured lifestyle physical activity and provided feedback via an app (computer or mobile). Coding was based on a hierarchical list of 93 behavior change techniques. Further coding of potentially effective techniques and adherence to theory-based recommendations were based on findings from meta-analyses and meta-regressions in the research literature. Results All monitors provided tools for self-monitoring, feedback, and environmental change by definition. The next most prevalent techniques (13 out of 13 monitors) were goal-setting and emphasizing discrepancy between current and goal behavior. Review of behavioral goals, social support, social comparison, prompts/cues, rewards, and a focus on past success were found in more than half of the systems. The monitors included a range of 5-10 of 14 total techniques identified from the research literature as potentially effective. Most of the monitors included goal-setting, self-monitoring, and feedback content that closely matched recommendations from social cognitive theory. Conclusions Electronic activity monitors contain a wide range of behavior change techniques typically used in clinical behavioral interventions. Thus, the monitors may represent a medium by which
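
The content-analysis step, coding which behavior change techniques each monitor implements and tallying their prevalence, reduces to simple set operations; a sketch with invented coding results (the monitor names and technique labels below are hypothetical):

```python
from collections import Counter

# Hypothetical coding results: which behavior change techniques each
# monitor implements (labels loosely follow a published taxonomy;
# the data are invented, not the study's findings).
coded = {
    "monitor_a": {"self-monitoring", "feedback", "goal-setting", "rewards"},
    "monitor_b": {"self-monitoring", "feedback", "goal-setting"},
    "monitor_c": {"self-monitoring", "feedback", "social comparison"},
}

prevalence = Counter(t for techniques in coded.values() for t in techniques)
for technique, n in prevalence.most_common():
    print(f"{technique}: {n}/{len(coded)} monitors")
```

Sorting by prevalence is what produces statements like "goal-setting was found in all 13 monitors" in the abstract above.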

  14. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme The situation analysis, as a separate component of the strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand and also on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components, the internal environment that is made from specific variables within the organization and the external environment that is made from variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such

  15. Fluorous-assisted metal chelate affinity extraction technique for analysis of protein kinase activity.

    Science.gov (United States)

    Hayama, Tadashi; Kiyokawa, Ena; Yoshida, Hideyuki; Imakyure, Osamu; Yamaguchi, Masatoshi; Nohta, Hitoshi

    2016-08-15

    We have developed a fluorous affinity-based extraction method for measurement of protein kinase activity. In this method, a fluorescent peptide substrate was phosphorylated by a protein kinase, and the obtained phosphopeptide was selectively captured with Fe(III)-immobilized perfluoroalkyliminodiacetic acid reagent via a metal chelate affinity technique. Next, the captured phosphopeptide was selectively extracted into a fluorous solvent mixture, tetradecafluorohexane and 1H,1H,2H,2H-tridecafluoro-1-n-octanol (3:1, v/v), using the specificity of fluorous affinity (fluorophilicity). In contrast, the remained substrate peptide in the aqueous (non-fluorous) phase was easily measured fluorimetrically. Finally, the enzyme activity could be assayed by measuring the decrease in fluorescence. The feasibility of this method was demonstrated by applying the method for measurement of the activity of cAMP-dependent protein kinase (PKA) using its substrate peptide (kemptide) pre-labeled with carboxytetramethylrhodamine (TAMRA).

  16. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    National Research Council Canada - National Science Library

    Caescu Stefan Claudiu; Popescu Andrei; Ploesteanu Mara Gabriela

    2011-01-01

    .... Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization...

  17. Normal coordinate analysis and fungicidal activity study on anilazine and its related compound using spectroscopic techniques

    Science.gov (United States)

    Sheeja Mol, Gilbert Pushpam; Arul Dhas, Deva Dhas; Hubert Joe, Isaac; Balachandran, Sreedharan

    2016-06-01

    The FTIR and FT-Raman spectra of anilazine have been recorded in the ranges 400-4000 cm-1 and 50-3500 cm-1, respectively. The optimized geometrical parameters of the compound were calculated using the B3LYP method with the 6-311G(d,p) basis set. The assignment of the vibrational bands was carried out with the help of normal coordinate analysis (NCA). The 1H and 13C NMR spectra have been recorded, and the chemical shifts of the molecule were calculated using the gauge-independent atomic orbital (GIAO) method. The UV-Visible spectrum of the compound was recorded in the region 190-900 nm, and the electronic properties were determined by the time-dependent DFT (TD-DFT) approach. Anilazine was screened for its antifungal activity. Molecular docking studies were conducted to predict its fungicidal activity.

  18. A Comparative Analysis between Active and Passive Techniques for Underwater 3D Reconstruction of Close-Range Objects

    Directory of Open Access Journals (Sweden)

    Maurizio Muzzupappa

    2013-08-01

    In some application fields, such as underwater archaeology or marine biology, there is a need to collect three-dimensional, close-range data from objects that cannot be removed from their site. In particular, 3D imaging techniques are widely employed for close-range acquisitions in underwater environments. In this work we have compared, in water, two whole-field 3D imaging techniques based on active and passive approaches, respectively. The comparison was performed under poor visibility conditions, produced in the laboratory by suspending different quantities of clay in a water tank. For a fair comparison, a stereo configuration was adopted for both techniques, using the same setup, working distance, calibration, and objects. At the moment, the proposed setup is not suitable for real-world applications, but it allowed us to conduct a preliminary analysis of the performance of the two techniques and to understand their capability to acquire 3D points in the presence of turbidity. Performance was evaluated in terms of accuracy and density of the acquired 3D points. Our results can be used as a reference for further comparisons in the analysis of other 3D techniques and algorithms.

  19. Cost Analysis of MRI Services in Iran: An Application of Activity Based Costing Technique

    Directory of Open Access Journals (Sweden)

    Bayati

    2015-09-01

    Background Considerable development of MRI technology in diagnostic imaging, the high cost of MRI technology, and controversial issues concerning official charges (tariffs) have been the main motivations to define and implement this study. Objectives The present study aimed to calculate the unit cost of MRI services using activity-based costing (ABC) as a modern cost accounting system and to fairly compare the calculated unit costs with official charges (tariffs). Materials and Methods We included both direct and indirect costs of MRI services delivered in fiscal year 2011 in Shiraz Shahid Faghihi hospital. The direct allocation method was used for distribution of overhead costs. We used a micro-costing approach to calculate the unit cost of all different MRI services. Clinical cost data were retrieved from the hospital registration system. The straight-line method was used for depreciation cost estimation. To cope with uncertainty and to increase the robustness of the study results, unit costs of 33 MRI services were calculated under two scenarios. Results The total annual cost of the MRI activity center (AC) was calculated at USD 400,746 and USD 532,104 based on the first and second scenarios, respectively. Ten percent of the total cost was allocated from supportive departments. The annual variable costs of the MRI center were calculated at USD 295,904. Capital costs were measured at USD 104,842 and USD 236,200 for the first and second scenarios, respectively. Existing tariffs for more than half of MRI services were above the calculated costs. Conclusion As a public hospital, there are considerable limitations in both the financial and administrative databases of Shahid Faghihi hospital. Labor cost has the greatest share of the total annual cost of Shahid Faghihi hospital. The gap between unit costs and tariffs implies that the claim for extra budget from health providers may not be relevant for all services delivered by the studied MRI center. With some adjustments, ABC could be

  20. Cost Analysis of MRI Services in Iran: An Application of Activity Based Costing Technique.

    Science.gov (United States)

    Bayati, Mohsen; Mahboub Ahari, Alireza; Badakhshan, Abbas; Gholipour, Mahin; Joulaei, Hassan

    2015-10-01

    Considerable development of MRI technology in diagnostic imaging, the high cost of MRI technology, and controversial issues concerning official charges (tariffs) have been the main motivations to define and implement this study. The present study aimed to calculate the unit cost of MRI services using activity-based costing (ABC) as a modern cost accounting system and to fairly compare the calculated unit costs with official charges (tariffs). We included both direct and indirect costs of MRI services delivered in fiscal year 2011 in Shiraz Shahid Faghihi hospital. The direct allocation method was used for distribution of overhead costs. We used a micro-costing approach to calculate the unit cost of all different MRI services. Clinical cost data were retrieved from the hospital registration system. The straight-line method was used for depreciation cost estimation. To cope with uncertainty and to increase the robustness of the study results, unit costs of 33 MRI services were calculated under two scenarios. The total annual cost of the MRI activity center (AC) was calculated at USD 400,746 and USD 532,104 based on the first and second scenarios, respectively. Ten percent of the total cost was allocated from supportive departments. The annual variable costs of the MRI center were calculated at USD 295,904. Capital costs were measured at USD 104,842 and USD 236,200 for the first and second scenarios, respectively. Existing tariffs for more than half of MRI services were above the calculated costs. As a public hospital, there are considerable limitations in both the financial and administrative databases of Shahid Faghihi hospital. Labor cost has the greatest share of the total annual cost of Shahid Faghihi hospital. The gap between unit costs and tariffs implies that the claim for extra budget from health providers may not be relevant for all services delivered by the studied MRI center. With some adjustments, ABC could be implemented in MRI centers. With the settlement of a reliable cost accounting system
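
The activity-based costing arithmetic, allocating overhead to the MRI activity center and spreading its total cost over services by an activity driver, can be sketched as follows (all figures and driver values are hypothetical, not the study's data):

```python
# Activity-based costing sketch for an imaging activity center (AC).
# Overhead from supportive departments is allocated directly to the AC,
# then the AC's total cost is spread over services in proportion to the
# activity driver (here: machine-minutes consumed per service).

direct_costs = {"labor": 150_000.0, "materials": 40_000.0,
                "depreciation": 90_000.0}     # USD / year
allocated_overhead = 30_000.0                 # USD / year from support depts

total_cost = sum(direct_costs.values()) + allocated_overhead

# Hypothetical driver data per MRI service type:
# (machine-minutes per scan, scans per year)
services = {
    "brain": (20, 3000),
    "spine": (30, 1500),
    "knee":  (15, 1000),
}

total_minutes = sum(m * n for m, n in services.values())
cost_per_minute = total_cost / total_minutes

unit_costs = {name: m * cost_per_minute for name, (m, n) in services.items()}
for name, c in unit_costs.items():
    print(f"{name}: {c:.2f} USD per scan")
```

Comparing each unit cost against the official tariff for that service is then a one-line subtraction, which is essentially the tariff-gap analysis the study reports.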

  1. Cold neutron prompt gamma activation analysis, a non-destructive technique for hydrogen level assessment in zirconium alloys

    Science.gov (United States)

    Couet, Adrien; Motta, Arthur T.; Comstock, Robert J.; Paul, Rick L.

    2012-06-01

    We propose a novel use of a non-destructive technique to quantitatively assess hydrogen concentration in zirconium alloys. The technique, called Cold Neutron Prompt Gamma Activation Analysis (CNPGAA), is based on measuring prompt gamma rays emitted following the absorption of cold neutrons, and comparing the detection rate of characteristic hydrogen gamma rays to that of gamma rays from matrix atoms. Because the emission is prompt, the measurement has to be performed in close proximity to a neutron source, such as the one at the National Institute of Standards and Technology (NIST) Center for Neutron Research. The determination is shown here to be simple and accurate, matching the results given by the usual destructive techniques such as Vacuum Hot Extraction (VHE), with a precision of ±2 mg kg-1 (or wt ppm). Very low levels of hydrogen (as low as 5 mg kg-1 (wt ppm)) can be detected. Also, it is demonstrated that CNPGAA can be applied sequentially to an individual corrosion coupon during autoclave testing, to measure a gradually increasing hydrogen concentration. Thus, this technique can replace destructive techniques performed on "sister" samples, thereby reducing experimental uncertainties.
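
The ratio principle behind CNPGAA can be illustrated with a short sketch (the sensitivity factor and count rates below are invented; in practice the factor is determined from standards of known hydrogen content):

```python
# CNPGAA-style ratio quantification (illustrative numbers only).
# The hydrogen concentration follows from the ratio of the H 2223 keV
# prompt gamma count rate to a zirconium matrix line, scaled by a
# sensitivity factor k determined from calibration standards:
#
#   c_H [mg/kg] = k * R_H / R_Zr

def hydrogen_content(rate_h, rate_zr, k):
    return k * rate_h / rate_zr

k = 5000.0       # mg/kg per unit count-rate ratio (hypothetical)
rate_h = 0.24    # counts/s, H 2223 keV line (hypothetical)
rate_zr = 60.0   # counts/s, Zr matrix line (hypothetical)
print(f"{hydrogen_content(rate_h, rate_zr, k):.0f} mg/kg")
```

Ratioing against the matrix line rather than using the hydrogen rate alone cancels fluctuations in neutron flux and sample positioning, which is what makes the sequential in-autoclave measurements described above practical.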

  2. Techniques for active passivation

    Energy Technology Data Exchange (ETDEWEB)

    Roscioli, Joseph R.; Herndon, Scott C.; Nelson, Jr., David D.

    2016-12-20

    In one embodiment, active (continuous or intermittent) passivation may be employed to prevent interaction of sticky molecules with interfaces inside of an instrument (e.g., an infrared absorption spectrometer) and thereby improve response time. A passivation species may be continuously or intermittently applied to an inlet of the instrument while a sample gas stream is being applied. The passivation species may have a highly polar functional group that strongly binds to either water or polar groups of the interfaces, and once bound presents a non-polar group to the gas phase in order to prevent further binding of polar molecules. The instrument may be actively used to detect the sticky molecules while the passivation species is being applied.

  4. Activity analysis: measurement of the effectiveness of surgical training and operative technique.

    OpenAIRE

    Shepherd, J P; Brickley, M.

    1992-01-01

    All surgical procedures are characterised by a sequence of steps and instrument changes. Although surgical efficiency and training in operative technique closely relate to this process, few studies have attempted to analyse it quantitatively. Because efficiency is particularly important in day surgery and lower third molar removal is a high-volume procedure, the need for which is responsible for particularly long waiting-lists in almost all UK health regions, this operation was selected for e...

  5. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I. [Sandia National Laboratories, Albuquerque, New Mexico 87185-1086 (United States)

    2016-01-14

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  7. Thermal signature analysis of human face during jogging activity using infrared thermography technique

    Science.gov (United States)

    Budiarti, Putria W.; Kusumawardhani, Apriani; Setijono, Heru

    2016-11-01

    Thermal imaging has been widely used for many applications. A thermal camera measures the temperature of an object above absolute zero (0 K) from the infrared radiation the object emits; the thermal image is a false-colour map representing temperature. The human body is one such emitter, and its infrared radiation varies with the activity being performed. Jogging is among the most common physical activities, so this experiment investigates the thermal signature profile of the human body, especially the face, during jogging. The results show a significant increase of 7.5% in the periorbital area, near the eyes and forehead. The graphical temperature distributions show that for all regions (eyes, nose, cheeks, and chin) the pixel area at 28.5 - 30.2°C remains roughly constant, since this is the surrounding temperature. The pixel area at 30.2 - 34.7°C tends to increase, while that at 34.7 - 37.1°C tends to decrease, because pixels at 34.7 - 37.1°C shift into the 30.2 - 34.7°C band after jogging, so the latter's area grows. The trendline over the 10-minute jogging period also shows increasing temperature. Results vary from person to person owing to individual physiology, such as sweat production during physical activity.
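
The band-by-band pixel-area analysis described above is essentially a histogram over temperature intervals. A sketch using the band edges from the abstract; the four-pixel "image" is a made-up illustration, not real data:

```python
def pixel_area_by_band(temps, edges=(28.5, 30.2, 34.7, 37.1)):
    """Count pixels per temperature band (°C), mirroring the three bands
    analysed in the abstract: ambient, mid, and hot."""
    counts = [0] * (len(edges) - 1)
    for t in temps:
        for i in range(len(edges) - 1):
            if edges[i] <= t < edges[i + 1]:
                counts[i] += 1
                break
    return counts

# Four made-up pixel temperatures standing in for a thermal image
print(pixel_area_by_band([29.0, 31.0, 35.0, 36.0]))  # [1, 1, 2]
```

Comparing the three counts before and after jogging reproduces the shift the abstract describes: hot pixels cooling into the mid band.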

  8. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritising the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO), and Maintenance Cost (MC). The building was divided into nine systems, regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed in all building systems against each criterion. The defect severity score of each building system was identified and then multiplied by the weight of the criteria, from which the final hierarchy was derived. The final ranking indicates that the electrical system is the most critical, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used to prioritise mechanical equipment for maintenance planning; however, the results of this study indicate that it can also be used to prioritise building systems for maintenance planning.
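
The scoring step described above (per-criterion severity score multiplied by the criterion weight, summed per system, then ranked) can be sketched as follows. The weights and severity values here are hypothetical stand-ins, not figures from the paper:

```python
def rank_systems(weights, severity):
    """Total risk per building system: sum over criteria of
    (criterion weight x defect severity score), ranked descending."""
    totals = {name: sum(weights[c] * scores[c] for c in weights)
              for name, scores in severity.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical AHP-style weights for the four criteria (PC, EA, EO, MC)
weights = {"PC": 0.4, "EA": 0.3, "EO": 0.2, "MC": 0.1}
# Hypothetical severity scores for two of the nine systems
severity = {
    "electrical": {"PC": 0.2, "EA": 0.1, "EO": 0.1, "MC": 0.1},
    "ceiling": {"PC": 0.1, "EA": 0.05, "EO": 0.05, "MC": 0.05},
}
print(rank_systems(weights, severity)[0][0])  # electrical
```

In the paper itself the weights come from pairwise comparisons in Expert Choice; here they are simply assumed.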

  9. An analysis in vivo of intracanal bacterial load before and after chemo-mechanical preparation: A comparative analysis of two irrigants and two activation techniques

    Science.gov (United States)

    Rico-Romano, Cristina; Zubizarreta-Macho, Álvaro; Baquero-Artigao, María-Rosario

    2016-01-01

    Background The goals of this randomized double-blind trial were to assess the in vivo antimicrobial activity of sodium hypochlorite (NaOCl) vs. chlorhexidine gluconate (CHX), each used in combination with either EndoActivator® or IRRI S® files, in patients with apical periodontitis. Material and Methods A total of 120 patients with apical periodontitis (in single or multiple root canals) were randomly assigned to the four irrigation protocols outlined below: Group A: 5.25% sodium hypochlorite (NaOCl) + EndoActivator®; Group B: 5.25% NaOCl + IRRI S® files; Group C: 2% chlorhexidine gluconate (CHX) + EndoActivator®; Group D: 2% CHX + IRRI S® files. Paper points were used to collect microbiological samples before (1A samples) and after (1B samples) irrigation. Viable colony-forming units (CFU) were quantified twice: (1) without speciation, and (2) only for Enterococcus faecalis (EF). Statistical analysis was performed using SPSS 22.0 for Windows. Results No significant differences were observed between NaOCl and CHX in the reduction of CFU; reduction was comparable with both irrigants. Conversely, statistically significant differences were found between the two activation techniques (sonic and ultrasonic) in the reduction of Enterococcus faecalis (EF): the effectiveness of ultrasonic activation was significantly higher. When comparing the combinations of irrigants with the two activation techniques (groups A, B, C and D), significant differences were observed between groups A and B (p=0.025) in the reduction of EF populations, reaching up to 94%. Conclusions NaOCl and CHX are effective in reducing intracanal bacterial load. Ultrasonic activation is the most effective activation technique in reducing EF populations. Key words:Chlorhexidine gluconate, sodium hypochlorite, ultrasonic irrigation, sonic irrigation, apical periodontitis, Enterococcus faecalis. PMID:26855714

  10. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM currently is a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  11. Improved techniques in data analysis and interpretation of potential fields: examples of application in volcanic and seismically active areas

    Directory of Open Access Journals (Sweden)

    G. Florio

    2002-06-01

    Full Text Available Geopotential data may be interpreted by many different techniques, depending on the nature of the mathematical equations correlating specific unknown ground parameters to the measured data set. Investigation based on the study of the gravity and magnetic anomaly fields represents one of the most important geophysical approaches in the earth sciences. It continues to evolve, aimed both at improving known methods and at testing other new and reliable techniques. This paper outlines a general framework for several applications of recent techniques in the study of potential-field methods for the earth sciences. Most of them are described here, and significant case histories are shown to illustrate their reliability in seismically and volcanically active areas.

  12. Application of synchrotron-radiation-based x-ray microprobe techniques for the analysis of recombination activity of metals precipitated at Si/SiGe misfit dislocations

    Energy Technology Data Exchange (ETDEWEB)

    Vyvenko, O F [University of California, LBNL, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Buonassisi, T [University of California, LBNL, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Istratov, A A [University of California, LBNL, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Weber, E R [University of California, LBNL, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Kittler, M [IHP, Im Technologiepark 25, D-15236 Frankfurt (Oder) (Germany); Seifert, W [IHP, Im Technologiepark 25, D-15236 Frankfurt (Oder) (Germany)

    2002-12-09

    In this study we report application of synchrotron-radiation-based x-ray microprobe techniques (the x-ray-beam-induced current (XBIC) and x-ray fluorescence ({mu}-XRF) methods) to the analysis of the recombination activity and space distribution of copper and iron in the vicinity of dislocations in silicon/silicon-germanium structures. A combination of these two techniques enables one to study the chemical nature of the defects and impurities and their recombination activity in situ and to map metal clusters with a micron-scale resolution. XRF analysis revealed that copper formed clearly distinguishable precipitates along the misfit dislocations. A proportional dependence between the XBIC contrast and the number of copper atoms in the precipitates was established. In hydrogen-passivated iron-contaminated samples we observed clusters of iron precipitates which had no recombination activity detectable by the XBIC technique as well as iron clusters which were not completely passivated.

  13. Two non-destructive neutron inspection techniques: prompt gamma-ray activation analysis and cold neutron tomography

    OpenAIRE

    Baechler, Sébastien; Dousse, Jean-Claude; Jolie, Jan

    2005-01-01

    Two non-destructive inspection techniques using cold neutron beams have been developed at the SINQ neutron source of the Paul Scherrer Institute: (1) prompt gamma-ray activation analysis (PGAA) and (2) neutron tomography. PGA (Prompt Gamma-ray Activation) analysis is a nuclear method for determining the concentration of elements present in a sample. The technique consists of detecting the prompt gamma rays emitted by the sample...

  14. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  15. Kinetic activation-relaxation technique

    Science.gov (United States)

    Béland, Laurent Karim; Brommer, Peter; El-Mellouhi, Fedwa; Joly, Jean-François; Mousseau, Normand

    2011-10-01

    We present a detailed description of the kinetic activation-relaxation technique (k-ART), an off-lattice, self-learning kinetic Monte Carlo (KMC) algorithm with on-the-fly event search. Combining a topological classification for local environments and event generation with ART nouveau, an efficient unbiased sampling method for finding transition states, k-ART can be applied to complex materials with atoms in off-lattice positions or with elastic deformations that cannot be handled with standard KMC approaches. In addition to presenting the various elements of the algorithm, we demonstrate the general character of k-ART by applying the algorithm to three challenging systems: self-defect annihilation in c-Si (crystalline silicon), self-interstitial diffusion in Fe, and structural relaxation in a-Si (amorphous silicon).
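
k-ART builds on the standard rejection-free KMC loop: choose an event with probability proportional to its rate, then advance the clock by an exponentially distributed residence time. A minimal sketch of that core step only; the topological classification and ART nouveau event search that make k-ART off-lattice and self-learning are well beyond this snippet:

```python
import math

def kmc_step(rates, rng):
    """One rejection-free KMC step: pick event i with probability
    rate_i / total, then draw the residence time from Exp(total)."""
    total = sum(rates)
    r = rng() * total
    acc = 0.0
    chosen = len(rates) - 1
    for i, k in enumerate(rates):
        acc += k
        if r <= acc:
            chosen = i
            break
    dt = -math.log(rng()) / total
    return chosen, dt

# Deterministic "random" draws so the outcome is reproducible
draws = iter([0.5, 0.5])
event, dt = kmc_step([1.0, 3.0], rng=lambda: next(draws))
print(event)  # 1
```

With a real generator (e.g. `random.random`), repeating this step yields a trajectory whose event statistics and time scale follow the supplied rate catalog; k-ART's contribution is building and updating that catalog on the fly.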

  17. Study of some Ayurvedic Indian medicinal plants for the essential trace elemental contents by instrumental neutron activation analysis and atomic absorption spectroscopy techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lokhande, R.S.; Singare, P.U.; Andhele, M.L. [Dept. of Chemistry, Univ. of Mumbai, Santacruz, Mumbai (India); Acharya, R.; Nair, A.G.C.; Reddy, A.V.R. [Radiochemistry Div., Bhabha Atomic Research Centre, Trombay, Mumbai (India)

    2009-07-01

    Elemental analysis of some medicinal plants used in the Indian Ayurvedic system was performed by employing the instrumental neutron activation analysis (INAA) and atomic absorption spectroscopy (AAS) techniques. The samples were irradiated with thermal neutrons in a nuclear reactor, and the induced activity was counted by gamma-ray spectrometry using an efficiency-calibrated, high-resolution, high-purity germanium (HPGe) detector. Most of the medicinal plants were found to be rich in one or more of the elements under study. The variation in elemental concentration in the same medicinal plant samples collected in the summer, winter, and rainy seasons was studied, and the biological effects of these elements on human beings are discussed. (orig.)
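
A common quantification route for INAA (not necessarily the exact procedure of this paper) is the relative, or comparator, method: with identical irradiation and counting conditions, concentration is proportional to specific activity (gamma peak area per unit mass). A sketch with invented peak areas, masses, and standard concentration:

```python
def concentration_by_comparator(area_sample, mass_sample,
                                area_standard, mass_standard,
                                conc_standard):
    """Relative (comparator) activation analysis: equal irradiation and
    counting conditions make the element concentration proportional to
    the specific activity (peak area per unit mass)."""
    specific_sample = area_sample / mass_sample
    specific_standard = area_standard / mass_standard
    return conc_standard * specific_sample / specific_standard

# Invented peak areas (counts), masses (g), and a 10 ug/g standard
print(concentration_by_comparator(5000, 0.5, 8000, 0.4, 10.0))  # 5.0
```

The efficiency calibration mentioned in the abstract matters when sample and standard geometries differ; the simple ratio above assumes they match.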

  18. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; the volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray: a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the
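
A complete acid-base titration, as demonstrated on the tape, ends in a simple stoichiometric calculation. A sketch using standardisation of NaOH against potassium hydrogen phthalate (KHP) as a worked example; the numbers are illustrative, not from the video:

```python
def titrant_molarity(mass_std_g, molar_mass, volume_ml, stoich=1):
    """Standardise a titrant against a weighed primary standard:
    molarity = moles of standard / (stoichiometric ratio x litres of titrant)."""
    moles = mass_std_g / molar_mass
    return moles / (stoich * volume_ml / 1000.0)

# Illustrative run: 0.5105 g KHP (204.22 g/mol) takes 25.00 mL of NaOH
print(f"{titrant_molarity(0.5105, 204.22, 25.00):.4f}")  # 0.1000
```

The calculation is only as good as the classical steps feeding it: a weighing error or a mis-read meniscus propagates straight into the reported molarity, which is the tape's central point.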

  19. Digital Fourier analysis advanced techniques

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to advanced digital Fourier analysis for advanced undergraduate and graduate students. Assuming knowledge of the Fast Fourier Transform, this book covers advanced topics including the Hilbert transform, cepstrum analysis, and the two-dimensional Fourier transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Advanced Techniques" includes practice problems and thorough Appendices. As a central feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. The applet source code in Visual Basic is provided online, enabling advanced students to tweak and change the programs for more sophisticated results. A complete, intuitive guide, "Digital Fourier Analysis - Advanced Techniques" is an essential reference for students in science and engineering.
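
The book assumes knowledge of the Fast Fourier Transform; the quantity the FFT computes is the plain discrete Fourier transform, which for short signals can be written directly. A naive O(N²) sketch, useful for cross-checking FFT results on toy inputs:

```python
import cmath
import math

def dft(x):
    """Naive O(N^2) discrete Fourier transform; the FFT computes the
    same coefficients faster, so this is handy for verification."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

# A cosine with 3 cycles per frame concentrates at bins 3 and N-3
n = 32
signal = [math.cos(2 * math.pi * 3 * t / n) for t in range(n)]
spectrum = [abs(c) for c in dft(signal)]
print(round(spectrum[3], 1))  # 16.0  (= N/2 for a unit cosine)
```

The advanced topics the book covers (Hilbert transform, cepstrum) are built from exactly these coefficients plus pointwise operations and an inverse transform.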

  20. Triangulation of Data Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Lauri, M

    2011-10-01

    Full Text Available In psychology, as in other disciplines, the concepts of validity and reliability are considered essential to an accurate interpretation of results. While in quantitative research the idea is well established, in qualitative research validity and reliability take on a different dimension. Researchers like Miles and Huberman (1994) and Silverman (2000, 2001) have shown how these issues are addressed in qualitative research. In this paper I propose that the same corpus of data, in this case the transcripts of focus group discussions, can be analysed using more than one data analysis technique. I refer to this idea as 'triangulation of data analysis techniques' and argue that such triangulation increases the reliability of the results. If the results obtained through a particular data analysis technique, for example thematic analysis, are congruent with the results obtained by analysing the same transcripts using a different technique, for example correspondence analysis, it is reasonable to argue that the analysis and interpretation of the data is valid.

  1. Atmospheric deposition of rare earth elements in Albania studied by the moss biomonitoring technique, neutron activation analysis and GIS technology.

    Science.gov (United States)

    Allajbeu, Sh; Yushin, N S; Qarri, F; Duliu, O G; Lazo, P; Frontasyeva, M V

    2016-07-01

    Rare earth elements (REEs) are typically conservative elements that are scarcely derived from anthropogenic sources. The mobilisation of REEs in the environment requires the monitoring of these elements in environmental matrices, in which they are present at trace level. Eleven REEs were determined in carpet-forming moss (Hypnum cupressiforme) collected from 44 sampling sites over the whole territory of the country, using epithermal neutron activation analysis (ENAA) at the IBR-2 fast pulsed reactor in Dubna. This paper focuses on the REEs (lanthanides) and Sc; Fe, as a typically conservative element, and Th, which showed good correlations with the lanthanides, are also included. Th, Sc, and the REEs had never previously been determined in the atmospheric deposition of Albania. Descriptive statistics were used for data treatment with the MINITAB 17 software package. The median values of the elements under investigation were compared with those of neighbouring countries (Bulgaria, Macedonia, Romania, and Serbia), as well as Norway, which was selected as a clean area. Geographical distribution maps of the elements over the sampled territory were constructed using geographic information system (GIS) technology. The geochemical behaviour of the REEs in the moss samples was studied using a Sc-La-Th ternary diagram, spider diagrams, and multivariate analysis. It was revealed that the accumulation of REEs in current mosses is associated with wind-blown metal-enriched soil, which is identified as the main emitting factor for the elements under investigation.

  2. Proof-of-principle results for identifying the composition of dust particles and volcanic ash samples through the technique of photon activation analysis at the IAC

    Science.gov (United States)

    Mamtimin, Mayir; Cole, Philip L.; Segebade, Christian

    2013-04-01

    Instrumental analytical methods are preferable in studying sub-milligram quantities of airborne particulates collected in dust filters. The multi-step analytical procedure used in treating samples through chemical separation can be quite complicated. Further, due to the minute masses of the airborne particulates collected on filters, such chemical treatment can easily lead to significant levels of contamination. Radio-analytical techniques, and in particular, activation analysis methods offer a far cleaner alternative. Activation methods require minimal sample preparation and provide sufficient sensitivity for detecting the vast majority of the elements throughout the periodic table. In this paper, we will give a general overview of the technique of photon activation analysis. We will show that by activating dust particles with 10- to 30-MeV bremsstrahlung photons, we can ascertain their elemental composition. The samples are embedded in dust-collection filters and are irradiated "as is" by these photons. The radioactivity of the photonuclear reaction products is measured with appropriate spectrometers and the respective analytes are quantified using multi-component calibration materials. We shall provide specific examples of identifying the elemental components of airborne dust particles and volcanic ash by making use of bremsstrahlung photons from an electron linear accelerator at the Idaho Accelerator Center in Pocatello, Idaho.

  3. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to validate a stress analysis technique based on 3D models by comparing it with the traditional technique, which utilises a model built directly in the stress analysis program. The comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealised model obtained using ANSYS, working directly from documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Each of the three databases will then be used as necessities arise. The main objective is to develop a parameterised model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is included to show the state of the art achieved in this field.

  4. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
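
One "well-known mathematical technique" such a formulation can reduce to is ordinary least squares. A sketch fitting runtime against the reciprocal of thread count (an Amdahl-style model); the data points are invented for illustration and are not from the report:

```python
def fit_line(xs, ys):
    """Ordinary least squares with one predictor: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Invented runtimes (s) vs. thread count; model: t = serial + parallel/threads
threads = [1, 2, 4, 8]
runtimes = [80.0, 42.0, 24.0, 15.0]
parallel_cost, serial_cost = fit_line([1.0 / t for t in threads], runtimes)
print(round(parallel_cost))  # 74
```

Replacing the single predictor with the report's full variable list (compiler flags, pool size, file system load, etc.) turns this into the multivariate regression problem the abstract alludes to.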

  6. Use of the Taguchi method for biomechanical comparison of flexor-tendon-repair techniques to allow immediate active flexion. A new method of analysis and optimization of technique to improve the quality of the repair.

    Science.gov (United States)

    Singer, G; Ebramzadeh, E; Jones, N F; Meals, R

    1998-10-01

    The current trend toward early active flexion after repair of the flexor tendons necessitates a stronger repair than that provided by a modified Kessler technique with use of 4-0 nylon suture. The purpose of the current study was to determine, with use of the Taguchi method of analysis, the strongest and most consistent repair of the flexor tendons. Flexor tendons were obtained from fresh-frozen hands of human cadavera. Eight flexor tendons initially were repaired with the modified Kessler technique with use of 4-0 nylon core suture and 6-0 nylon epitenon suture. A test matrix was used to analyze a total of twenty variables in sixty-four tests. These variables included eight techniques for core-suture repair, four types of core suture, two sizes of core suture, four techniques for suture of the epitenon, and two distances from the repair site for placement of the core suture. After each repair, the specimens were mounted in a servohydraulic mechanical testing machine for tension-testing to failure. The optimum combination of variables was determined, with the Taguchi method, to be an augmented Becker technique with use of 3-0 Mersilene core suture, placed 0.75 centimeter from the cut edge with volar epitenon suture. The four-strand, double modified Kessler technique provided the second strongest repair. Five tendons that had been repaired with use of the optimum combination then were tested and compared with tendons that had been repaired with the standard modified Kessler technique. With the optimum combination of variables, the strength of the repair improved from a mean (and standard deviation) of 17.2 +/- 2.9 to 128 +/- 5.6 newtons, and the stiffness improved from a mean of 4.6 to 16.2 newtons per millimeter.
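
Taguchi analysis typically condenses each factor combination's repeated measurements into a signal-to-noise ratio; for repair strength the "larger-the-better" form applies, rewarding both high mean strength and low scatter. A sketch; the two triples are invented values loosely centred on the 17.2 N and 128 N means reported above:

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better signal-to-noise ratio (dB):
    SN = -10 * log10(mean(1 / y^2)). Higher is better."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in values) / len(values))

# Invented failure loads (N) roughly centred on the reported means
modified_kessler = [17.2, 14.3, 20.1]
augmented_becker = [128.0, 122.4, 133.6]
print(sn_larger_is_better(augmented_becker) > sn_larger_is_better(modified_kessler))  # True
```

Ranking each variable level by its average S/N across the test matrix is what lets the method pick the strongest and most consistent combination from only sixty-four tests.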

  7. Development of Active Correlation Technique

    CERN Document Server

    Tsyganov, Y S

    2015-01-01

    As heavy-ion beams reach extremely high intensities, new requirements for the detection system of the Dubna Gas-Filled Recoil Separator (DGFRS) will definitely be set. One of the challenges is how to apply the active correlations method to suppress beam-associated background products without significant losses in the overall long-term experiment efficiency. Different scenarios and equations for developing the method according to this requirement are considered in the present paper. The execution time used to estimate the dead-time parameter associated with the optimal choice of the life-time parameter is also presented.
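A schematic sketch of the active correlations idea (not the actual DGFRS implementation; the time window and event format are assumptions for illustration): pause the beam whenever a recoil-like event is followed, in the same detector strip and within a life-time window, by an alpha-like event:

```python
CORRELATION_TIME = 10.0  # seconds; hypothetical life-time search window

def beam_off_triggers(events):
    """Scan a time-ordered stream of (time_s, strip, kind) events, where kind is
    'ER' (evaporation-residue candidate) or 'alpha'. Return the times at which
    an ER-alpha pair in the same strip falls within the correlation window,
    i.e. the moments the beam would be paused to search for a decay chain."""
    last_er = {}       # strip -> time of the most recent ER candidate
    triggers = []
    for t, strip, kind in events:
        if kind == "ER":
            last_er[strip] = t
        elif kind == "alpha":
            t0 = last_er.get(strip)
            if t0 is not None and t - t0 <= CORRELATION_TIME:
                triggers.append(t)
    return triggers
```

The trade-off discussed in the paper is visible even in this toy: a longer window catches more genuine chains but pauses the beam more often, costing experiment efficiency.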

  8. Influence of elemental concentration in soil on vegetables applying analytical nuclear techniques: k{sub 0}-instrumental neutron activation analysis and radiometry

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Maria Angela de B.C. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil). Servico de Reator e Irradiacao]. E-mail: menezes@cdtn.br; Mingote, Raquel Maia [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil). Servico de Quimica e Radioquimica; Silva, Lucilene Guerra e; Pedrosa, Lorena Gomes [Minas Gerais Univ., Belo Horizonte, MG (Brazil). Faculdade de Farmacia

    2005-07-01

    Samples from two vegetable gardens were analysed with the aim of determining the elemental concentration. The vegetables selected for study are grown by local people for their own use and are present in the daily meal. One vegetable garden studied is close to a mining activity in a region within the Iron Quadrangle (Quadrilatero Ferrifero), located in the Brazilian state of Minas Gerais. This region is considered one of the richest mineral-bearing regions in the world. The other vegetable garden studied is far from this region and free of any mining activity; it was studied as a comparative site. This assessment was carried out to evaluate the elemental concentration in soil and vegetables, matrices connected with the food chain, applying k{sub 0}-Instrumental Neutron Activation Analysis (k{sub 0}-INAA) at the Laboratory for Neutron Activation Analysis. However, this work reports only the results for thorium, uranium and rare earths obtained in samples collected during the dry season, focusing on the influence of these elements on vegetable elemental composition. Results of natural radioactivity, determined by gross alpha and gross beta measurements, are also reported. This study is related to the BRA 11920 project, entitled 'Iron Quadrangle, Brazil: assessment of health impact caused by mining pollutants through chain food applying nuclear and related techniques', one of the research projects co-ordinated by the IAEA (Vienna, Austria). (author)
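The k0-INAA standardization named above converts measured peak areas into concentrations via a gold comparator. A simplified sketch of the standard relation (Hogdahl convention, with the epithermal shape factor alpha taken as 0; the flux ratio is illustrative and channel-specific):

```python
import math

F_RATIO = 20.0   # f: thermal-to-epithermal flux ratio (illustrative value)
Q0_AU = 15.7     # Q0 of the 197Au(n,gamma) comparator reaction

def sdc_factor(lmbda, t_irr, t_decay, t_count):
    """Saturation (S), decay (D) and counting (C) correction factors."""
    S = 1.0 - math.exp(-lmbda * t_irr)                       # build-up in reactor
    D = math.exp(-lmbda * t_decay)                           # decay before counting
    C = (1.0 - math.exp(-lmbda * t_count)) / (lmbda * t_count)  # decay during counting
    return S * D * C

def specific_count_rate(net_area, t_count, mass_g, lmbda, t_irr, t_decay):
    """A_sp = Np / (t_m * w * S * D * C), counts per second per gram."""
    return net_area / (t_count * mass_g * sdc_factor(lmbda, t_irr, t_decay, t_count))

def k0_concentration(Asp_a, Asp_Au, k0, Q0_a, eff_a, eff_Au, f=F_RATIO):
    """c_a = (Asp_a/Asp_Au) * (1/k0) * (f + Q0_Au)/(f + Q0_a) * (eff_Au/eff_a)."""
    return (Asp_a / Asp_Au) / k0 * (f + Q0_AU) / (f + Q0_a) * (eff_Au / eff_a)
```

This is a sketch of the method's bookkeeping only; a working k0 analysis additionally needs the alpha correction for the epithermal flux shape and evaluated k0 and Q0 values for each analyte.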

  9. Chromatographic finger print analysis of anti-inflammatory active extract fractions of aerial parts of Tribulus terrestris by HPTLC technique

    Institute of Scientific and Technical Information of China (English)

    Mona Salih Mohammed; Mohamed Fahad Alajmi; Perwez Alam; Hassan Subki Khalid; Abelkhalig Muddathir Mahmoud; Wadah Jamal Ahmed

    2014-01-01

    Objective: To develop an HPTLC fingerprint profile of anti-inflammatory active extract fractions of Tribulus terrestris (family Zygophyllaceae). Methods: The anti-inflammatory activity was tested for the methanol extract and its fractions (chloroform, ethyl acetate, n-butanol and aqueous) and the chloroform extract of Tribulus terrestris (aerial parts) by injecting different groups of rats (6 each) with carrageenan in the hind paw and measuring the edema volume before and 1, 2 and 3 h after carrageenan injection. The control group received saline i.p. The extract treatments were injected i.p. at doses of 200 mg/kg 1 h before carrageenan administration. Indomethacin (30 mg/kg) was used as the standard. HPTLC studies were carried out using a CAMAG HPTLC system equipped with a Linomat IV applicator, TLC Scanner 3, Reprostar 3, CAMAG ADC 2 and WIN CATS-4 software for the active fractions of the chloroform fraction of the methanol extract. Results: The methanol extract showed a good antiedematous effect with a percentage of inhibition of more than 72%, indicating its ability to inhibit the inflammatory mediators. The methanol extract was re-dissolved in 100 mL of distilled water and fractionated with chloroform, ethyl acetate and n-butanol. The four fractions (chloroform, ethyl acetate, n-butanol and aqueous) were subjected to anti-inflammatory testing. The chloroform fraction showed good anti-inflammatory activity at a dose of 200 mg/kg. The chloroform fraction was then subjected to normal-phase silica gel column chromatography and eluted with petroleum ether-chloroform and chloroform-ethyl acetate mixtures of increasing polarity, which produced 15 fractions (F1-F15). Only fractions F1, F2, F4, F5, F7, F9, F11 and F14 were found to be active, hence these were analyzed with HPTLC to develop their fingerprint profile. These fractions showed different spots with different Rf values. Conclusions: The different chloroform fractions F1, F2, F4, F5, F7, F9, F11 and F14 revealed 4, 7, 7, 8, 9, 7, 7 and 6 major spots, respectively. The

  10. BIOMECHANICS AND HISTOLOGICAL ANALYSIS IN RABBIT FLEXOR TENDONS REPAIRED USING THREE SUTURE TECHNIQUES (FOUR AND SIX STRANDS) WITH EARLY ACTIVE MOBILIZATION

    Science.gov (United States)

    Severo, Antônio Lourenço; Arenhart, Rodrigo; Silveira, Daniela; Ávila, Aluísio Otávio Vargas; Berral, Francisco José; Lemos, Marcelo Barreto; Piluski, Paulo César Faiad; Lech, Osvandré Luís Canfield; Fukushima, Walter Yoshinori

    2015-01-01

    Objective: To analyze suture time, biomechanics (deformity between the stumps) and the histology of three groups of tendinous surgical repair: Brazil-2 (4 strands), in which the end knot (core) is located outside the tendon, and Indiana (4 strands) and Tsai (6 strands), suture techniques in which the end knot (core) is located inside the tendon, associated with early active mobilization. Methods: The right calcaneal tendons (plantar flexor of the hind paw) of 36 rabbits of the New Zealand breed (Oryctolagus cuniculus) were used in the analysis. This sample is of similar size to the human flexor tendon, which measures approximately 4.5 mm (varying by 2 mm). The selected animals had the same mass (2.5 to 3 kg) and were male or female adults (from 8 ½ months). For the flexor tendons of the hind paws, sterile techniques were used in accordance with the Committee on Animal Research and Ethics (CETEA) of the University of the State of Santa Catarina (UDESC), municipality of Lages, in Brazil (protocol # 1.33.09). Results: In the biomechanical analysis (deformity) carried out between tendinous stumps, there was no statistically significant difference (p>0.01). There was no statistical difference in relation to surgical time among the three suture techniques, with means of 6.0 minutes for Tsai (6 strands), 5.7 minutes for Indiana (4 strands) and 5.6 minutes for Brazil (4 strands) (p>0.01). With early active mobility, there was qualitative and quantitative evidence of thickening of collagen in 38.9% on the 15th day and in 66.7% on the 30th day, making the biological tissue stronger and more resistant (p=0.095). Conclusion: This study demonstrated that there was no histological difference between the results achieved with an inside or outside end knot with respect to the repaired tendon, and the number of strands did not affect healing, vascularization or sliding of the tendon in the osteofibrous tunnel, which are associated with early active mobility, with the repair techniques

  11. Active learning techniques for librarians practical examples

    CERN Document Server

    Walsh, Andrew

    2010-01-01

    A practical work outlining the theory and practice of using active learning techniques in library settings. It explains the theory of active learning, argues for its importance in our teaching, and is illustrated using a large number of examples of techniques that can be easily transferred and used in teaching library and information skills to a range of learners within all library sectors. These practical examples recognise that for most of us involved in teaching library and information skills the one-off session is the norm, so we need techniques that allow us to quickly grab and hold our

  12. Air Pollution Studies in Central Russia (Tver and Yaroslavl Regions) Using the Moss Biomonitoring Technique and Neutron Activation Analysis

    CERN Document Server

    Ermakova, E V; Pavlov, S S; Povtoreiko, E A; Steinnes, E; Cheremisina, Ye N

    2003-01-01

    Data of 34 elements, including heavy metals, halogens, rare-earth elements, U, and Th in 140 moss samples, collected in central Russia (Tver and Yaroslavl regions and the northern part of Moscow Region) in 2000-2002, are presented. Factor analysis with VARIMAX rotation was applied to identify possible sources of the elements determined in the mosses. The seven resulting factors represent crust, vegetation and anthropogenic components in the moss. Some of the factors were interpreted as being associated with ferrous smelters (Fe, Zn, Sb, Ta); combination of non-ferrous smelters and other industries (Mn, Co, Mo, Cr, Ni, W); an oil-refining plant, and oil combustion at the thermal power plant (V, Ni). The geographical distribution patterns of the factor scores are also presented. The dependency equations of elemental content in mosses versus distance from the source are derived.
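Factor analysis with VARIMAX rotation, as used above to separate crustal from anthropogenic sources, can be sketched with synthetic data standing in for the moss-element matrix. This assumes scikit-learn's FactorAnalysis (which supports a varimax rotation); the two latent "sources" and six "elements" below are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic stand-in for a moss survey: 140 samples x 6 elements, driven by
# two hidden sources (say, crustal dust vs. smelter emissions) plus noise.
crust = rng.normal(size=(140, 1))
smelter = rng.normal(size=(140, 1))
X = np.hstack([crust, crust, crust, smelter, smelter, smelter])
X += 0.1 * rng.normal(size=X.shape)

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
loadings = fa.components_.T                      # rows: elements, columns: factors
dominant = np.argmax(np.abs(loadings), axis=1)   # factor each element loads on
```

After the varimax rotation each element loads mainly on one factor, which is what makes the factors interpretable as distinct emission sources in studies like this one.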

  13. Prefractionation techniques in proteome analysis.

    Science.gov (United States)

    Righetti, Pier Giorgio; Castagna, Annalisa; Herbert, Ben; Reymond, Frederic; Rossier, Joël S

    2003-08-01

    The present review deals with a number of prefractionation protocols in preparation for two-dimensional map analysis, both in the field of chromatography and in the field of electrophoresis. In the first case, Fountoulakis' group has reported just about every chromatographic procedure useful as a prefractionation step, including affinity, ion-exchange, and reversed-phase resins. As a result of the various enrichment steps, several hundred new species, previously undetected in unfractionated samples, could be revealed for the first time. Electrophoretic prefractionation protocols include all those electrokinetic methodologies which are performed in free solution, essentially all relying on isoelectric focusing steps. The devices reviewed here include multichamber apparatus, such as the multicompartment electrolyzer with Immobiline membranes, Off-Gel electrophoresis in a multicup device, and the Rotofor, an instrument also based on a multichamber system but exploiting the conventional technique of carrier-ampholyte focusing. Other instruments of interest are the Octopus, a continuous-flow device for isoelectric focusing in an upward-flowing liquid curtain, and the Gradiflow, where different pI cuts are obtained by a multistep passage through two compartments buffered at different pH values. It is felt that this panoply of methods could offer a strong step forward in "mining below the tip of the iceberg" for detecting the "unseen proteome".

  14. Determination of concentrations of Fe, Mg, and Zn in some ferrite samples using neutron activation analysis and X-ray fluorescence techniques.

    Science.gov (United States)

    Ali, I A; Mohamed, Gehan Y; Azzam, A; Sattar, A A

    2017-01-14

    Mg-Zn ferrite is considered one of the important materials with potential uses in many applications. In this work, samples of the ferrite Mg(1-x)ZnxFe2O4 (where x=0.0, 0.2, 0.4, 0.6, 0.8 and 1) were synthesized by the sol-gel method for use in some hyperthermia applications. The composition and purity of the prepared samples strongly affect their properties. Therefore, the elemental concentrations of these samples were measured by the X-ray fluorescence technique and thermal neutron activation analysis to check the quality of the prepared samples. The results of both methods were compared with each other and with the molecular ratios of the as-prepared samples. In addition, no elemental impurity of considerable concentration was detected.
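Since the measured concentrations were compared with the as-prepared molecular ratios, the stoichiometric reference values can be computed directly. A short sketch of the expected elemental mass fractions of Mg(1-x)ZnxFe2O4, assuming standard atomic masses:

```python
# Standard atomic masses (g/mol).
ATOMIC_MASS = {"Mg": 24.305, "Zn": 65.38, "Fe": 55.845, "O": 15.999}

def expected_mass_fractions(x):
    """Stoichiometric elemental mass fractions of Mg(1-x)Zn(x)Fe2O4."""
    moles = {"Mg": 1.0 - x, "Zn": x, "Fe": 2.0, "O": 4.0}
    total = sum(n * ATOMIC_MASS[el] for el, n in moles.items())
    return {el: n * ATOMIC_MASS[el] / total for el, n in moles.items()}

fractions = expected_mass_fractions(0.4)   # e.g. the x = 0.4 member of the series
```

Values measured by XRF or NAA can then be checked against these fractions to confirm that the sol-gel synthesis hit the intended composition.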

  15. Comparative Analysis of Hand Gesture Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Arpana K. Patel

    2015-03-01

    Full Text Available During the past few years, human hand gestures for interaction with computing devices have continued to be an active area of research. In this paper a survey of hand gesture recognition is provided. Hand gesture recognition comprises three stages: pre-processing, feature extraction or matching, and classification or recognition. Each stage involves different methods and techniques. This paper gives a short description of the different methods used for hand gesture recognition in existing systems, with a comparative analysis of each method's benefits and drawbacks.

  16. Characterization of ancient glass excavated in Enez (Ancient Ainos) Turkey by combined Instrumental Neutron Activation Analysis and Fourier Transform Infrared spectrometry techniques

    Energy Technology Data Exchange (ETDEWEB)

    Akyuz, Sevim, E-mail: s.akyuz@iku.edu.tr [Physics Department, Science and Letters Faculty, Istanbul Kultur University, Atakoy Campus, Bakirkoy 34156, Istanbul (Turkey); Akyuz, Tanil [Physics Department, Science and Letters Faculty, Istanbul Kultur University, Atakoy Campus, Bakirkoy 34156, Istanbul (Turkey); Mukhamedshina, Nuranya M.; Mirsagatova, A. Adiba [Institute of Nuclear Physics, Uzbek Academy of Sciences, 702132, Ulugbek, Tashkent (Uzbekistan); Basaran, Sait; Cakan, Banu [Department of Restoration and Conservation of Artefacts, Letters Faculty, Istanbul University, Vezneciler, Istanbul (Turkey)

    2012-05-15

    Ancient glass fragments excavated in the archaeological district of Enez (Ancient Ainos), Turkey, were investigated by combined Instrumental Neutron Activation Analysis (INAA) and Fourier Transform Infrared (FTIR) spectrometry techniques. Multi-elemental contents of 15 glass fragments belonging to the Hellenistic, Roman, Byzantine, and Ottoman periods were determined by INAA. The concentrations of twenty-six elements (Na, K, Ca, Sc, Cr, Mn, Fe, Co, Cu, Zn, As, Rb, Sr, Sb, Cs, Ba, Ce, Sm, Eu, Tb, Yb, Lu, Hf, Ta, Au and Th), which might be present in the samples as flux, stabilizers, colorants or opacifiers, and impurities, were examined. Chemometric treatment of the INAA data was performed, and principal component analysis revealed the presence of 3 distinct groups. The thermal history of the glass samples was determined by FTIR spectrometry. - Highlights: ► INAA was performed to determine elemental compositions of ancient glass fragments. ► Basic, coloring/discoloring elements and impurities have been determined. ► PCA discriminated the glasses depending on their chronological order. ► The thermal history of the glass samples was determined by FTIR spectrometry.
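Principal component analysis of the kind used above to group the glasses can be sketched with a minimal SVD-based PCA. The two compositional groups and element values below are invented stand-ins, not the Enez data, which spans 26 elements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative element-concentration table (wt%): two compositional groups,
# e.g. glasses made with different fluxes; rows are samples, columns elements.
group_a = rng.normal(loc=[14.0, 0.8, 0.5], scale=0.2, size=(8, 3))
group_b = rng.normal(loc=[10.0, 3.0, 2.5], scale=0.2, size=(7, 3))
X = np.vstack([group_a, group_b])

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each element
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                           # sample coordinates on the PCs
pc1 = scores[:, 0]                          # first PC separates the groups
```

Plotting the first two PC scores is how chemometric studies like this one visualize compositional groupings across samples.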

  17. Eigenspace design techniques for active flutter suppression

    Science.gov (United States)

    Garrard, W. L.; Liebst, B. S.

    1984-01-01

    The application of eigenspace design techniques to an active flutter suppression system for the DAST ARW-2 research drone is examined. Eigenspace design techniques allow the control-system designer to determine feedback gains that place controllable eigenvalues in specified configurations and shape eigenvectors to achieve the desired dynamic response. Eigenspace techniques have been applied to the control of lateral and longitudinal dynamic response of aircraft; however, little has been published on their application to aeroelastic control problems. This discussion focuses primarily on methodology for the design of full-state and limited-state (output) feedback controllers. Most of the states in aeroelastic control problems are not directly measurable, and some type of dynamic compensator is necessary to convert sensor outputs to control inputs. Compensator design is accomplished by use of a Kalman filter, modified if necessary by the Doyle-Stein procedure for full-state loop-transfer-function recovery, by some other type of observer, or by transfer-function matching.
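The eigenvalue-placement step described above can be illustrated with SciPy's place_poles, which computes full-state feedback gains placing the closed-loop eigenvalues at specified locations. The two-state system below is a toy stand-in for an unstable aeroelastic mode, not the ARW-2 model:

```python
import numpy as np
from scipy.signal import place_poles

# Toy second-order mode with negative damping (flutter-like instability).
A = np.array([[0.0, 1.0],
              [-4.0, 0.2]])
B = np.array([[0.0],
              [1.0]])

desired = np.array([-1.0 + 2.0j, -1.0 - 2.0j])   # damped, stable pole pair
K = place_poles(A, B, desired).gain_matrix        # full-state feedback u = -K x
closed_loop = A - B @ K
```

Eigenspace design as described in the paper goes further, shaping the eigenvectors as well as the eigenvalues; pole placement is the eigenvalue half of that picture.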

  18. Active load control techniques for wind turbines.

    Energy Technology Data Exchange (ETDEWEB)

    van Dam, C.P. (University of California, Davis, CA); Berg, Dale E.; Johnson, Scott J. (University of California, Davis, CA)

    2008-07-01

    This report provides an overview on the current state of wind turbine control and introduces a number of active techniques that could be potentially used for control of wind turbine blades. The focus is on research regarding active flow control (AFC) as it applies to wind turbine performance and loads. The techniques and concepts described here are often described as 'smart structures' or 'smart rotor control'. This field is rapidly growing and there are numerous concepts currently being investigated around the world; some concepts already are focused on the wind energy industry and others are intended for use in other fields, but have the potential for wind turbine control. An AFC system can be broken into three categories: controls and sensors, actuators and devices, and the flow phenomena. This report focuses on the research involved with the actuators and devices and the generated flow phenomena caused by each device.

  19. Active cycle of breathing technique for cystic fibrosis.

    Science.gov (United States)

    Mckoy, Naomi A; Wilson, Lisa M; Saldanha, Ian J; Odelola, Olaide A; Robinson, Karen A

    2016-07-05

    People with cystic fibrosis experience chronic airway infections as a result of mucus build up within the lungs. Repeated infections often cause lung damage and disease. Airway clearance therapies aim to improve mucus clearance, increase sputum production, and improve airway function. The active cycle of breathing technique (also known as ACBT) is an airway clearance method that uses a cycle of techniques to loosen airway secretions including breathing control, thoracic expansion exercises, and the forced expiration technique. This is an update of a previously published review. To compare the clinical effectiveness of the active cycle of breathing technique with other airway clearance therapies in cystic fibrosis. We searched the Cochrane Cystic Fibrosis Trials Register, compiled from electronic database searches and handsearching of journals and conference abstract books. We also searched the reference lists of relevant articles and reviews.Date of last search: 25 April 2016. Randomised or quasi-randomised controlled clinical studies, including cross-over studies, comparing the active cycle of breathing technique with other airway clearance therapies in cystic fibrosis. Two review authors independently screened each article, abstracted data and assessed the risk of bias of each study. Our search identified 62 studies, of which 19 (440 participants) met the inclusion criteria. Five randomised controlled studies (192 participants) were included in the meta-analysis; three were of cross-over design. The 14 remaining studies were cross-over studies with inadequate reports for complete assessment. The study size ranged from seven to 65 participants. The age of the participants ranged from six to 63 years (mean age 22.33 years). In 13 studies, follow up lasted a single day. However, there were two long-term randomised controlled studies with follow up of one to three years. 
Most of the studies did not report on key quality items, and therefore, have an unclear risk of

  20. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.
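Vibration signal analysis of the kind described above typically starts from the spectrum. A minimal sketch (synthetic data; not ASRI's algorithms) of locating a dominant tone with a windowed FFT:

```python
import numpy as np

# Synthetic accelerometer record: a 120 Hz component (e.g. a rotor tone)
# buried in broadband noise, sampled at 2 kHz for one second.
fs = 2000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(42)
signal = 0.8 * np.sin(2 * np.pi * 120.0 * t) + 0.3 * rng.normal(size=t.size)

window = np.hanning(t.size)                 # taper to reduce spectral leakage
spectrum = np.abs(np.fft.rfft(signal * window))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
```

Tracking how such dominant frequencies and their amplitudes evolve over time is the core of diagnosing oscillation problems in rotating machinery.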

  1. Development and Status of Prompt Gamma Neutron Activation Analysis Technique Methodology

    Institute of Scientific and Technical Information of China (English)

    王兴华; 孙洪超; 姚永刚; 肖才锦; 张贵英; 金象春; 华龙; 周四春

    2014-01-01

    Prompt Gamma Neutron Activation Analysis (PGNAA) is a nondestructive, on-line nuclear analytical method; more than 30 PGNAA laboratories have so far been established at research reactors worldwide. This paper introduces three quantitative prompt gamma activation analysis methods, namely the relative method, the calibration-curve method, and the k0-factor method, describing their basic principles and fields of application. It also covers the beam-chopper technique, proposed to improve measurement accuracy for short-lived nuclides, and the internal-standard method, proposed to handle the neutron self-absorption and gamma-ray self-shielding effects that arise in large-sample measurements. In addition, progress on the thermal-neutron prompt gamma activation analysis facility based on the CARR reactor is briefly described, providing a methodological reference for establishing prompt gamma activation analysis at CARR, and domestic PGNAA issues are discussed.

  2. Comparing Techniques for Certified Static Analysis

    Science.gov (United States)

    Cachera, David; Pichardie, David

    2009-01-01

    A certified static analysis is an analysis whose semantic validity has been formally proved correct with a proof assistant. The recent increasing interest in using proof assistants for mechanizing programming language metatheory has given rise to several approaches for certification of static analysis. We propose a panorama of these techniques and compare their respective strengths and weaknesses.

  3. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  4. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    most convenient example, have been devised for obtaining waveforms related ... computer to speech analysis led to important elaborations of ... techniques of fast Fourier transform (FFT) and analysis by ... the first three formants F1, F2, F3 to be made. Using the ... introduced and demonstrated to be a powerful tool for the ...

  5. Analysis of neutron flux distribution using the Monte Carlo method for the feasibility study of the Prompt Gamma Activation Analysis technique at the IPR-R1 TRIGA reactor

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, Bruno T.; Pereira, Claubia, E-mail: brunoteixeiraguerra@yahoo.com.br, E-mail: claubia@nuclear.ufmg.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departmento de Energia Nuclear; Soares, Alexandre L.; Menezes, Maria Angela B.C., E-mail: menezes@cdtn.br, E-mail: asleal@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-07-01

    The IPR-R1 is a TRIGA-type reactor, Mark-I model, manufactured by the General Atomic Company and installed at the Nuclear Technology Development Centre (CDTN), Brazilian Commission for Nuclear Energy (CNEN), in Belo Horizonte, Brazil. It is a light-water moderated and cooled, graphite-reflected, open-pool type research reactor and operates at 100 kW. It is a low-power, low-pressure facility for application in research, training and radioisotope production. The fuel is an alloy of zirconium hydride and uranium enriched to 20% in {sup 235}U. The implementation of PGNAA (Prompt Gamma Neutron Activation Analysis) using this research reactor will significantly increase the number of chemical elements analysed and the kinds of matrices. A project is underway to implement this technique at CDTN, and the objective of this study was to contribute to the feasibility analysis of implementing it. For this purpose, MCNP is being used. Variance reduction tools were introduced into the previously developed methodology for calculating the neutron flux in the inclined neutron extractor, with the aim of reducing the code error and thereby increasing the reliability of the results. With the implementation of the variance reduction tools, the results for the thermal and epithermal neutron fluxes showed a significant improvement in both calculations. (author)
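The variance reduction idea behind the MCNP work can be illustrated on a toy integral: sampling from a density shaped like the scoring function (importance sampling) cuts the sample variance relative to analogue sampling. The integrand below is illustrative and unrelated to the actual flux tallies:

```python
import math
import random

random.seed(7)
N = 20000
exact = (1.0 - math.exp(-5.0)) / 5.0   # I = integral_0^1 exp(-5x) dx

# (a) Analogue sampling: x ~ Uniform(0, 1), score exp(-5x).
analogue = [math.exp(-5.0 * random.random()) for _ in range(N)]

# (b) Importance sampling from p(x) = 5 exp(-5x)/(1 - exp(-5)) on (0, 1):
# the density matches the integrand's shape, so the weights f(x)/p(x)
# collapse to a constant -- the zero-variance limit of this toy problem.
Z = 1.0 - math.exp(-5.0)
def importance_sample():
    x = -math.log(1.0 - random.random() * Z) / 5.0      # inverse-CDF sampling
    return math.exp(-5.0 * x) / (5.0 * math.exp(-5.0 * x) / Z)

weighted = [importance_sample() for _ in range(N)]

def mean_var(xs):
    m = sum(xs) / len(xs)
    return m, sum((v - m) ** 2 for v in xs) / (len(xs) - 1)

mean_a, var_a = mean_var(analogue)
mean_i, var_i = mean_var(weighted)
```

Both estimators are unbiased; the point of variance reduction, here as in MCNP, is a smaller statistical error for the same number of histories.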

  6. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the recent decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry, coulometry, etc., have made significant contributions to the analysis of minerals such as clay, sulfide, oxide, and oxysalt. The discussion is organized by both the types of EC techniques used and the kinds of minerals analyzed. Furthermore, minerals as electrode modification materials for EC analysis have also been summarized. Accordingly, research gaps and future development trends in these areas are discussed.

  7. Alternative Analysis Techniques for Needs and Needs Documentation Techniques,

    Science.gov (United States)

    1980-06-20

    Have you previously participated in a brainwriting session? a. Yes b. No 9. Have you previously participated in the Nominal Group Technique process? ... brainstorming technique for future sessions. (Strongly disagree ... Strongly agree) 8. It was easy to present my views using the brainwriting technique. (Strongly disagree ... Strongly agree) 9. I was satisfied with the brainwriting technique. (Strongly disagree ... Strongly agree) 10. I recommend using

  8. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the importance of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  9. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on these and a brief review of other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are described in brief. Simple case s...

  10. Kinematic Analysis of Healthy Hips during Weight-Bearing Activities by 3D-to-2D Model-to-Image Registration Technique

    Directory of Open Access Journals (Sweden)

    Daisuke Hara

    2014-01-01

    Full Text Available Dynamic hip kinematics during weight-bearing activities were analyzed for six healthy subjects. Continuous X-ray images of gait, chair-rising, squatting, and twisting were taken using a flat panel X-ray detector. Digitally reconstructed radiographic images were used for the 3D-to-2D model-to-image registration technique. The root-mean-square errors associated with tracking the pelvis and femur were less than 0.3 mm for translations and 0.3° for rotations. For gait, chair-rising, and squatting, the maximum hip flexion angles averaged 29.6°, 81.3°, and 102.4°, respectively. The pelvis was tilted anteriorly around 4.4° on average during the full gait cycle. For chair-rising and squatting, the maximum absolute value of anterior/posterior pelvic tilt averaged 12.4°/11.7° and 10.7°/10.8°, respectively. Hip flexion peaked partway through the movement due to further anterior pelvic tilt during both chair-rising and squatting. For twisting, the maximum absolute value of hip internal/external rotation averaged 29.2°/30.7°. This study revealed activity-dependent kinematics of healthy hip joints with coordinated pelvic and femoral dynamic movements. Kinematic data during activities of daily living may provide important insight for evaluating the kinematics of pathological and reconstructed hips.

  11. Comparative study between the PIXE technique and neutron activation analysis for Zinc determination; Estudo comparativo entre a tecnica de inducao de raios X por particulas e analise por ativacao com neutrons na determinacao do metal pesado zinco

    Energy Technology Data Exchange (ETDEWEB)

    Cruvinel, Paulo Estevao; Crestana, Silvio [Empresa Brasileira de Pesquisa Agropecuaria, Sao Carlos, SP (Brazil). CNPDIA. E-mail: cruvinel@cnpdia.embrapa.br; Armelin, Maria Jose Aguirre [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil); Artaxo Netto, Paulo Eduardo [Sao Paulo Univ., SP (Brazil). Inst. de Fisica

    1997-07-01

This work presents a comparative study between the PIXE (particle-induced X-ray emission) and neutron activation analysis (NAA) techniques for determination of total zinc concentration. In particular, soil samples from the Pindorama experimental station of the Instituto Agronomico de Campinas, Sao Paulo State, Brazil, were analysed, measuring the zinc content in {mu}g/g. The results showed good correlation between the two techniques. The PIXE and NAA analyses were carried out using 2.4 MeV proton beams from the series S Pelletron accelerator and the IPEN/CNEN-IEA-R1 reactor, both installed at the University of Sao Paulo, Brazil.
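Agreement between two analytical techniques applied to the same samples is often summarized with a correlation coefficient. A minimal sketch, with hypothetical zinc values rather than the paper's actual measurements:

```python
import math

def pearson_r(xs, ys):
    # Pearson correlation coefficient between two measurement series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical zinc concentrations (ug/g) for the same soil samples,
# one value per technique; not the study's data.
zinc_pixe = [42.0, 55.3, 61.8, 38.9, 70.2]
zinc_naa = [40.5, 57.1, 60.0, 41.2, 68.8]

r = pearson_r(zinc_pixe, zinc_naa)
print(round(r, 3))
```

A coefficient close to 1 indicates the two techniques rank and scale the samples consistently.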

  12. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

Full Text Available The quality and condition of a road surface is of great importance for convenience and safety of driving. Investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are therefore widely carried out for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysing the characteristics of road texture and monitoring pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  13. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

The quality and condition of a road surface is of great importance for convenience and safety of driving. Investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are therefore widely carried out for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysing the characteristics of road texture and monitoring pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  14. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and we also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  15. A Comparative Analysis of Biomarker Selection Techniques

    Directory of Open Access Journals (Sweden)

    Nicoletta Dessì

    2013-01-01

Full Text Available Feature selection has become the essential step in biomarker discovery from high-dimensional genomics data. It is recognized that different feature selection techniques may result in different sets of biomarkers, that is, different groups of genes highly correlated to a given pathological condition, but few direct comparisons exist which quantify these differences in a systematic way. In this paper, we propose a general methodology for comparing the outcomes of different selection techniques in the context of biomarker discovery. The comparison is carried out along two dimensions: (i) measuring the similarity/dissimilarity of selected gene sets; (ii) evaluating the implications of these differences in terms of both predictive performance and stability of selected gene sets. As a case study, we considered three benchmarks deriving from DNA microarray experiments and conducted a comparative analysis among eight selection methods, representatives of different classes of feature selection techniques. Our results show that the proposed approach can provide useful insight about the pattern of agreement of biomarker discovery techniques.
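Similarity of selected gene sets can be quantified with a set-overlap measure such as the Jaccard index; the paper does not specify its measure, so this is a generic sketch with invented method names and genes:

```python
def jaccard(a, b):
    # Similarity between two selected gene sets: |A & B| / |A | B|.
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

# Hypothetical top-gene lists from two feature selection methods.
relief_genes = {"TP53", "BRCA1", "EGFR", "MYC"}
ttest_genes = {"TP53", "EGFR", "KRAS", "MYC"}

print(jaccard(relief_genes, ttest_genes))  # 3 shared of 5 total -> 0.6
```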

  16. UPLC: a preeminent technique in pharmaceutical analysis.

    Science.gov (United States)

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

The pharmaceutical companies today are driven to create novel and more efficient tools to discover, develop, deliver and monitor drugs. In this context the development of rapid chromatographic methods is crucial for analytical laboratories. In the past decade, substantial technological advances have been made in enhancing particle chemistry performance, improving detector design, and optimizing the system, data processors and various controls of chromatographic techniques. Blended together, these advances resulted in the outstanding performance of ultra-high performance liquid chromatography (UPLC), which retains the principles of the HPLC technique. UPLC shows a dramatic enhancement in speed, resolution and sensitivity of analysis by using particles smaller than 2 μm; the system operates at higher pressure, while the mobile phase runs at greater linear velocities than in HPLC. This technique is considered a new focal point in the field of liquid chromatographic studies. This review focuses on the basic principle and instrumentation of UPLC and its advantages over HPLC; furthermore, this article emphasizes various pharmaceutical applications of this technique.
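The efficiency gain from smaller particles can be illustrated with the standard plate-count formula N = 5.54 (tR / w1/2)², where w1/2 is the peak width at half height. The retention times and widths below are hypothetical, chosen only to show the effect of a narrower peak:

```python
def plates(t_r, w_half):
    # Column efficiency from retention time and peak width at half height:
    # N = 5.54 * (tR / w_1/2)^2
    return 5.54 * (t_r / w_half) ** 2

# Hypothetical peaks: same retention time, but the UPLC peak is much narrower.
n_hplc = plates(t_r=5.0, w_half=0.20)  # broad HPLC peak
n_uplc = plates(t_r=5.0, w_half=0.05)  # narrow UPLC peak
print(int(n_hplc), int(n_uplc))
```

Quartering the peak width raises the plate count sixteen-fold, which is the kind of resolution gain the record attributes to UPLC.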

  17. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
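The normalized contrast evolution described above can be sketched as the relative temperature difference between a pixel over a suspected anomaly and a sound reference region, frame by frame. This is a generic definition, not necessarily the exact normalization used in the paper, and the cooling curves are hypothetical:

```python
def normalized_contrast(t_defect, t_sound):
    # One common contrast definition in flash thermography:
    # C(t) = (T_defect(t) - T_sound(t)) / T_sound(t), per frame.
    return [(d - s) / s for d, s in zip(t_defect, t_sound)]

# Hypothetical cooling curves (temperature rise above ambient, per frame):
# the region over a defect cools more slowly than sound material.
defect = [10.0, 8.5, 7.4, 6.6, 6.0]
sound = [10.0, 8.0, 6.5, 5.4, 4.6]

c = normalized_contrast(defect, sound)
peak_frame = max(range(len(c)), key=lambda i: c[i])
print(peak_frame)
```

Features such as the peak contrast and its time of occurrence are the kind of thermal measurement features that can be compared against simulated flat-bottom-hole responses.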

  18. PRELIMINARY RESULTS OF ATMOSPHERIC DEPOSITION OF MAJOR AND TRACE ELEMENTS IN THE GREATER AND LESSER CAUCASUS MOUNTAINS STUDIED BY THE MOSS TECHNIQUE AND NEUTRON ACTIVATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    S. Shetekauri

    2015-05-01

Full Text Available The method of moss biomonitoring of atmospheric deposition of trace elements was applied for the first time in the western Caucasus Mountains to assess the environmental situation in this region. Sixteen moss samples were collected in the 2014 summer growth period along altitudinal gradients in the range of altitudes from 600 m to 2665 m. Concentrations of Na, Mg, Al, Cl, K, Ca, Ti, V, Mn, Fe, Zn, As, Br, Rb, Mo, Cd, I, Sb, Ba, La, Sm, W, Au, and U determined by neutron activation analysis in the moss samples are reported. A comparison with the data for moss collected in Norway (a pristine area) was carried out. Multivariate statistical analysis of the results was used for assessment of pollution sources in the studied part of the Caucasus. An increase in the concentrations of most elements with rising altitude, due to gradually disappearing vegetation cover and wind erosion of soil, was observed. A comparison with the available data for moss collected in the Alps at the same altitude (~ 2500 m) was also performed.

  19. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

Full Text Available Based on the movement process of throwing, and in order to further improve the throwing technique of our country, this paper first illustrates the main factors that influence the shot distance, via a combination of the equations of motion and geometrical analysis. It then gives the equation for the force that throwing athletes have to bear during the throwing movement, and derives the speed relationship between the joints during throwing and batting based on a kinetic analysis of the throwing athletes' arms. The paper obtains the momentum relationship of the athletes' joints by means of rotational inertia analysis, and then establishes a constrained particle dynamics equation from the Lagrange equation. The obtained result shows that the momentum of the throw depends on the momentum of the athletes' wrist joints while batting.
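The main factors influencing shot distance (release speed, angle, and height) appear in the standard projectile-range formula with release height, R = (v cos θ / g)(v sin θ + sqrt((v sin θ)² + 2gh)). A sketch with hypothetical release parameters (the paper's own equations are not reproduced here):

```python
import math

def shot_distance(v, theta_deg, h, g=9.81):
    # Projectile range with release height h:
    # R = v*cos(th)/g * (v*sin(th) + sqrt((v*sin(th))^2 + 2*g*h))
    th = math.radians(theta_deg)
    vx, vy = v * math.cos(th), v * math.sin(th)
    return vx / g * (vy + math.sqrt(vy * vy + 2 * g * h))

# Hypothetical shot put release: 13.5 m/s at 38 degrees from 2.1 m height.
print(round(shot_distance(13.5, 38, 2.1), 2))
```

With a non-zero release height the optimal angle drops below 45°, which is why release speed dominates the distance.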

  20. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

Full Text Available We describe how to use multivariate analysis of complex TOF-SIMS spectra, introducing the method of random projections. The technique allows full clustering and classification of the measured mass spectra. In this paper we use the tool for classification purposes. The presentation describes calibration experiments on 19 minerals on Ag and Au substrates using positive-mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen's κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
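The Cohen's κ statistic quoted above measures classification agreement corrected for chance. A minimal sketch with hypothetical labels (the mineral names and counts are invented, not from the COSIMA calibration):

```python
def cohens_kappa(truth, pred):
    # Cohen's kappa: observed agreement between true and predicted labels,
    # corrected for the agreement expected by chance.
    n = len(truth)
    labels = set(truth) | set(pred)
    p_obs = sum(t == p for t, p in zip(truth, pred)) / n
    p_exp = sum((truth.count(c) / n) * (pred.count(c) / n) for c in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical classification of 10 spectra into two mineral classes.
truth = ["olivine"] * 5 + ["pyrite"] * 5
pred = ["olivine"] * 4 + ["pyrite"] * 6

print(round(cohens_kappa(truth, pred), 2))  # -> 0.8
```

A κ of 0.8 corresponds to the "typically about 80%" agreement level the record reports.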

  1. Identifying pyroclastic and lahar deposits and assessing erosion and lahar hazards at active volcanoes using multi-temporal HSR image analysis and techniques for change detection

    Science.gov (United States)

    Kassouk, Zeineb; Thouret, Jean-Claude; Oehler, Jean-François; Solikhin, Akhmad

    2014-05-01

The increasing availability of high-spatial resolution (HSR) remote sensing images leads to new opportunities for hazard assessment in the case of active volcanoes. Object-oriented analysis (OOA) of HSR images helps to simultaneously exploit spatial, spectral and contextual information. Here, we identify and delineate pyroclastic density current (PDC) and post-eruption lahar deposits on the south flank of Merapi volcano, Indonesia, after the large 2010 eruption. GeoEye-1 (2010 and 2011) and Pleiades (2012) images were analyzed with an adjusted object-oriented method. The PDC deposits include valley-confined block-and-ash flows (BAFs), unconfined, overbank pyroclastic flows (OPFs), and high-energy surges or ash-cloud surges. We follow up the evolution of the pyroclastic and lahar deposits through changes in the spectral indices calculated in segmented features, which represent the principal units of deposits and devastated areas. The object-oriented analysis has been applied to the pseudo image comprising three spectral indices (NDWI water index, NDVI vegetation index, and NDRSI red soil index). This pseudo image has enabled us to delineate fifteen units of PDC and lahar deposits, and damaged forests and settlements, in the Gendol-Opak catchment (c. 80 sq km). The units represent 75% of the classes obtained by photointerpretation of the same image, supported by field observations. A combination of NDWI and NDVI helps to separate areas affected by surges. The NDWI/NDRSI 2010 plot displays two clusters, with NDRSI close to 0 assigned to scoria-rich PFs. High sediment yields (4 x 10^6/km^2/year) from erosion acting in the Gendol valley characterize composite volcanoes after a large eruption. HSR images have also helped to measure geomorphic characteristics (channel capacity/wetted section; longitudinal change in channel confinement, and channel sinuosity) of river channels, which favor overbank and avulsion of lahars on a densely
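NDVI and NDWI are standard normalized-difference indices computed per pixel from band reflectances; NDRSI is omitted here because the record does not give its band combination, and the reflectance values below are hypothetical:

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index: high for healthy vegetation.
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    # Normalized Difference Water Index (McFeeters form): high for open water.
    return (green - nir) / (green + nir)

# Hypothetical per-pixel reflectances for three surface types.
pixels = {
    "vegetated": {"green": 0.10, "red": 0.06, "nir": 0.45},
    "fresh_pdc": {"green": 0.12, "red": 0.13, "nir": 0.15},
    "wet_channel": {"green": 0.20, "red": 0.15, "nir": 0.08},
}
for name, p in pixels.items():
    print(name, round(ndvi(p["nir"], p["red"]), 2), round(ndwi(p["green"], p["nir"]), 2))
```

Thresholding such indices over segmented image objects is the kind of rule that lets an object-oriented analysis separate fresh deposits from vegetation and wet channels.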

  2. Development of a technique using MCNPX code for determination of nitrogen content of explosive materials using prompt gamma neutron activation analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Nasrabadi, M.N., E-mail: mnnasrabadi@ast.ui.ac.ir [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of); Bakhshi, F.; Jalali, M.; Mohammadi, A. [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of)

    2011-12-11

Nuclear-based explosive detection methods can detect explosives by identifying their elemental components, especially nitrogen. Thermal neutron capture reactions have been used for detecting the prompt 10.8 MeV gamma ray that follows radiative capture of neutrons by {sup 14}N nuclei. We aimed to study the feasibility of using field-portable prompt gamma neutron activation analysis (PGNAA) along with improved nuclear equipment to detect and identify explosives, illicit substances or landmines. A {sup 252}Cf radio-isotopic source was embedded in a cylinder made of high-density polyethylene (HDPE) and the cylinder was then placed in another cylindrical container filled with water. Measurements were performed on high nitrogen content compounds such as melamine (C{sub 3}H{sub 6}N{sub 6}). Melamine powder in a HDPE bottle was placed underneath the vessel containing the water and the neutron source. Gamma rays were detected using two NaI(Tl) crystals. The results were simulated with MCNP4c code calculations. The theoretical calculations and experimental measurements were in good agreement, indicating that this method can be used for detection of explosives and illicit drugs.

  3. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

Astrophysical sources of gravitational waves fall broadly into three categories: (i) transients and bursts, (ii) periodic or continuous-wave sources and (iii) stochastic backgrounds. Each type of source requires a different data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; optimally weighted cross-correlations are used for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
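The optimal filtering mentioned above can be sketched, in its simplest form, as sliding a known waveform template across the data and correlating at each offset; the peak marks the best-fit arrival time. This toy version uses an invented template and noiseless data, with none of the noise weighting a real search applies:

```python
def matched_filter(data, template):
    # Correlate a known template against the data at every offset;
    # the maximum of the output locates the signal.
    n, m = len(data), len(template)
    return [sum(data[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

# Hypothetical signal: a short chirp-like template buried at offset 6.
template = [0.2, -0.5, 1.0, -0.8, 0.4]
data = [0.0] * 6 + template + [0.0] * 5

corr = matched_filter(data, template)
print(corr.index(max(corr)))  # -> 6
```

In a real inspiral search the correlation is computed in the frequency domain and weighted by the inverse noise spectral density, but the sliding-template idea is the same.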

  4. Application of Electromigration Techniques in Environmental Analysis

    Science.gov (United States)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentration of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made, widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, including the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the domain of capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is made on pre-capillary and on-capillary chromatography and electrophoresis-based concentration of analytes and detection improvement.

  5. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analysis methodologies, but none as comprehensive as the current work.
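One of the simplest techniques of the kind being compared is a one-at-a-time perturbation: vary each parameter by a fixed fraction and record the relative change in model output. A sketch with an invented toy dose model (not the tritium dosimetry model of the paper):

```python
def sensitivity_ratios(model, base, delta=0.10):
    # One-at-a-time sensitivity: perturb each parameter by +10% and report
    # the relative change in model output (a simple sensitivity index).
    y0 = model(**base)
    out = {}
    for name, value in base.items():
        perturbed = dict(base, **{name: value * (1 + delta)})
        out[name] = (model(**perturbed) - y0) / (y0 * delta)
    return out

# Hypothetical dose model: dose = intake * dose_factor / dilution.
def dose(intake, dose_factor, dilution):
    return intake * dose_factor / dilution

ratios = sensitivity_ratios(dose, {"intake": 2.0, "dose_factor": 5.0, "dilution": 4.0})
print({k: round(v, 2) for k, v in ratios.items()})
```

Parameters the output is proportional to give an index near 1; parameters in the denominator give a negative index, so ranking by absolute value orders them by influence.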

  6. Learning by Doing: An Empirical Study of Active Teaching Techniques

    Science.gov (United States)

    Hackathorn, Jana; Solomon, Erin D.; Blankmeyer, Kate L.; Tennial, Rachel E.; Garczynski, Amy M.

    2011-01-01

    The current study sought to examine the effectiveness of four teaching techniques (lecture, demonstrations, discussions, and in-class activities) in the classroom. As each technique offers different benefits to the instructor and students, each technique was expected to aid in a different depth of learning. The current findings indicated that each…

  7. Gas Chromatographic-Mass Spectrometric Analysis of Volatiles Obtained by Four Different Techniques from Salvia rosifolia Sm. and Evaluation for Biological Activity

    Science.gov (United States)

    Volatile constituents from the aerial parts of Salvia rosifolia Sm. (Lamiaceae), endemic to Turkey, were obtained by four different isolation techniques and then analyzed by gas chromatography (GC/FID) and gas chromatography – mass spectrometry (GC/MS) methods. Also in scope of the present work, the...

  8. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  9. Quantitative analysis of Li by PIGE technique

    Science.gov (United States)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

In this work, the cross section of the reaction 7Li(p,pγ)7Li (γ - 478 keV) was measured over the proton energy range 2.0-4.2 MeV. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on the Emitted Radiation Yield Analysis (ERYA) code, which integrates the nuclear reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.
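Integrating the excitation function along the depth of a thick sample is, in energy terms, an integral of σ(E)/S(E) from the surface beam energy down, where S(E) is the stopping power. A numerical sketch with invented smooth functions (not the measured 7Li(p,pγ) cross section or real stopping powers):

```python
def thick_target_yield(energies, cross_section, stopping_power):
    # Thick-target gamma yield per incident particle (arbitrary units):
    # integrate sigma(E)/S(E) over energy with the trapezoidal rule.
    f = [cross_section(e) / stopping_power(e) for e in energies]
    return sum((f[i] + f[i + 1]) / 2 * (energies[i + 1] - energies[i])
               for i in range(len(energies) - 1))

# Hypothetical smooth excitation function and stopping power for protons.
sigma = lambda e: 0.1 * e ** 2      # cross section rising with energy
s_pow = lambda e: 1.0 + 0.05 * e    # slowly increasing stopping power

energies = [2.0 + 0.1 * i for i in range(23)]  # 2.0 to 4.2 MeV grid
print(round(thick_target_yield(energies, sigma, s_pow), 3))
```

Because the integrand is always positive, raising the beam energy always raises the thick-target yield, which is why the yield curve itself can be inverted to quantify the Li content.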

  10. Analytical techniques in pharmaceutical analysis: A review

    Directory of Open Access Journals (Sweden)

    Masoom Raza Siddiqui

    2017-02-01

Full Text Available The development of pharmaceuticals brought a revolution in human health. These pharmaceuticals serve their intent only if they are free from impurities and are administered in an appropriate amount. To make drugs serve their purpose, various chemical and instrumental methods for estimating drugs were developed at regular intervals. Pharmaceuticals may develop impurities at various stages of their development, transportation and storage, which makes them risky to administer, so these impurities must be detected and quantified. For this, analytical instrumentation and methods play an important role. This review highlights the role of analytical instrumentation and analytical methods in assessing the quality of drugs. The review covers a variety of analytical techniques such as titrimetric, chromatographic, spectroscopic, electrophoretic and electrochemical, and their corresponding methods that have been applied in the analysis of pharmaceuticals.

  11. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  12. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    2015-01-01

One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  14. Performance Comparison of Active Queue Management Techniques

    Directory of Open Access Journals (Sweden)

    T. B. Reddy

    2008-01-01

Full Text Available Congestion is an important issue which researchers focus on in the Transmission Control Protocol (TCP) network environment. To keep the whole network stable, congestion control algorithms have been extensively studied. The queue management method employed by routers is one of the important issues in congestion control study. Active Queue Management (AQM) has been proposed as a router-based mechanism for early detection of congestion inside the network. In this study, we compare two popular queue management methods, Random Early Detection (RED) and droptail, in different aspects such as throughput and fairness index. The comparison results indicate that RED performed slightly better, with higher throughput and a higher fairness index than droptail. Simulation is done using Network Simulator (NS2) and the graphs are drawn using Xgraph.
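RED's core rule is a drop probability that is zero below a minimum queue threshold, rises linearly between the thresholds, and becomes certain above the maximum. A minimal sketch of that rule (the threshold values are illustrative, and the EWMA queue averaging of full RED is omitted):

```python
def red_drop_probability(avg_q, min_th=5.0, max_th=15.0, max_p=0.1):
    # RED gateway rule: no drops below min_th, certain drop above max_th,
    # and a linearly increasing drop probability in between.
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return max_p * (avg_q - min_th) / (max_th - min_th)

print(red_drop_probability(4.0), red_drop_probability(10.0), red_drop_probability(20.0))
```

By dropping a few packets early, before the buffer overflows, RED signals TCP senders to back off sooner than droptail does, which is the mechanism behind the throughput and fairness differences the study measures.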

  15. Conference on Instrumental Activation Analysis: IAA 89

    Science.gov (United States)

    Vobecky, M.; Obrusnik, I.

    1989-05-01

    The proceedings contain 40 abstracts of papers all of which have been incorporated in INIS. The papers were centred on the applications of radioanalytical methods, especially on neutron activation analysis, x ray fluorescence analysis, PIXE analysis and tracer techniques in biology, medicine and metallurgy, measuring instruments including microcomputers, and data processing methods.

  16. Input techniques that dynamically change their cursor activation area

    DEFF Research Database (Denmark)

    Hertzum, Morten; Hornbæk, Kasper

    2007-01-01

Efficient pointing is crucial to graphical user interfaces, and input techniques that dynamically change their activation area may yield improvements over point cursors by making objects selectable at a distance. Such techniques include the bubble cursor, whose activation area always contains the closest object, and two variants of cell cursors, whose activation areas contain a set of objects in the vicinity of the cursor. We report two experiments that compare these techniques to a point cursor; in one experiment participants use a touchpad for operating the input techniques, in the other a mouse. In both experiments, the bubble cursor is fastest and participants make fewer errors with it. Participants also unanimously prefer this technique. For small targets, the cell cursors are generally more accurate than the point cursor; in the second
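The bubble cursor's defining rule, that the activation area always contains the closest object, reduces in its simplest form to nearest-target selection. A toy sketch with hypothetical target coordinates:

```python
import math

def bubble_target(cursor, targets):
    # Bubble cursor selection: the activation area always contains the
    # closest object, so a click selects the nearest target to the cursor.
    return min(targets, key=lambda t: math.dist(cursor, t))

targets = [(10, 10), (40, 12), (25, 60)]
print(bubble_target((33, 20), targets))  # -> (40, 12)
```

In effect every target's selectable region grows to its Voronoi cell, so small targets become much easier to acquire than with a point cursor.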

  17. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Science.gov (United States)

    2013-06-21

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify and give a precise reference in the use of a price analysis technique in order to establish a fair... reference used in FAR 15.404-1(b)(2)(i). FAR 15.404-1(b)(2) addresses various price analysis techniques and...

  18. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Science.gov (United States)

    2012-07-10

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price. DATES....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...

  19. Attitude Exploration Using Factor Analysis Technique

    Directory of Open Access Journals (Sweden)

    Monika Raghuvanshi

    2016-12-01

Full Text Available Attitude is a psychological variable that contains positive or negative evaluations of people or an environment. The growing generation possesses learning skills, so if a positive attitude is inculcated at the right age, it might become habitual. Students in the age group 14-20 years from the city of Bikaner, India, are the target population for this study. An inventory of 30 Likert-type scale statements was prepared in order to measure attitude towards the environment and matters related to conservation. The primary data were collected through a structured questionnaire, using a cluster sampling technique, and analyzed using the IBM SPSS 23 statistical tool. Factor analysis is used to reduce the 30 variables to a smaller number of more identifiable groups of variables. Results show that students “need more regulation and voluntary participation to protect the environment”, “need conservation of water and electricity”, “are concerned about undue wastage of water”, “need visible actions to protect the environment”, “need strengthening of the public transport system”, “are a little ignorant about the consequences of global warming”, “want prevention of water pollution by industries”, “need to change personal habits to protect the environment”, and “don’t have firsthand experience of global warming”. The analysis revealed that the nine factors obtained could explain about 58.5% of the variance in the attitude of secondary school students towards the environment in the city of Bikaner, India. The remaining 39.6% of the variance is attributed to other elements not explained by this analysis. A global campaign for improvement in attitudes about environmental issues and their utility in daily life may boost positive youth attitudes, with potential impact worldwide. A cross-disciplinary approach may be developed by teaching the subject along with related disciplines such as science, economics and social studies.

  20. Application of Active Flow Control Technique for Gust Load Alleviation

    Institute of Scientific and Technical Information of China (English)

    XU Xiaoping; ZHU Xiaoping; ZHOU Zhou; FAN Ruijun

    2011-01-01

    A new gust load alleviation technique based on active flow control is presented in this paper. Numerical studies are conducted to investigate the beneficial effects of arrays of jets on the aerodynamic characteristics of the quasi “Global Hawk” airfoil during the gust process. Based on the unsteady Navier-Stokes equations, the grid-velocity method is introduced to simulate the gust influence, and the dynamic response of the airfoil to a vertical gust perturbation is investigated. An unsteady surface transpiration boundary condition is enforced over a user-specified portion of the airfoil's surface to emulate the time-dependent velocity boundary conditions. First, applying this method to simulate the response of a typical NACA0006 airfoil to a step change in the angle of attack shows that the indicial responses of the airfoil are in good agreement with exact theoretical values and with calculated values in the references. The gust response characteristics of the quasi “Global Hawk” airfoil are then analyzed. Five kinds of flow control techniques are introduced: steady blowing, steady suction, unsteady blowing, unsteady suction and synthetic jets. A physical analysis of their influence on gust load alleviation is proposed to provide guidelines for practice. Numerical results indicate that active flow control, as a new technology for gust load alleviation, can suppress the fluid disturbances caused by the gust and thereby alleviate the gust load.

  1. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.

  2. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to find the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic change. In this review, some of the widely used models employing hydrological and river modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and possible improvements through 3D modeling are also discussed. It is found that HEC-RAS and FLO-2D are the most economical and accurate models for river and floodplain flood analysis, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness within grid cells, were found; these can be improved through a 3D model. A 3D model was therefore found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D floodplain model be developed by considering all the hydrological and high-resolution topographic parameter models discussed in this review, to improve the identification of the causes and effects of flooding.

  3. Function Analysis and Decomposition using Function Analysis Systems Technique

    Energy Technology Data Exchange (ETDEWEB)

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Miles' function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective, function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  5. Particle fluence measurements by activation technique for radiation damage studies

    CERN Document Server

    León-Florián, E; Furetta, C; Leroy, Claude

    1995-01-01

    High-level radiation environments can produce radiation damage in detectors and their associated electronic components. Establishing a correlation between damage, irradiation level and absorbed dose requires a precise measurement of the fluence of the particles causing the damage. The activation technique is frequently used for performing particle fluence measurements. A review of this technique is presented.

  6. Development of Prompt Gamma Neutron Activation Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    卢毅; 宋朝晖

    2013-01-01

    This paper briefly summarizes the current state of development of Prompt Gamma Neutron Activation Analysis (PGNAA). The basic principles, methods and facilities of PGNAA are introduced, along with research work on PGNAA applications carried out both in China and abroad. Finally, some problems in the technical development of PGNAA are discussed.

  7. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  8. Talking Techne: Techniques to Establish an Active Online Discussion Forum

    Science.gov (United States)

    Palenque, Stephanie Maher; DeCosta, Meredith

    2015-01-01

    Discussion forums are critically important to the online classroom, as they virtually take the place of a classroom discussion and become a stage on which active learning takes place. Active learning occurs when instructors practice certain techniques in the discussion that are carefully and thoughtfully crafted and guided. The authors propose the…

  9. Quality assurance and quantitative error analysis by tracer techniques

    Energy Technology Data Exchange (ETDEWEB)

    Schuetze, N.; Hermann, U.

    1983-12-01

    The locations, types and sources of casting defects have been investigated by tracer techniques. Certain sites of the moulds were labelled using ¹⁹⁹Au, ²⁴Na sodium carbonate solution, and a technetium solution produced in a technetium generator on a ⁹⁹Mo/⁹⁹Tc elution column. Evaluations were made by means of activity measurements and autoradiography. The locations and causes of casting defects can be determined by this error analysis. Surface defects of castings resulting from the moulding materials and from the blacking can be detected with technetium; subsurface defects are located with gold.

  10. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  11. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.

  12. Determination of silver, gold, zinc and copper in mineral samples by various techniques of instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez R, N. I.; Rios M, C.; Pinedo V, J. L. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico); Yoho, M.; Landsberger, S., E-mail: neisla126@hotmail.com [University of Texas at Austin, Nuclear Engineering Teaching Laboratory, Austin 78712, Texas (United States)

    2015-09-15

    Using instrumental neutron activation analysis, mineral exploration samples were analyzed in order to determine the concentrations of silver, gold, zinc and copper, these being the main products of the Tizapa and Cozamin mines. Samples were subjected to various techniques, where the type of radiation and the counting methods were chosen based on the specific isotopic characteristics of each element. For calibration and determination of concentrations the comparator method was used: certified standards were subjected to the same irradiation and measurement conditions as the prospecting samples. The irradiations were performed at the TRIGA Mark II research reactor of the University of Texas at Austin. The silver concentrations were determined by Cyclic Epithermal Neutron Activation Analysis. This method, in combination with the pneumatic transfer system, allowed good analytical precision and accuracy in prospecting for silver, from measurement of the 657.7 keV photopeak of the short-lived radionuclide ¹¹⁰Ag. For the determination of gold and zinc, Epithermal Neutron Activation Analysis was used; the photopeaks analyzed corresponded to the 411.8 keV energy of the radionuclide ¹⁹⁸Au and the 438.6 keV energy of the metastable radionuclide ⁶⁹ᵐZn. Copper quantification was based on analysis of the 1039.2 keV photopeak produced by the short-lived radionuclide ⁶⁶Cu, by Thermal Neutron Activation Analysis. The photopeaks corresponding to gold, zinc and copper were measured using a Compton suppression system, which improved the signal-to-noise ratio, so that better detection limits and lower uncertainties in the results were obtained. Comparing elemental concentrations, the highest values of silver, zinc and copper were found in samples from the Tizapa mine; gold values were in the same range for both mines.
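The comparator method used in this record can be sketched numerically: the unknown concentration follows from the ratio of specific photopeak count rates of sample and standard, after correcting both for decay to a common reference time. All counts, masses, delays, and the standard's certified content below are invented for illustration; the ~24.6 s half-life of ¹¹⁰Ag is standard nuclide data, not a value from the record.

```python
import math

def decay_corrected(counts, half_life_s, delay_s):
    """Correct a measured photopeak count for decay during the counting delay."""
    lam = math.log(2) / half_life_s
    return counts * math.exp(lam * delay_s)

# Ag-110 photopeak (657.7 keV), half-life ~24.6 s; all numbers illustrative.
half_life = 24.6
std = decay_corrected(counts=52000, half_life_s=half_life, delay_s=30.0)
smp = decay_corrected(counts=31000, half_life_s=half_life, delay_s=45.0)

std_mass_g, smp_mass_g = 0.100, 0.250
std_conc_ppm = 40.0                  # certified Ag content of the standard

# Comparator relation: equal irradiation/counting conditions cancel out,
# leaving the ratio of specific (per-gram) decay-corrected counts.
smp_conc_ppm = std_conc_ppm * (smp / smp_mass_g) / (std / std_mass_g)
print(round(smp_conc_ppm, 2))
```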

  13. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, a Markov Random Field (MRF) model, watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. The method first applies edge detection to obtain a Difference In Strength (DIS) map; the DIS value computed for each pixel identifies all edges (weak or strong) in the image, and the resulting DIS map serves as prior knowledge about likely region boundaries for the subsequent steps. An initial segmentation is obtained with the K-means clustering technique and the minimum-distance rule. The region process is then modeled by the MRF to obtain an image that contains regions of different intensity; in the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. Gradient values are calculated and the watershed technique is applied to improve the segmentation. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated, and an edge map is obtained through a merging process based on averaged intensity mean values. Common edge detectors are run on the MRF-segmented image and the results are compared. The final segmentation and edge detection result is one closed boundary per actual region in the image.
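The K-means initial-segmentation step in this pipeline can be sketched in a few lines of numpy: cluster pixel gray levels into K intensity classes to seed the later region process. The two-region image below is synthetic; this is not the authors' implementation.

```python
import numpy as np

# Synthetic image: two gray-level populations (~60 and ~200) plus noise.
rng = np.random.default_rng(1)
img = np.where(rng.random((64, 64)) < 0.5, 60, 200) + rng.normal(0, 5, (64, 64))

# Lloyd's K-means on gray levels, seeded at the intensity extremes.
K = 2
centers = np.array([img.min(), img.max()], dtype=float)
for _ in range(20):
    labels = np.abs(img[..., None] - centers).argmin(axis=-1)  # nearest center
    centers = np.array([img[labels == k].mean() for k in range(K)])

print(np.round(centers))   # ~[60, 200]
```

Each pixel's label then gives the initial intensity region that the MRF step would refine using neighborhood dependencies.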

  14. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
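The notion of uncertain evidence the abstract describes can be illustrated with the generic virtual-evidence calculation on a minimal two-node network H -> E (this is a textbook construction, not the paper's augmentation technique, and all the probabilities are hypothetical): the evidence node's state is given as a likelihood vector L(e), and the hypothesis posterior is P(h | L) proportional to P(h) * sum over e of P(e | h) * L(e).

```python
# Two-node network H -> E with hypothetical prior and CPT.
prior_h = {"true": 0.2, "false": 0.8}
p_e_given_h = {
    "true":  {"pos": 0.9, "neg": 0.1},
    "false": {"pos": 0.3, "neg": 0.7},
}
likelihood_e = {"pos": 0.8, "neg": 0.2}   # user-specified soft evidence on E

# Virtual evidence: weight each CPT row by the likelihood and renormalize.
unnorm = {h: prior_h[h] * sum(p_e_given_h[h][e] * likelihood_e[e]
                              for e in likelihood_e)
          for h in prior_h}
z = sum(unnorm.values())
posterior = {h: v / z for h, v in unnorm.items()}
print(posterior)
```

Because the soft evidence favors the "pos" state, which is more likely under H = true, the posterior on H = true rises above its prior.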

  15. Escherichia coli activity characterization using a laser dynamic speckle technique

    CERN Document Server

    Ramírez-Miquet, Evelio E; Contreras-Alarcón, Orestes R

    2012-01-01

    The results of applying a laser dynamic speckle technique to characterize bacterial activity are presented. The speckle activity was detected in two-compartment Petri dishes: one compartment was inoculated and the other was left as a control blank. The speckle images were processed by the recently reported temporal difference method. Three inocula of 0.3, 0.5, and 0.7 McFarland units of cell concentration were tested; each inoculum was tested twice, for a total of six experiments. The time dependences of the mean activity, the standard deviation of activity and other descriptors of the speckle pattern evolution were calculated for both the inoculated compartment and the blank. In conclusion, the proposed dynamic speckle technique allows characterization of the activity of Escherichia coli bacteria in solid medium.
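The temporal difference descriptor mentioned above reduces, at its core, to the mean absolute gray-level difference between consecutive speckle frames: an active sample decorrelates the pattern frame to frame, while a static blank does not. The frames below are synthetic stand-ins, not measured speckle data.

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_activity(frames):
    """Mean absolute gray-level difference between consecutive frames."""
    d = np.abs(np.diff(frames.astype(float), axis=0))
    return d.mean()

# A static sequence (blank compartment) vs. a fully decorrelating one
# (caricature of an active, inoculated compartment).
static = np.repeat(rng.integers(0, 256, (1, 32, 32)), 20, axis=0)
active = rng.integers(0, 256, (20, 32, 32))

print(mean_activity(static), mean_activity(active))
```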

  16. COMPARISON OF ACTIVE RELEASE TECHNIQUE AND MYOFASCIAL RELEASE TECHNIQUE ON PAIN, GRIP STRENGTH & FUNCTIONAL PERFORMANCE IN PATIENTS WITH CHRONIC LATERAL EPICONDYLITIS

    Directory of Open Access Journals (Sweden)

    Parth Trivedi

    2014-06-01

    Full Text Available Background & Purpose: Lateral epicondylitis is the most common lesion of the elbow. Tennis elbow, or lateral epicondylitis, is defined as a syndrome of pain in the wrist extensor muscles at or near their lateral epicondyle origin, or pain directly over the lateral epicondyle. The aim of this study was to compare the effectiveness of the Active Release Technique (ART) and the Myofascial Release Technique (MFR) in the treatment of Chronic Lateral Epicondylitis (CLE). Methodology: The study included thirty-six patients with Chronic Lateral Epicondylitis aged 30 to 45 years. Patients were randomly divided into three groups: Control Group (A), Active Release Technique Group (B) and Myofascial Release Technique Group (C). The patients were treated for 4 weeks, and three outcome measures (0-10 NPRS, hand dynamometer and PRTEE) were taken at baseline and after the 4th week for assessment and analysis. Result: Both the Active Release Technique and the Myofascial Release Technique were effective on all three outcome measures when compared to the Control Group. The Myofascial Release Technique was more effective in improving grip strength and reducing pain and disability when compared to the Active Release Technique (p<0.05). Conclusion: The Active Release Technique and the Myofascial Release Technique are effective in patients with Chronic Lateral Epicondylitis. The Myofascial Release Technique demonstrated better outcomes than the Active Release Technique in the management of Chronic Lateral Epicondylitis.

  17. Classification Techniques for Multivariate Data Analysis.

    Science.gov (United States)

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are "pattern ... the determinantal equation |B - λW| = 0 (42). The solutions λ_i are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non- ... The Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor

  18. Stalked protozoa identification by image analysis and multivariable statistical techniques.

    Science.gov (United States)

    Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2008-06-01

    Protozoa are considered good indicators of treatment quality in activated sludge systems, as they are sensitive to physical, chemical and operational processes. It is therefore possible to correlate the predominance of certain species or groups with several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants, by determining geometrical, morphological and signature data and subsequently processing them with discriminant analysis and neural network techniques. Geometrical descriptors were found to provide the best identification ability, and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence in establishing their presence in wastewater treatment plants.
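The discriminant-analysis step on geometrical descriptors can be sketched as a two-class Fisher linear discriminant: project descriptors onto w = Sw⁻¹(μ₁ − μ₀) and threshold at the midpoint. The "descriptor" values below (a length-like and a width-like feature for two species-shaped clusters) are synthetic, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(3)
a = rng.normal([2.0, 1.0], 0.2, (50, 2))   # e.g. Opercularia-like descriptors
b = rng.normal([4.0, 1.5], 0.2, (50, 2))   # e.g. V. microstoma-like descriptors

# Fisher LDA: within-class scatter, discriminant axis, midpoint threshold.
sw = np.cov(a, rowvar=False) + np.cov(b, rowvar=False)
w = np.linalg.solve(sw, b.mean(0) - a.mean(0))
thresh = w @ (a.mean(0) + b.mean(0)) / 2

pred_b = (np.vstack([a, b]) @ w) > thresh
truth = np.r_[np.zeros(50, bool), np.ones(50, bool)]
accuracy = (pred_b == truth).mean()
print(accuracy)
```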

  19. Analysis of Gopher Tortoise Population Estimation Techniques

    Science.gov (United States)

    2005-10-01

    terrestrial reptile that was once found throughout the southeastern United States from North Carolina into Texas. However, due to numerous factors ... et al. 2000, Waddle 2000). Solar energy is used for thermoregulation and egg incubation. Also, tortoises are grazers (Garner and Landers 1981 ... "Evaluation and review of field techniques used to study and manage gopher tortoises." Pages 205-215 in Management of amphibians, reptiles, and small mammals

  20. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis.

    Science.gov (United States)

    Heidmann, Iris; Schade-Kampmann, Grit; Lambalk, Joep; Ottiger, Marcel; Di Berardino, Marco

    2016-01-01

    An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a common method for cellular characterisation in microbiology and medicine during the last decade. The aim of this study is to demonstrate the potential of IFC in plant cell analysis with the focus on pollen. Developing and mature pollen grains were analysed during their passage through a microfluidic chip to which radio frequencies of 0.5 to 12 MHz were applied. The acquired data provided information about the developmental stage, viability, and germination capacity. The biological relevance of the acquired IFC data was confirmed by classical staining methods, inactivation controls, as well as pollen germination assays. Different stages of developing pollen, dead, viable and germinating pollen populations could be detected and quantified by IFC. Pollen viability analysis by classical FDA staining showed a high correlation with IFC data. In parallel, pollen with active germination potential could be discriminated from the dead and the viable but non-germinating population. The presented data demonstrate that IFC is an efficient, label-free, reliable and non-destructive technique to analyse pollen quality in a species-independent manner.

  1. Trends and Techniques in Visual Gaze Analysis

    CERN Document Server

    Stellmach, Sophie; Dachselt, Raimund; Lindley, Craig A

    2010-01-01

    Visualizing gaze data is an effective way to quickly interpret eye tracking results. This paper presents a study investigating the benefits and limitations of visual gaze analysis among eye tracking professionals and researchers. The results were used to create a tool for visual gaze analysis within a Master's project.

  2. Robust control design techniques for active flutter suppression

    Science.gov (United States)

    Ozbay, Hitay; Bachmann, Glen R.

    1994-01-01

    In this paper, an active flutter suppression problem is studied for a thin airfoil in unsteady aerodynamics. The mathematical model of this system is infinite dimensional because of Theodorsen's function, which is irrational. Several second-order approximations of Theodorsen's function are compared, and a finite dimensional model is obtained from such an approximation. We use H-infinity control techniques to find a robustly stabilizing controller for active flutter suppression.
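One classical second-order rational approximation of Theodorsen's function, due to R. T. Jones, illustrates how the model becomes finite dimensional (this is a standard approximant from the aeroelasticity literature, not necessarily the specific one compared in the paper): C(k) ≈ 1 − 0.165/(1 − 0.0455j/k) − 0.335/(1 − 0.3j/k), where k is the reduced frequency.

```python
def jones_theodorsen(k):
    """R. T. Jones's two-lag rational approximation of Theodorsen's C(k)."""
    return 1 - 0.165 / (1 - 0.0455j / k) - 0.335 / (1 - 0.3j / k)

# The approximation reproduces the exact function's limits:
# C -> 1 (quasi-steady) as k -> 0, and C -> 0.5 as k -> infinity.
print(jones_theodorsen(1e-9), jones_theodorsen(1e9))
```

Because each lag term corresponds to a first-order transfer function, substituting this approximant yields a finite dimensional state-space model amenable to H-infinity synthesis.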

  3. 48 CFR 215.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for...

  4. Weed Identification Using An Automated Active Shape Matching (AASM) Technique

    DEFF Research Database (Denmark)

    Swain, K C; Nørremark, Michael; Jørgensen, R N

    2011-01-01

    Weed identification and control is a challenge for intercultural operations in agriculture. As an alternative to chemical pest control, a smart weed identification technique followed by a mechanical weed control system could be developed. The proposed smart identification technique works on the concept of 'active shape modelling' to identify weed and crop plants based on their morphology. The automated active shape matching (AASM) technique consisted of (i) a Pixelink camera, (ii) the LTI (Lehrstuhl fuer Technische Informatik) image processing library, and (iii) a laptop PC with the Linux OS. A 2-leaf growth stage model for Solanum nigrum L. (nightshade) was generated from 32 segmented training images in the Matlab software environment. Using the AASM algorithm, the leaf model was aligned and placed at the centre of the target plant and a model deformation process carried out. The parameters used...

  5. Bonding techniques for hybrid active pixel sensors (HAPS)

    Science.gov (United States)

    Bigas, M.; Cabruja, E.; Lozano, M.

    2007-05-01

    A hybrid active pixel sensor (HAPS) consists of an array of sensing elements which is connected to an electronic read-out unit. The most used way to connect these two different devices is bump bonding. This interconnection technique is very suitable for these systems because it allows a very fine pitch and a high number of I/Os. However, there are other interconnection techniques available such as direct bonding. This paper, as a continuation of a review [M. Lozano, E. Cabruja, A. Collado, J. Santander, M. Ullan, Nucl. Instr. and Meth. A 473 (1-2) (2001) 95-101] published in 2001, presents an update of the different advanced bonding techniques available for manufacturing a hybrid active pixel detector.

  6. A Technique for Shunt Active Filter meld micro grid System

    Directory of Open Access Journals (Sweden)

    A. Lumani

    2015-08-01

    Full Text Available The proposed system presents a control technique for a micro grid connected hybrid generation system, with a case study, interfaced with a three-phase shunt active filter to suppress the current harmonics and reactive power present in the load, using p-q theory with an ANN controller. The hybrid micro grid is developed using freely available renewable energy resources, namely solar photovoltaic (SPV) and wind energy (WE). To extract the maximum available power from the PV panels and wind turbines, a maximum power point tracker (MPPT) has been included; this MPPT uses the standard perturb-and-observe technique. Using p-q theory with the ANN controller, the reference currents to be injected by the shunt active power filter (SAPF) are generated to compensate the current harmonics of the nonlinear load. Simulation studies show that the proposed control technique performs nonlinear load current harmonic compensation, maintaining the load current in phase with the source voltage.
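The p-q (instantaneous power) calculation underlying SAPF reference-current generation can be sketched with numpy: transform the abc voltages and currents to the alpha-beta frame (Clarke transform, power-invariant form) and compute the instantaneous real and imaginary powers. The MPPT, SPV/WE plant and ANN controller are omitted, and the 230 V / 10 A / 0.5 rad lagging waveforms are synthetic.

```python
import numpy as np

t = np.linspace(0, 0.04, 1000, endpoint=False)          # two 50 Hz cycles
w = 2 * np.pi * 50
va, vb, vc = (np.sqrt(2) * 230 * np.cos(w * t - p)
              for p in (0, 2 * np.pi / 3, 4 * np.pi / 3))
ia, ib, ic = (np.sqrt(2) * 10 * np.cos(w * t - p - 0.5)  # lagging load current
              for p in (0, 2 * np.pi / 3, 4 * np.pi / 3))

# Clarke (abc -> alpha-beta) transform, power-invariant form.
k = np.sqrt(2 / 3)
valpha = k * (va - 0.5 * vb - 0.5 * vc)
vbeta  = k * (np.sqrt(3) / 2) * (vb - vc)
ialpha = k * (ia - 0.5 * ib - 0.5 * ic)
ibeta  = k * (np.sqrt(3) / 2) * (ib - ic)

p_inst = valpha * ialpha + vbeta * ibeta     # instantaneous real power
q_inst = vbeta * ialpha - valpha * ibeta     # instantaneous imaginary power
print(p_inst.mean(), q_inst.mean())
```

For a balanced sinusoidal load both powers are constant; the SAPF reference currents are obtained by commanding the filter to supply the oscillating part of p and all of q, leaving the source with the DC real power only.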

  7. Active structural control with stable fuzzy PID techniques

    CERN Document Server

    Yu, Wen

    2016-01-01

    This book presents a detailed discussion of intelligent techniques to measure the displacement of buildings when they are subjected to vibration. It shows how these techniques are used to control active devices that can reduce vibration 60–80% more effectively than widely used passive anti-seismic systems. After introducing various structural control devices and building-modeling and active structural control methods, the authors propose offset cancellation and high-pass filtering techniques to solve some common problems of building-displacement measurement using accelerometers. The most popular control algorithms in industrial settings, PD/PID controllers, are then analyzed and then combined with fuzzy compensation. The stability of this combination is proven with standard weight-training algorithms. These conditions provide explicit methods for selecting PD/PID controllers. Finally, fuzzy-logic and sliding-mode control are applied to the control of wind-induced vibration. The methods described are support...

  8. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to the Gartner Group, almost 90% of the knowledge available in an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often involved in making informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, and society is in search of new technologies for text analysis. There exists a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  9. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the original message content from its binary format in order to obtain the information it contains. This must be done according to the TCP/IP protocol stack specifications, restoring the protocol format and content of the data packets at each protocol layer: the actual data transferred, as well as the application tier.

  10. Innovative Perceptual Motor Activities: Programing Techniques That Work.

    Science.gov (United States)

    Sorrell, Howard M.

    1978-01-01

    A circuit approach and station techniques are used to depict perceptual motor games for handicapped and nonhandicapped children. Twenty activities are described in terms of objectives, materials, and procedures, and their focus on visual tracking, visual discrimination and copying of forms, spatial body perception, fine motor coordination, tactile…

  11. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the Ca and K content of various foods needs to be determined. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with that of two other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison served to cross-check the analysis results and to overcome the limitations of the three methods. The results showed that Ca concentrations found in food using EDXRF and AAS were not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
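
    The cross-check the authors describe, correlating paired results from two methods, can be sketched briefly. A minimal Python illustration, assuming invented paired Ca values in mg/100 g rather than the paper's data:

```python
import numpy as np

# Hypothetical paired Ca results (mg/100 g) for the same food samples,
# measured by two methods; the numbers are illustrative, not from the paper.
ca_edxrf = np.array([120.0, 85.0, 210.0, 45.0, 150.0, 95.0])
ca_aas = np.array([118.0, 88.0, 205.0, 47.0, 155.0, 93.0])

# Pearson correlation between the two methods.
r = float(np.corrcoef(ca_edxrf, ca_aas)[0, 1])

# Mean relative difference as a simple agreement check.
rel_diff = float(np.mean(np.abs(ca_edxrf - ca_aas) / ca_aas))

print(f"Pearson r = {r:.4f}, mean relative difference = {rel_diff:.1%}")
```

    A high Pearson r together with a small mean relative difference is the pattern the paper reports for Ca (r = 0.9871) and K (r = 0.9558).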

  12. Uncertainty Analysis Technique for OMEGA Dante Measurements

    Energy Technology Data Exchange (ETDEWEB)

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18-channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters, and mirrors, together with an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the absolute calibration of each channel into a one-sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
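
    The Monte-Carlo parameter variation described above can be sketched with a toy unfold. Everything here (a linear unfold, four channels, 5% one-sigma errors) is invented for illustration; the real Dante unfold is far more involved:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the unfold: flux as a weighted sum of channel voltages.
# Weights, voltages, and the 1-sigma fractional error are illustrative.
nominal_v = np.array([1.2, 0.8, 2.1, 1.5])   # channel voltages (V)
weights = np.array([3.0, 5.0, 2.0, 4.0])     # calibration weights
sigma_frac = 0.05                            # 5% one-sigma error per channel

def unfold(voltages):
    return float(np.dot(weights, voltages))

# One thousand perturbed voltage sets, as in the paper's procedure.
trials = nominal_v * (1 + sigma_frac * rng.standard_normal((1000, 4)))
fluxes = np.array([unfold(v) for v in trials])

flux_mean, flux_std = float(fluxes.mean()), float(fluxes.std())
print(f"flux = {flux_mean:.2f} +/- {flux_std:.2f}")
```

    The standard deviation of the resulting flux distribution plays the role of the error bar on the measurement.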

  13. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
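
    The two protocols under comparison can be sketched side by side. A minimal Python illustration with invented ratings (the attribute names and numbers are hypothetical, not from the research projects):

```python
# Hypothetical customer ratings (1-5 scale) for four service attributes.
attributes = ["cleanliness", "safety", "signage", "staff"]
importance = [4.8, 4.9, 3.5, 4.2]
performance = [4.1, 4.7, 3.9, 3.2]

imp_mean = sum(importance) / len(importance)
perf_mean = sum(performance) / len(performance)

# Gap analysis: importance minus performance for each attribute.
gaps = [i - p for i, p in zip(importance, performance)]

def ip_quadrant(imp, perf):
    """Importance-performance quadrant, split on the grand means."""
    if imp >= imp_mean:
        return "concentrate here" if perf < perf_mean else "keep up the good work"
    return "low priority" if perf < perf_mean else "possible overkill"

quadrants = [ip_quadrant(i, p) for i, p in zip(importance, performance)]
for name, gap, quad in zip(attributes, gaps, quadrants):
    print(f"{name:12s} gap={gap:+.1f}  {quad}")
```

    In this invented example, "staff" has the largest gap score yet lands in the low-priority IP quadrant: exactly the kind of divergence in conclusions between the two protocols that motivates a side-by-side comparison.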

  14. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  15. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a time-domain operational modal identification scheme and a frequency-domain scheme based on output-only data, obtained by coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
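
    The core observation, that response correlations decay like impulse responses and therefore carry the modal frequencies, can be illustrated with a single simulated mode. A numpy sketch under invented parameters (one lightly damped 20 Hz mode excited by white noise):

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 256.0, 8192
f0, r = 20.0, 0.99                  # modal frequency (Hz) and pole radius
w0 = 2 * np.pi * f0 / fs

# Response-only data: white-noise excitation through a lightly damped
# second-order resonator (a stand-in for one structural mode).
x = rng.standard_normal(n)
y = np.zeros(n)
for i in range(2, n):
    y[i] = 2 * r * np.cos(w0) * y[i - 1] - r**2 * y[i - 2] + x[i]

# The autocorrelation of the response decays like the impulse response,
# so its spectrum peaks at the modal frequency.
m = 1024
acf = np.correlate(y, y, mode="full")[n - 1 : n - 1 + m]
spec = np.abs(np.fft.rfft(acf))
freqs = np.fft.rfftfreq(m, d=1 / fs)
f_est = float(freqs[np.argmax(spec)])
print(f"estimated modal frequency: {f_est:.2f} Hz")
```

    A full identification scheme would fit damped exponentials to the correlation functions of many channels; here a spectral peak suffices to recover the single mode.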

  16. Assessing voluntary muscle activation with the twitch interpolation technique.

    Science.gov (United States)

    Shield, Anthony; Zhou, Shi

    2004-01-01

    The twitch interpolation technique is commonly employed to assess the completeness of skeletal muscle activation during voluntary contractions. Early applications of twitch interpolation suggested that healthy human subjects could fully activate most of the skeletal muscles to which the technique had been applied. More recently, however, highly sensitive twitch interpolation has revealed that even healthy adults routinely fail to fully activate a number of skeletal muscles despite apparently maximal effort. Unfortunately, some disagreement exists as to how the results of twitch interpolation should be employed to quantify voluntary activation. The negative linear relationship between evoked twitch force and voluntary force that has been observed by some researchers implies that voluntary activation can be quantified by scaling a single interpolated twitch to a control twitch evoked in relaxed muscle. Observations of non-linear evoked-voluntary force relationships have led to the suggestion that the single interpolated twitch ratio cannot accurately estimate voluntary activation. Instead, it has been proposed that muscle activation is better determined by extrapolating the relationship between evoked and voluntary force to provide an estimate of true maximum force. However, criticism of the single interpolated twitch ratio typically fails to take into account the reasons for the non-linearity of the evoked-voluntary force relationship. When these reasons are examined, it appears that most are even more challenging to the validity of extrapolation than they are to the linear equation. Furthermore, several factors that contribute to the observed non-linearity can be minimised or even eliminated with appropriate experimental technique. The detection of small activation deficits requires high-resolution measurement of force and careful consideration of numerous experimental details such as the site of stimulation, stimulation intensity and the number of interpolated twitches.
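
    The single interpolated twitch ratio discussed above reduces to one line of arithmetic. A minimal sketch with hypothetical forces (a 2 N twitch superimposed on a maximal contraction, against a 25 N control twitch at rest):

```python
def voluntary_activation(interpolated_twitch, control_twitch):
    """Single interpolated twitch ratio, expressed as percent activation."""
    return (1.0 - interpolated_twitch / control_twitch) * 100.0

# Example: a 2 N twitch evoked during MVC vs a 25 N twitch in relaxed muscle.
va = voluntary_activation(2.0, 25.0)
print(f"voluntary activation = {va:.1f}%")  # prints 92.0%
```

    The extrapolation alternative criticized in the abstract instead fits the evoked-voluntary force relationship and extends it to the zero-twitch intercept to estimate true maximum force.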

  17. The German Passive: Analysis and Teaching Technique.

    Science.gov (United States)

    Griffen, T. D.

    1981-01-01

    Proposes an analysis of German passive based upon internal structure rather than translation conventions from Latin and Greek. Claims that this approach leads to a description of the perfect participle as an adjectival complement, which eliminates the classification of a passive voice for German and simplifies the learning task. (MES)

  18. Comparison of Hydrogen Sulfide Analysis Techniques

    Science.gov (United States)

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  19. An analysis technique for microstrip antennas

    Science.gov (United States)

    Agrawal, P. K.; Bailey, M. C.

    1977-01-01

    The paper presents a combined numerical and empirical approach to the analysis of microstrip antennas over a wide range of frequencies. The method involves representing the antenna by a fine wire grid immersed in a dielectric medium and then using Richmond's reaction formulation (1974) to evaluate the piecewise sinusoidal currents on the grid segments. The calculated results are then modified to account for the finite dielectric discontinuity. The method is applied to round and square microstrip antennas.

  20. A comparison of wavelet analysis techniques in digital holograms

    Science.gov (United States)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques; mean filtering, median filtering, Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation and resolution preservation.
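
    The wavelet-thresholding idea being compared can be sketched with a one-level Haar transform written out by hand. The signal, noise level, and threshold below are all illustrative; the study itself applies stationary and discrete wavelet transforms to hologram data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy piecewise-constant test signal (a crude stand-in for image data).
clean = np.repeat([0.0, 1.0, 0.5, 2.0], 64)
noisy = clean + 0.2 * rng.standard_normal(clean.size)

# One-level Haar transform: pairwise averages (approximation) and
# pairwise differences (detail), both orthonormally scaled.
a = (noisy[0::2] + noisy[1::2]) / np.sqrt(2)
d = (noisy[0::2] - noisy[1::2]) / np.sqrt(2)

# Soft-threshold the detail coefficients, then invert the transform.
t = 0.3
d = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)
den = np.empty_like(noisy)
den[0::2] = (a + d) / np.sqrt(2)
den[1::2] = (a - d) / np.sqrt(2)

rmse = lambda x: float(np.sqrt(np.mean((x - clean) ** 2)))
print(f"RMSE noisy: {rmse(noisy):.3f}  denoised: {rmse(den):.3f}")
```

    Mean and median filtering would instead smooth across the step edges; thresholding in the wavelet domain suppresses noise while the edges, which live in large detail coefficients, survive.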

  1. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  2. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others]

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  3. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
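
    The periodogram-like behaviour of such tools can be imitated with a plain least-squares periodogram on unevenly sampled data. A numpy sketch with an invented one-planet signal (this is not the authors' compressed-sensing algorithm, just the Lomb-Scargle-style baseline it resembles):

```python
import numpy as np

rng = np.random.default_rng(3)

# Unevenly sampled "radial velocity" series: one circular-orbit signal
# plus Gaussian noise (all values illustrative).
t = np.sort(rng.uniform(0, 200, 80))      # observation epochs (days)
p_true = 12.5                             # orbital period (days)
rv = 3.0 * np.sin(2 * np.pi * t / p_true) + 0.5 * rng.standard_normal(t.size)

# Least-squares periodogram: for each trial period, fit a sinusoid plus
# offset and record the fraction of variance explained.
periods = np.linspace(2.0, 50.0, 2000)
power = np.empty(periods.size)
for i, p in enumerate(periods):
    w = 2 * np.pi / p
    A = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, rv, rcond=None)
    power[i] = 1.0 - np.sum((rv - A @ coef) ** 2) / np.sum((rv - rv.mean()) ** 2)

p_est = float(periods[np.argmax(power)])
print(f"recovered period: {p_est:.2f} days")
```

    The compressed-sensing approach differs in that it searches for all planets at once and works within a Gaussian-process noise model, which is what suppresses the alias peaks a periodogram like this one would show.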

  4. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  5. Radial velocity data analysis with compressed sensing techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2017-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  6. Theoretical analysis of highly linear tunable filters using Switched-Resistor techniques

    NARCIS (Netherlands)

    Jiraseree-amornkun, Amorn; Jiraseree-Amornkun, A.; Worapishet, Apisak; Klumperink, Eric A.M.; Nauta, Bram; Surakampontorn, Wanlop

    2008-01-01

    Abstract—In this paper, an in-depth analysis of switched-resistor (S-R) techniques for implementing low-voltage low-distortion tunable active-RC filters is presented. The S-R techniques make use of switch(es) with duty-cycle-controlled clock(s) to achieve tunability of the effective resistance and,

  7. Techniques for Surveying Urban Active Faults by Seismic Methods

    Institute of Scientific and Technical Information of China (English)

    Xu Mingcai; Gao Jinghua; Liu Jianxun; Rong Lixin

    2005-01-01

    Using the seismic method to detect active faults directly below cities is an irreplaceable prospecting technique. The seismic method can precisely determine the position of a fault, but by itself it can hardly determine the fault's geological age. However, by considering the borehole data and the standard geological cross-section of the surveyed area together, the geological age of a reflected wave group can be qualitatively (or semi-quantitatively) determined from the seismic depth profile. To determine the upper terminal point of active faults directly below a city, it is necessary to use the high-resolution seismic reflection technique. To effectively determine the geometric features of deep faults, and especially the relation between deep and shallow fracture structures, the seismic reflection method is better than the seismic refraction method.

  8. Application of neutron activation tracer sediment technique on environmental science

    Institute of Scientific and Technical Information of China (English)

    Yin Yi; Zhong Wei-Ni; et al.

    1997-01-01

    Field and laboratory investigations were carried out to study the transport and dispersion of polluted sediments near a wastewater outlet using the neutron activation tracer technique. The direction of transport and dispersion of the polluted sediments, the amount dispersed in different directions, the sedimentary region of the polluted sediment, and an evaluation of pollution risk are given. This provides a new test method for the study of environmental science and adds new predictive content for the evaluation of environmental impact.

  9. Active vibration control techniques for flexible space structures

    Science.gov (United States)

    Parlos, Alexander G.; Jayasuriya, Suhada

    1990-01-01

    Two proposed control system design techniques for active vibration control in flexible space structures are detailed. Control issues relevant only to flexible-body dynamics are addressed, whereas no attempt was made to integrate the flexible and rigid-body spacecraft dynamics. Both of the proposed approaches revealed encouraging results; however, further investigation of the interaction of the flexible and rigid-body dynamics is warranted.

  10. New techniques for emulsion analysis in a hybrid experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodama, K. (Aichi University of Education, Kariya 448 (Japan)); Ushida, N. (Aichi University of Education, Kariya 448 (Japan)); Mokhtarani, A. (University of California (Davis), Davis, CA 95616 (United States)); Paolone, V.S. (University of California (Davis), Davis, CA 95616 (United States)); Volk, J.T. (University of California (Davis), Davis, CA 95616 (United States)); Wilcox, J.O. (University of California (Davis), Davis, CA 95616 (United States)); Yager, P.M. (University of California (Davis), Davis, CA 95616 (United States)); Edelstein, R.M. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Freyberger, A.P. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Gibaut, D.B. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Lipton, R.J. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Nichols, W.R. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Potter, D.M. (Carnegie-Mellon Univers

    1994-08-01

    A new method, called graphic scanning, was developed by the Nagoya University Group for emulsion analysis in a hybrid experiment. This method enhances both speed and reliability of emulsion analysis. Details of the application of this technique to the analysis of Fermilab experiment E653 are described. ((orig.))

  11. 48 CFR 815.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Proposal analysis techniques. 815.404-1 Section 815.404-1 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... techniques. (a) Contracting officers are responsible for the technical and administrative sufficiency of the...

  12. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques generally focus on two things: obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group...

  13. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Directory of Open Access Journals (Sweden)

    Mahmoud I. Al-Kadi

    2013-05-01

    Full Text Available Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are among the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject; it has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades, and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate, to produce a flexible and reliable detection device.
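
    One of the simplest noise-removal steps mentioned, filtering out mains interference before analysis, can be sketched with a hand-rolled windowed-sinc FIR filter. All parameters (250 Hz sampling, a 10 Hz rhythm, 50 Hz mains noise, 30 Hz cutoff) are illustrative:

```python
import numpy as np

fs = 250.0                                 # sampling rate (Hz), illustrative
t = np.arange(0, 4, 1 / fs)

# Synthetic "EEG": a 10 Hz alpha rhythm contaminated by 50 Hz mains noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)

# Windowed-sinc low-pass FIR at 30 Hz: one simple way to suppress mains
# interference while keeping the band used for depth-of-anesthesia indices.
numtaps = 101
fc = 30.0 / fs                             # normalized cutoff (cycles/sample)
k = np.arange(numtaps) - (numtaps - 1) / 2
h = 2 * fc * np.sinc(2 * fc * k) * np.hamming(numtaps)
h /= h.sum()

filtered = np.convolve(eeg, h, mode="same")

def power_at(x, f):
    """Power in the FFT bin closest to frequency f."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    return float(spec[np.argmin(np.abs(freqs - f))])

print(f"50 Hz power before: {power_at(eeg, 50):.2f}, after: {power_at(filtered, 50):.2f}")
```

    Real EEG cleaning pipelines add artifact rejection (eye blinks, muscle activity) on top of such frequency-domain filtering.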

  14. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    Science.gov (United States)

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are among the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject; it has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades, and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate, to produce a flexible and reliable detection device.

  15. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Science.gov (United States)

    Al-Kadi, Mahmoud I.; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-01-01

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are among the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject; it has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades, and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate, to produce a flexible and reliable detection device. PMID:23686141

  16. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    Directory of Open Access Journals (Sweden)

    T. Subramani

    2014-06-01

    Full Text Available This paper presents an overview of the present state of base isolation techniques, with special emphasis on, and a brief review of, other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. A simple case study on natural base isolation using naturally available soils is presented, and future areas of research are indicated. Earthquakes are one of nature's greatest hazards; throughout historic time they have caused significant loss of life and severe damage to property, especially to man-made structures. On the other hand, earthquakes provide architects and engineers with a number of important design criteria foreign to the normal design process. From well-established procedures reviewed by many researchers, seismic isolation may be used to provide an effective solution for a wide range of seismic design problems. The application of base isolation techniques to protect structures against damage from earthquakes has been considered one of the most effective approaches and has gained increasing acceptance during the last two decades. This is because base isolation limits the effects of an earthquake attack: a flexible base largely decouples the structure from the ground motion, and the structural response accelerations are usually less than the ground acceleration. In general, additional viscous damping in the structure may further reduce its displacement and acceleration responses. This study also seeks to evaluate the effects of additional damping on the seismic response, compared with structures without additional damping, for different ground motions.

  17. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
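
    The encryption/decryption scheme under analysis can be sketched numerically: two random phase masks, one in the input plane and one in the Fourier plane, with the Fourier-plane mask acting as the key. A numpy sketch on an invented 32x32 "image":

```python
import numpy as np

rng = np.random.default_rng(4)

# Double random phase encryption on a small random test "image".
img = rng.random((32, 32))                      # plaintext (nonnegative)
m1 = np.exp(2j * np.pi * rng.random((32, 32)))  # input-plane phase mask
m2 = np.exp(2j * np.pi * rng.random((32, 32)))  # Fourier-plane mask (the key)

cipher = np.fft.ifft2(np.fft.fft2(img * m1) * m2)

def decrypt(c, key):
    """Undo the Fourier-plane mask; magnitude removes the input-plane mask."""
    return np.abs(np.fft.ifft2(np.fft.fft2(c) * np.conj(key)))

# The correct key recovers the image; a random wrong key does not.
good = decrypt(cipher, m2)
wrong = decrypt(cipher, np.exp(2j * np.pi * rng.random((32, 32))))

err = lambda x: float(np.mean((x - img) ** 2))
print(f"MSE correct key: {err(good):.2e}  wrong key: {err(wrong):.2e}")
```

    Repeating the wrong-key decryption over many random keys and histogramming the errors is essentially the key-space distribution the paper plots to argue against a simple brute-force attack.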

  18. Electrogastrography: A Noninvasive Technique to Evaluate Gastric Electrical Activity

    OpenAIRE

    Claudia P. Sanmiguel; Mintchev, Martin P.; Bowes, Kenneth L.

    1998-01-01

    Electrogastrography (EGG) is the recording of gastric electrical activity (GEA) from the body surface. The cutaneous signal is low in amplitude and consequently must be amplified considerably. The resultant signal is heavily contaminated with noise, and visual analysis alone of an EGG signal is inadequate. Consequently, EGG recordings require special methodology for acquisition, processing and analysis. Essential components of this methodology involve an adequate system of digital filtering, ...

  19. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  20. Regional environmental analysis and management: New techniques for current problems

    Science.gov (United States)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  1. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  2. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  3. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

4. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté; its execution demands a high level of technique from the performer during rotation. Performing this element requires not only good physical condition of the dancer but also command of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of the fouetté at 720° as performed by the best Chinese dancers. The analysis used the method of stereoscopic images together with theoretical analysis.

  5. A comparison of maximal bioenergetic enzyme activities obtained with commonly used homogenization techniques.

    Science.gov (United States)

    Grace, M; Fletcher, L; Powers, S K; Hughes, M; Coombes, J

    1996-12-01

    Homogenization of tissue for analysis of bioenergetic enzyme activities is a common practice in studies examining metabolic properties of skeletal muscle adaptation to disease, aging, inactivity or exercise. While numerous homogenization techniques are in use today, limited information exists concerning the efficacy of specific homogenization protocols. Therefore, the purpose of this study was to compare the efficacy of four commonly used approaches to homogenizing skeletal muscle for analysis of bioenergetic enzyme activity. The maximal enzyme activity (Vmax) of citrate synthase (CS) and lactate dehydrogenase (LDH) were measured from homogenous muscle samples (N = 48 per homogenization technique) and used as indicators to determine which protocol had the highest efficacy. The homogenization techniques were: (1) glass-on-glass pestle; (2) a combination of a mechanical blender and a teflon pestle (Potter-Elvehjem); (3) a combination of the mechanical blender and a biological detergent; and (4) the combined use of a mechanical blender and a sonicator. The glass-on-glass pestle homogenization protocol produced significantly higher (P < 0.05) enzyme activities compared to all other protocols for both enzymes. Of the four protocols examined, the data demonstrate that the glass-on-glass pestle homogenization protocol is the technique of choice for studying bioenergetic enzyme activity in skeletal muscle.

  6. Neutron Activation Analysis of Inhomogeneous Large Samples; An Explorative Study

    NARCIS (Netherlands)

    Baas, H.W.

    2004-01-01

Neutron activation analysis is a powerful technique for the determination of trace-element concentrations. Since both the neutrons used for activation and the gamma rays that are detected have high penetrating power, the technique can be applied to relatively large samples (up to 13 L), as demonstrated.

  7. Design, data analysis and sampling techniques for clinical research

    OpenAIRE

    Karthik Suresh; Sanjeev V Thomas; Geetha Suresh

    2011-01-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inference from their data analysis. Improper application of study design and data analysis may render insufficient and improper results and conclusion. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design and then back-translating the statistical results into relevant medical knowledge is a real challenge. This article explains...

  8. Analysis On Classification Techniques In Mammographic Mass Data Set

    OpenAIRE

    K.K.Kavitha; Dr.A.Kangaiammal

    2015-01-01

Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques deal with determining the group with which each data instance is associated. They can handle a wide variety of data, so that large amounts of data can be involved in processing. This paper presents an analysis of various data mining classification techniques such a...

9. Determination of the archaeological origin of ceramic fragments characterized by neutron activation analysis, by means of the application of multivariable statistical analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Rodriguez G, N. L. [Instituto Mexiquense de Cultura, Subdireccion de Restauracion y Conservacion, Hidalgo poniente No. 1013, 50080 Toluca, Estado de Mexico (Mexico)

    2009-07-01

The elemental composition of archaeological ceramic fragments obtained during the excavations at San Miguel Ixtapan, Mexico State, was determined by the neutron activation analysis technique. The samples were irradiated in the TRIGA Mark III research reactor with a neutron flux of 1×10¹³ n·cm⁻²·s⁻¹ for 2 hours. Before acquisition of the gamma-ray spectra, the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, Rb. The statistical treatment of the data, consisting of cluster analysis and principal component analysis, allowed the identification of three different origins of the archaeological ceramics, designated as local, foreign and regional. (Author)

  10. Active Ageing: An Analysis

    Directory of Open Access Journals (Sweden)

    Alina-Cristina Nuta

    2011-10-01

Full Text Available The problem of ageing is highly topical for Romania and for the European Union. In this framework, creating and implementing strategies for active ageing is an important objective. International and regional forums have established (supported by official statistics) that the number of older people is growing rapidly. Romania needs programmes (with labour, social, economic and health care aspects) to deal with the demographic changes, programmes that will reform the existing working-life structures and legislation. Despite the current pension reform, which tries to close the opportunity for early retirement (by penalizing total pension flows, increasing the retirement age, etc.), the labour system does not set important targets for this area.

  11. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  12. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
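The age equation described above (accumulated dose divided by annual dose rate) can be sketched together with standard quadrature error propagation; the numbers below are purely illustrative, not from the reported sites.

```python
import math

def luminescence_age(equivalent_dose_gy, annual_dose_gy_per_ka):
    """Age in ka: accumulated (equivalent) dose / annual dose rate."""
    return equivalent_dose_gy / annual_dose_gy_per_ka

def age_uncertainty(de, s_de, dr, s_dr):
    """Combine the relative uncertainties of dose and dose rate in quadrature."""
    age = luminescence_age(de, dr)
    return age * math.sqrt((s_de / de) ** 2 + (s_dr / dr) ** 2)

# Illustrative values: 30 +/- 1.5 Gy accumulated at 2.5 +/- 0.1 Gy/ka
age = luminescence_age(30.0, 2.5)          # -> 12.0 ka
err = age_uncertainty(30.0, 1.5, 2.5, 0.1)
```

Because the dose rate enters as a divisor, its relative uncertainty contributes to the age error on an equal footing with the dose measurement itself, which is why the elemental analyses that determine dose rates matter so much.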

  13. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternate and complementary with respect to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also studied are methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  14. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

It is important to analyse the casting product and the mold at the same time, considering the thermal contraction of the casting and the thermal expansion of the mold. An analysis that accounts for contact between the casting and the mold enables precise prediction of the stress distribution and of defects such as hot tearing. However, it is difficult to generate an FEM mesh for the interface of the casting and the mold. Moreover, the mesh for the mold domain consumes a great deal of computational time and memory due to the large number of elements. Consequently, we propose the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to model the contact between the casting and the mold. In general, the volume of the mold is much bigger than that of the casting part, so the proposed technique greatly decreases the number of mesh elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and it gave satisfactory results.

  15. Image analysis techniques for the study of turbulent flows

    Directory of Open Access Journals (Sweden)

    Ferrari Simone

    2017-01-01

    Full Text Available In this paper, a brief review of Digital Image Analysis techniques employed in Fluid Mechanics for the study of turbulent flows is given. Particularly the focus is on the techniques developed by the research teams the Author worked in, that can be considered relatively “low cost” techniques. Digital Image Analysis techniques have the advantage, when compared to the traditional techniques employing physical point probes, to be non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they allow to obtain two-dimensional or three-dimensional fields of the measured quantity in less time. Traditionally, the disadvantages are related to the frequency of acquisition, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 KHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can be consequently considered as a flexible and powerful tool for measurements on turbulent flows.

  16. Parallelization of events generation for data analysis techniques

    CERN Document Server

    Lazzaro, A

    2010-01-01

    With the startup of the LHC experiments at CERN, the involved community is now focusing on the analysis of the collected data. The complexity of the data analyses will be a key factor for finding eventual new phenomena. For such a reason many data analysis tools have been developed in the last several years, which implement several data analysis techniques. Goal of these techniques is the possibility of discriminating events of interest and measuring parameters on a given input sample of events, which are themselves defined by several variables. Also particularly important is the possibility of repeating the determination of the parameters by applying the procedure on several simulated samples, which are generated using Monte Carlo techniques and the knowledge of the probability density functions of the input variables. This procedure achieves a better estimation of the results. Depending on the number of variables, complexity of their probability density functions, number of events, and number of sample to g...
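The repeated-sample procedure described here can be illustrated with a toy Monte Carlo study (a Gaussian pdf and a mean estimator, chosen purely for illustration). Each pseudo-experiment is generated and fitted independently of the others, which is exactly what makes the workload straightforward to parallelize across cores or nodes.

```python
import numpy as np

rng = np.random.default_rng(42)
true_mu, true_sigma = 5.0, 2.0   # assumed pdf parameters (illustrative)
n_events, n_samples = 1000, 500  # events per sample, number of pseudo-experiments

# Each pseudo-experiment draws events from the assumed pdf and
# re-estimates the parameter; the loop iterations are independent,
# so they could be distributed over many workers without communication.
estimates = np.array([
    rng.normal(true_mu, true_sigma, n_events).mean()
    for _ in range(n_samples)
])

# The spread of the estimates gives the statistical uncertainty of the fit;
# for the sample mean it should approach sigma / sqrt(n_events).
bias = estimates.mean() - true_mu
spread = estimates.std(ddof=1)
```

In a real analysis the estimator would be a likelihood fit over several variables rather than a sample mean, but the structure (generate, fit, repeat, study the estimate distribution) is the same.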

  17. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
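As a minimal illustration of the domain classification proposed in this review, the sketch below computes two statistical-domain measures and one informational-domain measure on a synthetic series of RR intervals (all parameter values are hypothetical, not clinical data).

```python
import numpy as np

def sdnn(rr):
    # Statistical domain: standard deviation of the intervals.
    return np.std(rr, ddof=1)

def rmssd(rr):
    # Statistical domain: root mean square of successive differences.
    return np.sqrt(np.mean(np.diff(rr) ** 2))

def shannon_entropy(rr, bins=10):
    # Informational domain: Shannon entropy of the histogram of intervals.
    counts, _ = np.histogram(rr, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Synthetic RR intervals: 0.8 s mean with small Gaussian variability.
rng = np.random.default_rng(1)
rr = 0.8 + 0.05 * rng.standard_normal(300)
```

Geometric, energetic, and invariant-domain techniques would start from the same time-series but apply a different mathematical transform (e.g., a Poincaré plot, a power spectrum, or a scaling-exponent estimate) before summarizing variability.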

  18. Teaching Computational Geophysics Classes using Active Learning Techniques

    Science.gov (United States)

    Keers, H.; Rondenay, S.; Harlap, Y.; Nordmo, I.

    2016-12-01

We give an overview of our experience in teaching two computational geophysics classes at the undergraduate level. The first class is, for most students, their first programming class and assumes that the students have had an introductory course in geophysics. In this class the students are introduced to basic Matlab skills: use of variables, basic array and matrix definition and manipulation, basic statistics, 1D integration, plotting of lines and surfaces, creation of .m files, and basic debugging techniques. All of these concepts are applied to elementary but important topics in earthquake and exploration geophysics (including epicentre location, computation of travel-time curves for simple layered media, plotting of 1D and 2D velocity models, etc.). It is important to integrate the geophysics with the programming concepts: we found that this enhances students' understanding. Moreover, as this is a 3-year Bachelor program and this class is taught in the 2nd semester, there is little time for a class that focuses only on programming. The second class, which is optional and can be taken in the 4th or 6th semester but is often also taken by Master students, extends the Matlab programming to include signal processing and ordinary and partial differential equations, again with emphasis on geophysics (such as ray tracing and solving the acoustic wave equation). This class also contains a project in which the students have to write a brief paper on a topic in computational geophysics, preferably with programming examples. When teaching these classes we found that active learning techniques, in which the students actively participate in the class, either individually, in pairs or in groups, are indispensable. We give a brief overview of the various activities that we have developed when teaching these classes.

  19. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  20. A new technique of ECG analysis and its application to evaluation of disorders during ventricular tachycardia

    Energy Technology Data Exchange (ETDEWEB)

    Moskalenko, A.V. [Institute of Theoretical and Experimental Biophysics RAS, Institutskaya Street, 3, Pushchino 142290 (Russian Federation)], E-mail: info@avmoskalenko.ru; Rusakov, A.V. [Institute of Theoretical and Experimental Biophysics RAS, Institutskaya Street, 3, Pushchino 142290 (Russian Federation); Elkin, Yu.E. [Institute of Mathematical Problems of Biology RAS, Institutskaya Street, 4, Pushchino 142290 (Russian Federation)

    2008-04-15

    We propose a new technique of ECG analysis to characterize the properties of polymorphic ventricular arrhythmias, potentially life-threatening disorders of cardiac activation. The technique is based on extracting two indices from the ECG fragment. The result is a new detailed quantitative description of polymorphic ECGs. Our observations suggest that the proposed ECG processing algorithm provides information that supplements the traditional visual ECG analysis. The estimates of ECG variation in this study reveal some unexpected details of ventricular activation dynamics, which are possibly useful for diagnosing cardiac rhythm disturbances.

  1. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    Science.gov (United States)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course was introduced at the University of Puget Sound entitled 'Mineral Resources and the Environment'. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects around the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think pair share and take-home point summaries. 
Summative assessment included discussion leadership, exams, homework, group projects, in-class exercises, field trips, and pre-discussion reading exercises.

  2. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

Full Text Available Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting appropriate specific exercises for a target motor ability in judo, it is necessary first to study the structure of the specific judo techniques and the activity of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a starting point for selecting the complex of specific exercises that produces the highest effects. In addition to developing particular muscle groups, the means of specific preparation affect the development of those motor abilities judged indispensable for the qualities characteristic of judo. This paper analyses the relationship between judo techniques and specific motor abilities.

  3. Surface dielectric relaxation: probing technique and its application to thermal activation dynamics of polymer surface.

    Science.gov (United States)

    Ishii, Masashi

    2010-09-01

    For dynamic analyses of a polymer surface, a dielectric relaxation measurement technique with parallel electrodes placed away from the surface was developed. In this technique, a liquid heating medium was filled in the space between the polymer surface and the electrodes. The construction that maintains the surface can clarify the physical interactions between the liquid and the bare surface and controlling the temperature of the liquid reveals the thermal activation property of the surface. The dielectric relaxation spectrum of the surface convoluted into the bulk and liquid spectra can be obtained by a reactance analysis and the surface spectrum is expressed with an equivalent resistance-capacitance parallel circuit. On the basis of the electromechanical analogy, the electric elements can be converted into mechanical elements that indicate the viscoelasticity of the polymer surface. Using these measurement and analysis techniques, the electric and mechanical properties of the surface of a gelatinized chloroprene rubber sample were analyzed.

  4. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically...... is expanded to include both a vector formulation that increases speed considerably, and a new method for the prediction of the variance of the estimated Random Decrement functions. The thesis closes with a number of examples of modal analysis of bridges exposed to natural (ambient) load....
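The connection between Random Decrement functions and free decays can be illustrated with a small sketch: segments of a simulated ambient response are averaged from a level-crossing triggering condition. The oscillator parameters, trigger level, and segment length below are arbitrary choices for illustration, not values from the thesis.

```python
import numpy as np

def random_decrement(x, trigger_level, seg_len):
    """Average segments of x that start where x up-crosses trigger_level
    (a level-crossing triggering condition)."""
    idx = np.where((x[:-1] < trigger_level) & (x[1:] >= trigger_level))[0] + 1
    segs = [x[i:i + seg_len] for i in idx if i + seg_len <= len(x)]
    return np.mean(segs, axis=0), len(segs)

# Synthetic ambient response: a lightly damped oscillator driven by noise,
# integrated with a simple semi-implicit Euler scheme.
rng = np.random.default_rng(0)
fs, f0, zeta = 100.0, 2.0, 0.02
dt, w0 = 1.0 / fs, 2 * np.pi * 2.0
x = np.zeros(20000)
v = 0.0
for k in range(1, x.size):
    a = -2 * zeta * w0 * v - w0 ** 2 * x[k - 1] + rng.standard_normal()
    v += a * dt
    x[k] = x[k - 1] + v * dt

lvl = np.std(x)
rd, n_segs = random_decrement(x, trigger_level=lvl, seg_len=200)
```

Under broadband random loading, the averaged segments `rd` approximate a free decay of the structure, from which modal frequency and damping can then be estimated.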

  5. Data analysis techniques for nuclear and particle physicists

    CERN Document Server

    Pruneau, Claude

    2017-01-01

    This is an advanced data analysis textbook for scientists specializing in the areas of particle physics, nuclear physics, and related subfields. As a practical guide for robust, comprehensive data analysis, it focuses on realistic techniques to explain instrumental effects. The topics are relevant for engineers, scientists, and astroscientists working in the fields of geophysics, chemistry, and the physical sciences. The book serves as a reference for more senior scientists while being eminently accessible to advanced undergraduate and graduate students.

  6. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  7. Multiscale statistical analysis of coronal solar activity

    CERN Document Server

    Gamborino, Diana; Martinell, Julio J

    2016-01-01

Multi-filter images from the solar corona are used to obtain temperature maps, which are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare, and comparing them with quiet regions, we show that the multiscale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.

  8. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
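Commonality analysis partitions the R² of a two-predictor regression into two unique components and one common component using the R² values of the three possible models. The sketch below runs this on synthetic data (variable names and data are illustrative).

```python
import numpy as np

def r_squared(X, y):
    # R^2 from an ordinary least-squares fit with an intercept.
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def commonality_two_predictors(x1, x2, y):
    r2_1 = r_squared(x1[:, None], y)
    r2_2 = r_squared(x2[:, None], y)
    r2_12 = r_squared(np.column_stack([x1, x2]), y)
    unique1 = r2_12 - r2_2       # variance explained only by x1
    unique2 = r2_12 - r2_1       # variance explained only by x2
    common = r2_1 + r2_2 - r2_12  # variance shared by x1 and x2
    return unique1, unique2, common

# Synthetic correlated predictors sharing a component z.
rng = np.random.default_rng(3)
n = 500
z = rng.standard_normal(n)
x1 = z + rng.standard_normal(n)
x2 = z + rng.standard_normal(n)
y = x1 + x2 + rng.standard_normal(n)
u1, u2, c = commonality_two_predictors(x1, x2, y)
```

By construction the three components sum exactly to the full-model R², which is the property that makes commonality analysis more informative than beta weights alone when predictors are correlated.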

  9. Optimization Techniques for Analysis of Biological and Social Networks

    Science.gov (United States)

    2012-03-28

    systematic fashion under a unifying theoretical and algorithmic framework . Optimization, Complex Networks, Social Network Analysis, Computational...analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: Implement the proposed algorithms, test and fine...exact solutions are presented. In [3], we introduce the variable objective search framework for combinatorial optimization. The method utilizes

10. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    Science.gov (United States)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.
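PCA of this kind can be sketched in a few lines. The composition data below are mock values (not the meteorite or protein measurements), used only to show how the first principal component separates two groups of samples with different mean compositions.

```python
import numpy as np

def pca(X, k=2):
    # Center the data, then project onto the top-k right singular vectors.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T
    explained = s[:k] ** 2 / np.sum(s ** 2)  # fraction of variance per PC
    return scores, explained

# Mock data: two groups of "samples", each row a vector of four
# hypothetical abundance measurements.
rng = np.random.default_rng(7)
group_a = rng.normal([5.0, 1.0, 2.0, 0.5], 0.2, size=(20, 4))
group_b = rng.normal([1.0, 4.0, 2.0, 2.0], 0.2, size=(20, 4))
X = np.vstack([group_a, group_b])
scores, explained = pca(X)
```

In a scatter plot of `scores`, the two groups form distinct clusters along the first principal component, which is the kind of separation used to argue that two populations are statistically distinct.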

  11. Improvement of vibration and noise by applying analysis technology. Development of active control technique of engine noise in a car cabin. Kaiseki gijutsu wo oyoshita shindo-soon no kaizen. Shashitsunai engine soon akutibu seigyo gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Uchida, H.; Nakao, N.; Butsuen, T. (Matsuda Motor Corp., Hiroshima (Japan). Technology Research Inst.)

    1994-06-01

It is difficult to reduce engine noise, the principal noise in a car cabin, without adversely affecting low-cost production. An active noise control (ANC) technique has been developed to reduce engine noise while remaining compatible with low-cost production. This paper discusses its control algorithm and system configuration and presents experimental results. The filtered-x least mean square method is a well-known ANC algorithm; however, it often requires an amount of computation exceeding the present capacity of a digital signal processor. An effective ANC algorithm is developed by making use of the repetitiveness of the engine noise. This paper describes the basic theory of the control algorithm, the extension to a multiple-input, multiple-output system, the system configuration and experimental results. A noise control system with three microphones is designed with consideration of the spatial distribution of the noise and reduces noise in the whole cabin by 8 dB(A) in the largest case. The active noise control technique is applicable to many areas and can be used for the reduction of noise and vibration other than engine noise. 5 refs., 7 figs., 1 tab.
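The filtered-x LMS algorithm mentioned above can be sketched for a single tonal engine order (single channel). The secondary path, tone frequency, and step size below are all hypothetical; using the tone's sine/cosine pair as the reference exploits the periodicity of the noise, in the spirit of the paper's approach.

```python
import numpy as np

fs, f0, n = 1000, 60.0, 20000
t = np.arange(n) / fs
d = np.sin(2 * np.pi * f0 * t)  # tonal disturbance at the error microphone
# Reference: sine/cosine pair at the tone frequency (exploits periodicity).
x = np.column_stack([np.sin(2 * np.pi * f0 * t), np.cos(2 * np.pi * f0 * t)])
s = np.array([0.8, 0.3])        # secondary-path FIR taps (assumed known)

# Filtered reference: the reference passed through the secondary-path model.
xf = np.zeros((n, 2))
for k in range(n):
    xf[k] = s[0] * x[k] + (s[1] * x[k - 1] if k > 0 else 0.0)

w = np.zeros(2)                 # adaptive controller weights
mu = 0.01                       # LMS step size (illustrative)
y_buf = np.zeros(2)             # recent controller outputs for the FIR path
e_hist = np.zeros(n)
for k in range(n):
    y = w @ x[k]                          # controller output
    y_buf[1] = y_buf[0]; y_buf[0] = y
    anti = s @ y_buf                      # anti-noise after the secondary path
    e = d[k] - anti                       # residual at the error microphone
    w += mu * e * xf[k]                   # filtered-x LMS weight update
    e_hist[k] = e
```

With an accurate secondary-path model the residual at the error microphone decays toward zero; real systems must also identify the secondary path and handle multiple microphones and engine orders.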

  12. What Child Analysis Can Teach Us about Psychoanalytic Technique.

    Science.gov (United States)

    Ablon, Steven Luria

    2014-01-01

    Child analysis has much to teach us about analytic technique. Children have an innate, developmentally driven sense of analytic process. Children in analysis underscore the importance of an understanding and belief in the therapeutic action of play, the provisional aspects of play, and that not all play will be understood. Each analysis requires learning a new play signature that is constantly reorganized. Child analysis emphasizes the emergence and integration of dissociated states, the negotiation of self-other relationships, the importance of co-creation, and the child's awareness of the analyst's sensibility. Child analysis highlights the robust nature of transference and how working through and repairing is related to the initiation of coordinated patterns of high predictability in the context of deep attachments. I will illustrate these and other ideas in the description of the analysis of a nine-year-old boy.

  13. Developing techniques for cause-responsibility analysis of occupational accidents.

    Science.gov (United States)

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and the role of the groups involved in work-related accidents. This study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques, and for testing them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study involves two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for the determination of responsible groups and responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for the determination of a detailed list of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  14. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items... used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, dairies, fruits...

  15. Design, data analysis and sampling techniques for clinical research.

    Science.gov (United States)

    Suresh, Karthik; Thomas, Sanjeev V; Suresh, Geetha

    2011-10-01

Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inferences from their data. Improper application of study design and data analysis may yield insufficient or improper results and conclusions. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design, and then back-translating the statistical results into relevant medical knowledge, is a real challenge. This article explains various sampling methods that can be appropriately used in medical research in different scenarios and challenges.

  16. Characterization of archaeological ceramics from the north western lowland Maya Area, using the technique of neutron activation analysis; Caracterizacion de ceramicas arqueologicas de las tierras bajas noroccidentales del Area Maya, empleando la tecnica de activacion neutronica

    Energy Technology Data Exchange (ETDEWEB)

    Lopez R, M. C.; Tenorio, D.; Jimenez R, M. [ININ, Carretera Mexico-Toluca s/n, Ocoyoacac 52750, Estado de Mexico (Mexico); Terreros, E. [Museo del Templo Mayor, INAH, Seminario No. 8, Col. Centro, Mexico 06060, D. F. (Mexico); Ochoa, L. [UNAM, Instituto de Investigaciones Antropologicas, Circuito Exterior s/n, Ciudad Universitaria, Mexico 04510, D. F. (Mexico)

    2008-07-01

This is a study of 50 ceramic samples from various archaeological sites of the north western lowland Maya Area. The study was performed by neutron activation analysis of 19 chemical elements and the relevant statistical treatment of the data. Significant differences were found among the pieces, leading to their classification into five major groups, the difference lying in the site of their manufacture and therefore in the raw materials used. (Author)

  17. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of literature, the present paper attempts to provide a concise and schematic portrayal of generally followed data analysis techniques in the field of service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques viz. confirmatory factor analysis, structural equation modeling etc. to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as mean, t-Test, ANOVA and correlation. The marked shift in orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of reviewers of journals. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  18. A comparative study on change vector analysis based change detection techniques

    Indian Academy of Sciences (India)

    Sartajvir Singh; Rajneesh Talwar

    2014-12-01

Detection of Earth surface changes is essential for regional climate monitoring, snow avalanche hazard analysis and energy balance studies of changes that occur due to air temperature irregularities. Geographic Information System (GIS) enables such research activities to be carried out through change detection analysis. From this viewpoint, different change detection algorithms have been developed for land-use land-cover (LULC) regions. Among the different change detection algorithms, change vector analysis (CVA) has a level-headed capability of extracting maximum information, in terms of the overall magnitude of change and the direction of change between multispectral bands, from multi-temporal satellite data sets. Over the past two to three decades, many effective CVA based change detection techniques, e.g., improved change vector analysis (ICVA), modified change vector analysis (MCVA) and change vector analysis posterior-probability space (CVAPS), have been developed to overcome the difficulties that exist in traditional change vector analysis (CVA). Moreover, many integrated techniques, such as cross correlogram spectral matching (CCSM) based CVA, CVA using enhanced principal component analysis (PCA) and the inverse triangular (IT) function, hyper-spherical direction cosine (HSDC), and median CVA (m-CVA), have been developed as effective LULC change detection tools. This paper comprises a comparative analysis of CVA based change detection techniques such as CVA, MCVA, ICVA and CVAPS. This paper also summarizes the necessary integrated CVA techniques along with their characteristics, features and shortcomings. Based on experiment outcomes, it has been evaluated that the CVAPS technique has greater potential than other CVA techniques to evaluate the overall transformed information over three different MODerate resolution Imaging Spectroradiometer (MODIS) satellite data sets of different regions. Results of this study are expected to be potentially useful for more accurate analysis of LULC changes which will, in turn
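The core of traditional CVA described above is simple: per-pixel difference vectors between two dates, summarized by a magnitude and a direction. A minimal sketch on synthetic two-band data (the images, bands and threshold are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
t1 = rng.random((4, 4, 2))            # date-1 image: rows x cols x bands
t2 = t1.copy()
t2[:2, :2, :] += 0.5                  # simulate change in the upper-left block

diff = t2 - t1                        # change vector, one per pixel
magnitude = np.linalg.norm(diff, axis=2)                        # amount of change
direction = np.degrees(np.arctan2(diff[..., 1], diff[..., 0]))  # change direction

changed = magnitude > 0.3             # simple threshold on magnitude
print(changed)
print(direction[0, 0])                # 45 deg: equal change in both bands
```

The variants the paper compares (MCVA, ICVA, CVAPS) refine exactly these two quantities, e.g. by working in a posterior-probability space instead of raw radiance.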

  19. Evaluation of Damping Using Frequency Domain Operational Modal Analysis Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Georgakis, Christos T.; Brincker, Rune

    2015-01-01

Operational Modal Analysis (OMA) techniques provide in most cases reasonably accurate estimates of structural frequencies and mode shapes. In contrast, they are known to often produce uncertain structural damping estimates, which is mainly due to inherent random and/or bias errors... The damping estimates of two frequency domain techniques are evaluated, the Frequency Domain Decomposition (FDD) and the Frequency Domain Polyreference (FDPR). The response of a two degree-of-freedom (2DOF) system is numerically established with specified modal parameters subjected to white noise loading. The system identification is evaluated with well...

  20. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.

  1. Analysis On Classification Techniques In Mammographic Mass Data Set

    Directory of Open Access Journals (Sweden)

    Mrs. K. K. Kavitha

    2015-07-01

Full Text Available Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques deal with determining to which group each data instance is associated. They can deal with a wide variety of data, so that large amounts of data can be involved in processing. This paper deals with analysis of various data mining classification techniques, such as Decision Tree Induction, Naïve Bayes, and k-Nearest Neighbour (KNN) classifiers, on the mammographic mass dataset.

  2. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T, which is shown to be approximated by a χ2 distribution. Application of this test to the results of determinations of manganese in human serum by a method of established precision led to the detection of airborne pollution of the serum during the sampling process. The subsequent improvement in sampling conditions was shown to give not only increased precision, but also improved accuracy of the results.

  3. Golden glazes analysis by PIGE and PIXE techniques

    Science.gov (United States)

    Fonseca, M.; Luís, H.; Franco, N.; Reis, M. A.; Chaves, P. C.; Taborda, A.; Cruz, J.; Galaviz, D.; Fernandes, N.; Vieira, P.; Ribeiro, J. P.; Jesus, A. P.

    2011-12-01

We present the analysis performed on the chemical composition of two golden glazes available in the market using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis on thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  4. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

The ultimate goal of this project is to establish the analysis technique to diagnose the integrity of reactor internals using reactor noise. The reactor noise analysis techniques for the PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentations could be identified, and a noise database corresponding to each plant (both Korean and foreign) was constructed and compared. Also, the changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulation, can be used to establish a knowledge-based expert system to diagnose the NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for the plant IVMS (Internal Vibration Monitoring System). (author)

  5. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
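The probabilistic idea the abstract contrasts with deterministic analysis can be illustrated by Monte Carlo propagation of input uncertainty through a toy solar-array power model. The model, distributions and numbers below are our own illustration, not the SPACE model or ISS data:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Toy model: array power = irradiance * area * efficiency * thermal derating.
irradiance = rng.normal(1367.0, 10.0, N)   # W/m^2, uncertain input
efficiency = rng.normal(0.14, 0.005, N)    # cell efficiency, uncertain input
area = 100.0                               # m^2, treated as fixed
temp_loss = rng.uniform(0.90, 0.98, N)     # thermal derating factor, uncertain

power = irradiance * area * efficiency * temp_loss

# A deterministic run would report one number; the probabilistic run reports a spread:
print(f"mean = {power.mean():.0f} W")
print(f"5th-95th percentile = {np.percentile(power, [5, 95])}")
```

The percentile band is exactly the kind of "variation caused by input uncertainties" that a single-valued deterministic capability number cannot express.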

  6. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  7. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Charlton, William S [Univ. of California, Berkeley, CA (United States)

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
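The Bayesian comparison step, matching a measured isotopic ratio against a database of calculated ratios to infer fuel parameters, can be sketched on a toy grid. The burnup grid, predicted ratios and measurement below are invented for illustration and are not NOVA's database or real reactor data:

```python
import numpy as np

burnups = np.array([10.0, 20.0, 30.0, 40.0])          # GWd/tU grid (hypothetical)
predicted_ratio = np.array([0.30, 0.45, 0.58, 0.70])  # from a reactor model (hypothetical)

measured, sigma = 0.57, 0.02                          # on-stack measurement and noise

# Gaussian likelihood of the measurement at each grid point, flat prior over the grid:
likelihood = np.exp(-0.5 * ((measured - predicted_ratio) / sigma) ** 2)
posterior = likelihood / likelihood.sum()

best = burnups[np.argmax(posterior)]
print(best, posterior.round(3))
```

The inferred burnup is the grid point whose predicted ratio best explains the measurement; with a tight measurement uncertainty the posterior concentrates on a single entry, mirroring how the inferred burnup could be checked against operator declarations.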

  8. Electrogastrography: A Noninvasive Technique to Evaluate Gastric Electrical Activity

    Directory of Open Access Journals (Sweden)

    Claudia P Sanmiguel

    1998-01-01

Full Text Available Electrogastrography (EGG) is the recording of gastric electrical activity (GEA) from the body surface. The cutaneous signal is low in amplitude and consequently must be amplified considerably. The resultant signal is heavily contaminated with noise, and visual analysis alone of an EGG signal is inadequate. Consequently, EGG recordings require special methodology for acquisition, processing and analysis. Essential components of this methodology involve an adequate system of digital filtering, amplification and analysis, along with minimization of the sources of external noise (random motions of the patient, electrode-skin interface impedance, electrode bending, obesity, etc.) and a quantitative interpretation of the recordings. There is a close relationship between GEA and gastric motility. Although it has been demonstrated that EGG satisfactorily reflects internal GEA frequency, there is no acceptable correlation with gastric contractions or gastric emptying. Many attempts have been made to relate EGG 'abnormalities' to clinical syndromes and diseases; however, the diagnostic and clinical value of EGG is still very much in question.
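The quantitative interpretation the abstract calls for usually starts with estimating the dominant frequency of the slow wave, normally near 3 cycles per minute (0.05 Hz). A minimal sketch on a synthetic EGG-like signal (the sampling rate, noise level and band limits are our assumptions, not values from the paper):

```python
import numpy as np

fs = 4.0                            # Hz, a typical EGG sampling rate (assumed)
t = np.arange(0, 600, 1 / fs)       # 10 minutes of signal
rng = np.random.default_rng(3)
egg = np.sin(2 * np.pi * 0.05 * t) + 0.5 * rng.standard_normal(t.size)

# Power spectrum of the de-meaned signal:
spec = np.abs(np.fft.rfft(egg - egg.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

band = (freqs > 0.01) & (freqs < 0.15)            # physiological range (assumed)
dominant = freqs[band][np.argmax(spec[band])]
print(f"dominant frequency = {dominant * 60:.1f} cpm")
```

Restricting the peak search to a physiological band is a crude stand-in for the digital filtering the abstract describes; real recordings also need artifact rejection before this step.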

  9. Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics

    Science.gov (United States)

    Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano

    2017-01-01

    We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses…

  10. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
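Of the condensation techniques the book compares, static (Guyan) condensation is the simplest to show concretely: slave DOFs are eliminated through the stiffness matrix alone. The 3-DOF spring chain below is our own toy example, not one from the book:

```python
import numpy as np

K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])   # stiffness of a 3-DOF spring chain
masters, slaves = [0, 2], [1]        # DOFs kept / eliminated

Kss = K[np.ix_(slaves, slaves)]
Ksm = K[np.ix_(slaves, masters)]

# Full displacements u = T @ u_m, slave DOF expressed statically via the masters:
T = np.zeros((3, len(masters)))
for i, m in enumerate(masters):
    T[m, i] = 1.0
T[slaves, :] = -np.linalg.solve(Kss, Ksm)

K_red = T.T @ K @ T                  # reduced-order stiffness matrix

# Static reduction is exact for loads applied at master DOFs:
f = np.array([1.0, 0.0, 0.0])
u_full = np.linalg.solve(K, f)
u_red = np.linalg.solve(K_red, T.T @ f)
print(K_red)
print(u_full[masters], u_red)
```

Dynamic, SEREP and iterative-dynamic condensation refine the same transformation T so that the reduced model also reproduces inertia effects, which pure Guyan reduction neglects.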

  11. Large areas elemental mapping by ion beam analysis techniques

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-rays Emission (PIXE) measurements, a Gamma-ray detector for Particle Induced Gamma- ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results, using a large (60-cm range) XYZ computer controlled sample positioning system, completely developed and build in our laboratory. The XYZ stage was installed at the external beam line and its high spacial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sort of samples. Due to its particular characteristics, this is a unique device in the sense of multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5) competing with the traditional in-vacuum ion-beam-analysis with the advantage of automatic rastering.

  12. Efficient techniques for genotype-phenotype correlational analysis.

    Science.gov (United States)

    Saha, Subrata; Rajasekaran, Sanguthevar; Bi, Jinbo; Pathak, Sudipta

    2013-04-04

Single Nucleotide Polymorphisms (SNPs) are sequence variations found in individuals at some specific points in the genomic sequence. As SNPs are highly conserved throughout evolution and within a population, the map of SNPs serves as an excellent genotypic marker. Conventional SNP analysis mechanisms suffer from large run times, inefficient memory usage, and frequent overestimation. In this paper, we propose efficient, scalable, and reliable algorithms to select a small subset of SNPs from a large set of SNPs which can together be employed to perform phenotypic classification. Our algorithms exploit the techniques of gene selection and random projections to identify a meaningful subset of SNPs. To the best of our knowledge, these techniques have not been employed before in the context of genotype-phenotype correlations. Random projections are used to project the input data into a lower dimensional space (closely preserving distances). Gene selection is then applied on the projected data to identify a subset of the most relevant SNPs. We have compared the performance of our algorithms with one of the currently known best algorithms, called Multifactor Dimensionality Reduction (MDR), and the Principal Component Analysis (PCA) technique. Experimental results demonstrate that our algorithms are superior in terms of accuracy as well as run time. In our proposed techniques, random projection is used to map data from a high dimensional space to a lower dimensional space, and thus overcomes the curse of dimensionality problem. From this space of reduced dimension, we select the best subset of attributes. It is a unique mechanism in the domain of SNP analysis, and to the best of our knowledge it has not been employed before. As revealed by our experimental results, our proposed techniques offer the potential of high accuracies while keeping the run times low.
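The two ingredients, a distance-preserving random projection and a selection step that ranks SNPs against the phenotype, can be sketched on synthetic genotypes. This is an illustration of the general techniques only; the paper's actual algorithms, scoring and data are not reproduced, and simple univariate correlation stands in for gene selection:

```python
import numpy as np

rng = np.random.default_rng(7)
n, d, k = 200, 1000, 50                    # samples, SNPs, projected dimension

X = rng.integers(0, 3, size=(n, d)).astype(float)   # genotypes coded 0/1/2
y = (X[:, 3] + X[:, 17] > 2).astype(float)          # phenotype driven by SNPs 3 and 17

# Random projection: pairwise distances are roughly preserved (Johnson-Lindenstrauss).
R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
Xp = X @ R
ratio = np.linalg.norm(Xp[0] - Xp[1]) / np.linalg.norm(X[0] - X[1])

# Selection step (univariate stand-in): rank SNPs by |correlation with phenotype|.
Xc, yc = X - X.mean(0), y - y.mean()
scores = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
top = np.argsort(scores)[::-1][:5]
print(round(ratio, 2), sorted(top.tolist()))
```

The distance ratio stays near 1 even though the dimension drops from 1000 to 50, and the two causal SNPs surface at the top of the ranking.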

  13. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  14. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  15. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    Science.gov (United States)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from −125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to −170 °C. The specific heat tests were conducted into the fully molten region up to 370 °C.
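The combination step mentioned above is the standard relation k = α · ρ · c_p. The values below are rough room-temperature literature figures for PTFE, not the paper's measured data:

```python
# Thermal conductivity from laser-flash diffusivity, dilatometry density and DSC
# specific heat: k = alpha * rho * c_p (illustrative round numbers for PTFE).

alpha = 1.1e-7     # thermal diffusivity, m^2/s
rho = 2200.0       # density, kg/m^3
c_p = 1000.0       # specific heat, J/(kg K)

k = alpha * rho * c_p
print(f"k = {k:.3f} W/(m K)")   # on the order of 0.2-0.3 W/(m K), typical for PTFE
```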

  16. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

Full Text Available Calcium hardness of water samples has been determined using a method based upon the Energy Dispersive X-ray Fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found to be in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.

  17. Technique of Hadamard transform microscope fluorescence image analysis

    Institute of Scientific and Technical Information of China (English)

    梅二文; 顾文芳; 曾晓斌; 陈观铨; 曾云鹗

    1995-01-01

    Hadamard transform spatial multiplexed imaging technique is combined with fluorescence microscope and an instrument of Hadamard transform microscope fluorescence image analysis is developed. Images acquired by this instrument can provide a lot of useful information simultaneously, including three-dimensional Hadamard transform microscope cell fluorescence image, the fluorescence intensity and fluorescence distribution of a cell, the background signal intensity and the signal/noise ratio, etc.

  18. Failure Analysis Seminar: Techniques and Teams. Seminar Notes. Volume I.

    Science.gov (United States)

    1981-01-01

and Progress - Evaluate ... Failure Analysis Strategy, by Augustine E. Magistro. Introduction: A primary task of management and systems... by Augustine Magistro, Picatinny Arsenal, and Lawrence R. Seggel, U.S. Army Missile Command. The report is available from the National Technical... to emphasize techniques - identification and improvement of your leadership styles. Biographic sketches: A. E. "Gus" Magistro - Systems Evaluation

  19. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis

    OpenAIRE

    Heidmann, Iris; Schade-Kampmann, Grit; Lambalk, Joep; Ottiger, Marcel; Di Berardino, Marco

    2016-01-01

    Introduction An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a comm...

  20. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  1. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
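    The transfer-function procedure described in this record can be sketched in one dimension: the measured rear-side signal is modeled as the incident energy flux convolved with the system's impulse response, and the flux is recovered by FFT-based deconvolution. The first-order impulse response, time base, and regularization constant below are assumptions for illustration, not the calorimeter's actual transfer function.

```python
import numpy as np

n, dt = 1024, 1e-3
t = np.arange(n) * dt

# Assumed impulse response of the sensor (synthetic first-order lag).
tau = 0.05
h = np.exp(-t / tau)
h /= h.sum()

# Synthetic "true" energy flux: a rectangular beam pulse.
flux = np.where((t > 0.1) & (t < 0.3), 1.0, 0.0)

# Measured signal = convolution of the flux with the impulse response.
meas = np.real(np.fft.ifft(np.fft.fft(flux) * np.fft.fft(h)))

# Transfer-function inversion: divide spectra, with a small Tikhonov-style
# term eps guarding against near-zero spectral components (noise handling,
# as the record discusses).
H = np.fft.fft(h)
eps = 1e-6
flux_rec = np.real(np.fft.ifft(
    np.fft.fft(meas) * np.conj(H) / (np.abs(H) ** 2 + eps)))
```

In practice eps (or an equivalent low-pass cutoff) is tuned against the measured noise floor; too small amplifies noise, too large smears the reconstructed beamlets.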

  2. Chromatographic Fingerprint Analysis of Marrubiin in Marrubium vulgare L. via HPTLC Technique

    OpenAIRE

    Keyvan Yousefi; Sanaz Hamedeyazdan; Mohammadali Torbati; Fatemeh Fathiazad

    2016-01-01

    Purpose: In the present study we aimed to quantify marrubiin, as the major active compound, in the aerial parts of Marrubium vulgare from Iran using a HPTLC-densitometry technique. Methods: Quantitative determination of marrubiin in M. vulgare methanol extract was performed by HPTLC analysis via a fully automated TLC scanner. Later on, the in vitro antioxidant activity of the M. vulgare methanol extract was determined using 1,1-diphenyl-2-picryl-hydrazil (DPPH) free radic...

  3. Innovative techniques to analyze time series of geomagnetic activity indices

    Science.gov (United States)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos

    2016-04-01

    Magnetic storms are undoubtedly among the most important phenomena in space physics and also a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization. Other entropy measures such as Block Entropy, T-Complexity, Approximate Entropy, Sample Entropy and Fuzzy Entropy verify the above-mentioned result. Importantly, the wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, which is characterized by fractional Brownian anti-persistent behavior. Finally, we observe universality in the magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for the Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support the aforementioned proposal.
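    The non-extensive Tsallis entropy used here is S_q = (1 - sum_i p_i^q) / (q - 1) for a discrete distribution p. A minimal sketch follows; the two toy distributions and the entropic index q are illustrative choices, not values from the study, but they reproduce the qualitative contrast: the more organized (peaked) "storm-like" distribution yields lower entropy than the disordered "normal-period" one.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Non-extensive Tsallis entropy of a discrete distribution p (sums to 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log-like terms contribute nothing
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

q = 1.8  # entropic index; illustrative value, not from the paper

p_normal = np.full(8, 1.0 / 8.0)           # flat: low organization
p_storm = np.array([0.86] + [0.02] * 7)    # peaked: high organization

s_normal = tsallis_entropy(p_normal, q)
s_storm = tsallis_entropy(p_storm, q)
# s_normal > s_storm: lower organization maps to higher entropy
```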

  4. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed and is stimulated by the power of computers and microprocessors, which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied to trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. Actually, in order to increase the responses and improve the selectivity, solid electrodes are the subject of rapidly growing research dedicated to surface modifications. Perm-selectivity, chelation, catalysis, etc. may be considered as appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single-cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the

  5. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature and anonymity abuses in cyberspace, it is difficult to trace criminal identities in cybercrime investigations. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG-seeded GA-based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author-level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed feature and technique are evaluated on a real-world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, it has been shown that IGAE is more scalable, in terms of the number of authors, than author-group-level based methods.
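    The variable-length character n-gram representation at the core of this framework can be sketched as follows: each text is mapped to a normalized frequency profile over all character n-grams of several lengths. The length range and sample text are illustrative assumptions; the IG-seeded GA feature selection and the ensemble classifier are beyond this sketch.

```python
from collections import Counter

def char_ngrams(text, n_min=1, n_max=3):
    """Relative-frequency profile of character n-grams of lengths n_min..n_max."""
    text = text.lower()
    counts = Counter()
    for n in range(n_min, n_max + 1):
        for i in range(len(text) - n + 1):
            counts[text[i:i + n]] += 1
    total = sum(counts.values())
    return {gram: c / total for gram, c in counts.items()}

# Hypothetical snippet standing in for one author's review text.
profile = char_ngrams("The quick brown fox jumps over the lazy dog")
# profile["th"] is the relative frequency of the bigram "th" in the profile
```

Profiles like this become feature vectors (one dimension per retained n-gram) that a classifier such as SVM, or the IGAE ensemble of the study, can compare across candidate authors.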

  6. Comparative analysis of affinity-based 5-hydroxymethylation enrichment techniques

    Science.gov (United States)

    Thomson, John P.; Hunter, Jennifer M.; Nestor, Colm E.; Dunican, Donncha S.; Terranova, Rémi; Moggs, Jonathan G.; Meehan, Richard R.

    2013-01-01

    The epigenetic modification of 5-hydroxymethylcytosine (5hmC) is receiving great attention due to its potential role in DNA methylation reprogramming and as a cell state identifier. Given this interest, it is important to identify reliable and cost-effective methods for the enrichment of 5hmC marked DNA for downstream analysis. We tested three commonly used affinity-based enrichment techniques; (i) antibody, (ii) chemical capture and (iii) protein affinity enrichment and assessed their ability to accurately and reproducibly report 5hmC profiles in mouse tissues containing high (brain) and lower (liver) levels of 5hmC. The protein-affinity technique is a poor reporter of 5hmC profiles, delivering 5hmC patterns that are incompatible with other methods. Both antibody and chemical capture-based techniques generate highly similar genome-wide patterns for 5hmC, which are independently validated by standard quantitative PCR (qPCR) and glucosyl-sensitive restriction enzyme digestion (gRES-qPCR). Both antibody and chemical capture generated profiles reproducibly link to unique chromatin modification profiles associated with 5hmC. However, there appears to be a slight bias of the antibody to bind to regions of DNA rich in simple repeats. Ultimately, the increased specificity observed with chemical capture-based approaches makes this an attractive method for the analysis of locus-specific or genome-wide patterns of 5hmC. PMID:24214958

  7. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    One of the difficulties that goal-oriented requirements analysis encounters is that the efficiency of goal refinement depends on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systematized analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integrating goal and problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  8. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail, and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified as follows: (A) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data; (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
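    The RSS combination step described here can be sketched directly: each single-error-source run yields a deviation vector of the state at MECO, independent 3-sigma sources are combined by root-sum-square, and the same per-source deviations form a covariance matrix. The deviation values and source labels below are hypothetical, purely for illustration.

```python
import numpy as np

# Each row: state deviation (e.g. position [m], velocity [m/s]) at MECO from
# simulating one 3-sigma error source alone. Values are hypothetical.
deviations = np.array([
    [120.0, 0.8],   # source 1, e.g. thrust dispersion
    [60.0, 0.3],    # source 2, e.g. Isp dispersion
    [90.0, 0.5],    # source 3, e.g. winds
])

# Root-sum-square combination across independent error sources.
rss = np.sqrt(np.sum(deviations ** 2, axis=0))

# Covariance matrix of the combined deviations (independent sources):
# its diagonal equals rss**2, its off-diagonals capture coupling between
# state components induced by the shared sources.
cov = deviations.T @ deviations
```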

  9. Arc-length technique for nonlinear finite element analysis

    Institute of Scientific and Technical Information of China (English)

    MEMON Bashir-Ahmed; SU Xiao-zu(苏小卒)

    2004-01-01

    Nonlinear solution of reinforced concrete structures, particularly the complete load-deflection response, requires tracing of the equilibrium path and proper treatment of the limit and bifurcation points. In this regard, ordinary solution techniques lead to instability near the limit points and also have problems in cases of snap-through and snap-back. Thus they fail to predict the complete load-displacement response. The arc-length method serves the purpose well in principle, has received wide acceptance in finite element analysis, and has been used extensively. However, modifications to the basic idea are vital to meet the particular needs of the analysis. This paper reviews some of the recent developments of the method in the last two decades, with particular emphasis on nonlinear finite element analysis of reinforced concrete structures.
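    The basic arc-length idea can be sketched on a one-degree-of-freedom toy problem (an assumed cubic internal force with limit points, not an example from the paper): instead of prescribing the load, each step constrains the combined displacement-load increment to a fixed arc length, so the solver passes through the limit point where a load-controlled Newton scheme would fail.

```python
import numpy as np

# Toy single-DOF system with a limit point (dlambda/du = 0 at u = 1/sqrt(3)):
# internal force f(u) = u**3 - u, unit reference load. Assumed illustration.

def f(u):
    return u ** 3 - u

def df(u):
    return 3.0 * u ** 2 - 1.0

dl = 0.05                      # arc-length increment
u, lam = 0.0, 0.0              # start on the equilibrium path lam = f(u)
du_prev, dlam_prev = 1.0, 0.0  # initial continuation direction
path = [(u, lam)]

for _ in range(120):
    # Predictor: step of length dl along the previous secant direction.
    norm = np.hypot(du_prev, dlam_prev)
    u1 = u + dl * du_prev / norm
    lam1 = lam + dl * dlam_prev / norm
    # Corrector: Newton on equilibrium residual + arc-length constraint.
    for _ in range(30):
        r1 = lam1 - f(u1)                              # equilibrium
        r2 = (u1 - u) ** 2 + (lam1 - lam) ** 2 - dl ** 2  # arc constraint
        if abs(r1) < 1e-10 and abs(r2) < 1e-10:
            break
        J = np.array([[-df(u1), 1.0],
                      [2.0 * (u1 - u), 2.0 * (lam1 - lam)]])
        step = np.linalg.solve(J, [-r1, -r2])
        u1 += step[0]
        lam1 += step[1]
    du_prev, dlam_prev = u1 - u, lam1 - lam
    u, lam = u1, lam1
    path.append((u, lam))
```

The traced path follows the load down through the limit point and back up, which is exactly the complete load-displacement response that load control alone cannot capture.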

  10. Empirical Analysis of Data Mining Techniques for Social Network Websites

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2014-02-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. The social networks contain millions of items of unprocessed raw data. By analyzing this data, new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques are not appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the data content available, because profile holders become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  11. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2015-11-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. The social networks contain millions of items of unprocessed raw data. By analyzing this data, new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques are not appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the data content available, because profile holders become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  12. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on the web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and making an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper presents experimental work in which web log data is used. We have taken the web log data from the "NASA" web server, which is analyzed with "Web Log Explorer". Web Log Explorer is a web usage mining tool which plays a vital role in carrying out this work.

  13. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...
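    The core merging rule of such a segmentation, joining neighboring elements whose signals are statistically compatible within their errors, can be sketched in one dimension. The k-sigma compatibility test, inverse-variance combination, and toy data below are simplifying assumptions for illustration; BaTMAn's actual Bayesian criterion and spatial tessellation are more elaborate.

```python
import numpy as np

def merge_compatible(values, errors, k=3.0):
    """Greedy 1-D sketch: merge adjacent elements whose signals agree within
    k sigma of their combined errors, using inverse-variance weighting."""
    segs = [(v, e) for v, e in zip(values, errors)]
    merged = True
    while merged:
        merged = False
        out = []
        for v, e in segs:
            if out and abs(out[-1][0] - v) < k * np.hypot(out[-1][1], e):
                v0, e0 = out[-1]
                w0, w = 1.0 / e0 ** 2, 1.0 / e ** 2
                out[-1] = ((w0 * v0 + w * v) / (w0 + w),
                           1.0 / np.sqrt(w0 + w))  # combined value, error
                merged = True
            else:
                out.append((v, e))
        segs = out
    return segs

# Two flat regions at clearly distinct levels stay as two segments.
vals = [1.0, 1.02, 0.98, 5.0, 5.01, 4.99]
errs = [0.1] * 6
segments = merge_compatible(vals, errs)
```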

  14. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    Science.gov (United States)

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches like Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis therefore motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA) for clustering categorical data using rough set indiscernibility relations is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344

  15. An Empirical Analysis of Rough Set Categorical Clustering Techniques.

    Science.gov (United States)

    Uddin, Jamal; Ghazali, Rozaida; Deris, Mustafa Mat

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches like Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis therefore motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA) for clustering categorical data using rough set indiscernibility relations is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy.
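    Two of the evaluation measures named in this record, purity and entropy, are standard external cluster-quality indices and can be sketched directly: purity is the fraction of objects belonging to the majority class of their cluster, and entropy is the size-weighted average of per-cluster label entropies. The toy labels below are hypothetical.

```python
import numpy as np
from collections import Counter

def purity(clusters):
    """clusters: list of lists of ground-truth labels, one list per cluster."""
    n = sum(len(c) for c in clusters)
    return sum(Counter(c).most_common(1)[0][1] for c in clusters) / n

def entropy(clusters):
    """Size-weighted average of per-cluster label entropies, in bits."""
    n = sum(len(c) for c in clusters)
    total = 0.0
    for c in clusters:
        probs = np.array(list(Counter(c).values())) / len(c)
        total += (len(c) / n) * -np.sum(probs * np.log2(probs))
    return total

# Hypothetical clustering of 7 objects with classes "a", "b", "c".
clusters = [["a", "a", "a", "b"], ["b", "b", "c"]]
p = purity(clusters)   # (3 + 2) / 7
e = entropy(clusters)
```

Higher purity and lower entropy both indicate clusters that align better with the ground-truth classes.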

  16. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    Science.gov (United States)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scale information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, as to how high-frequency and accurate an estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e. up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates. The pre-filtering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable
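    The slotting idea referenced above can be sketched as follows: every pair of samples contributes the product of its velocity fluctuations to a "slot" chosen by the pair's time lag, and slot averages estimate the autocorrelation (whose Fourier transform then gives the spectrum). The Poisson-sampled sine wave, slot width, and normalization below are illustrative choices, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def slotted_autocorr(t, u, dt_slot, n_slots):
    """Slotted autocorrelation of unevenly sampled fluctuation data u(t)."""
    u = u - u.mean()
    num = np.zeros(n_slots)
    cnt = np.zeros(n_slots)
    for i in range(len(t)):
        lag = np.abs(t - t[i])
        k = (lag / dt_slot + 0.5).astype(int)  # index of the nearest slot
        ok = k < n_slots
        np.add.at(num, k[ok], u[i] * u[ok])    # accumulate lagged products
        np.add.at(cnt, k[ok], 1)
        # (pairs are counted twice and self-pairs fall in slot 0; both are
        # harmless for the slot averages in this sketch)
    R = num / np.maximum(cnt, 1)
    return R / R[0]  # normalize so R(0) = 1

# Synthetic Poisson-sampled signal: 2 Hz sine at ~100 Hz mean data rate.
t = np.cumsum(rng.exponential(0.01, 2000))
u = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)
R = slotted_autocorr(t, u, dt_slot=0.01, n_slots=40)
# R dips strongly negative near the half period (lag 0.25 s, slot 25)
```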

  17. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  18. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 milliseconds and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  19. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the result of experimental investigation using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also want to introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive that can be applied in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian Textile (Genghis Khan and Kublai Khan Period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creating a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between 12th and 13th century. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  20. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  1. Dynamic Range Analysis of the Phase Generated Carrier Demodulation Technique

    Directory of Open Access Journals (Sweden)

    M. J. Plotnikov

    2014-01-01

    Full Text Available The dependence of the dynamic range of the phase generated carrier (PGC technique on low-pass filters passbands is investigated using a simulation model. A nonlinear character of this dependence, which could lead to dynamic range limitations or measurement uncertainty, is presented for the first time. A detailed theoretical analysis is provided to verify the simulation results and these results are consistent with performed calculations. The method for the calculation of low-pass filters passbands according to the required dynamic range upper limit is proposed.

  2. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures, sponsored by the Danish Technical Research Council. The planned contents and the requirements for the project prior to its start are described, together with the results obtained during the 3-year period of the project. The project was mainly carried out as a Ph.D. project by the first author from September 1994 to August 1997 in cooperation with associate professor Rune

  3. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light of a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide until the point where the amplitude of surface waves is large enough to scatter out the light from the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and also statistical analysis of the jet disruption is possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  4. Atmospheric Deposition of Heavy Metals around the Lead and Copper-Zinc Smelters in Baia Mare, Romania, Studied by the Moss Biomonitoring Technique, Neutron Activation Analysis and Flame Atomic Absorption Spectrometry

    CERN Document Server

    Culicov, O A; Steinnes, E; Okina, O S; Santa, Z; Todoran, R

    2002-01-01

    The mosses Pleurozium schreberi, Pseudoscleropodium purum and Rhytidiadelphus squarrosus were used as biomonitors to study the atmospheric deposition of heavy metals around the lead and copper-zinc smelters in Baia Mare. Samples representing the last three years' growth of moss or its green part, collected on the ground at 28 sites located 2-17 km from the source area, were analyzed by instrumental neutron activation analysis using epithermal neutrons and by flame atomic absorption spectrometry. A total of 31 elements were determined, including most of the heavy metals characteristic of emissions from this kind of industry. The observed data for Pb, As, Cu, and Cd are all high compared with those observed in other regions of Europe with similar industries, but the concentrations in moss approach regional background levels at a distance of about 8 km from the main source area. Factor analysis of the data distinguishes two industrial components, one characterized by Pb, Cu, As, and Sb, and another one by Zn and Cd...

  5. Golden glazes analysis by PIGE and PIXE techniques

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, M., E-mail: mmfonseca@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Luis, H., E-mail: heliofluis@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Franco, N., E-mail: nfranco@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Reis, M.A., E-mail: mareis@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Chaves, P.C., E-mail: cchaves@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Taborda, A., E-mail: galaviz@cii.fc.ul.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Cruz, J., E-mail: jdc@fct.unl.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Galaviz, D., E-mail: ataborda@itn.pt [Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Dept. Fisica, Faculdade de Ciencias, Universidade de Lisboa, Lisboa (Portugal); and others

    2011-12-15

    We present the analysis performed on the chemical composition of two golden glazes available on the market using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis of thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  6. Improvements in analysis techniques for segmented mirror arrays

    Science.gov (United States)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, allowing surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement-vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking analysis has the advantage of being comparable to the measurements used in assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  7. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Decision making, on both the individual and the organizational level, is always accompanied by a search for others' opinions. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve as a benefaction to the market if its semantic orientations are deliberated. Opinion mining and sentiment analysis are the formalizations for studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for the use of the huge volume of opinionated data recorded. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.
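As a minimal illustration of the kind of technique such surveys cover, a lexicon-based scorer can be sketched as follows; the toy lexicon and the simple negation rule are invented here for demonstration, not taken from the paper:

```python
# Toy sentiment lexicon (hypothetical; real systems use large curated lexicons).
POSITIVE = {"good", "great", "excellent", "love", "helpful"}
NEGATIVE = {"bad", "poor", "terrible", "hate", "useless"}
NEGATORS = {"not", "never", "no"}

def sentiment_score(text: str) -> int:
    """Sum +1 per positive word and -1 per negative word, flipping the
    sign of the word that immediately follows a negator ("not good" -> -1)."""
    score, negate = 0, False
    for token in text.lower().split():
        word = token.strip(".,!?;:")
        if word in NEGATORS:
            negate = True
            continue
        delta = (word in POSITIVE) - (word in NEGATIVE)
        score += -delta if negate else delta
        negate = False
    return score
```

A positive total suggests positive orientation; real opinion-mining systems add part-of-speech handling, intensifiers, and machine-learned classifiers on top of this basic idea.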

  8. Recovering prehistoric woodworking skills using spatial analysis techniques

    Science.gov (United States)

    Kovács, K.; Hanke, K.

    2015-08-01

    Ancient woodworking skills can be recovered through the simultaneous documentation and analysis of tangible evidence such as the geometric parameters of prehistoric hand tools or the fine morphological characteristics of well-preserved wooden archaeological finds. In this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed within this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these objects was also inferred, and these woodworking skills could be quantified in the case of a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for related intangible heritage interpretations in the future.

  9. BaTMAn: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.

  10. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the instantaneous rockburst process in granite. Based on the PIV (Particle Image Velocimetry) technique, quantitative analysis of a rockburst is possible: images of tracer particles and the displacement and strain fields can be obtained, and the debris trajectory described. According to observations from on-site tests, a dynamic rockburst is actually a gas-solid high-speed flow process caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high-speed video and PIV images, the granite rockburst failure process is shown to be composed of six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.
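The core of PIV is finding the displacement that best matches tracer-particle patterns between two frames. A one-dimensional cross-correlation sketch of that matching step (real PIV correlates 2-D interrogation windows, usually via FFT; this simplified version is mine, not the authors') might look like:

```python
def best_shift(frame_a, frame_b, max_shift):
    """Integer displacement that maximizes the cross-correlation of two
    1-D intensity profiles -- a minimal analogue of a PIV window match."""
    best, best_score = 0, float("-inf")
    n = len(frame_a)
    for s in range(-max_shift, max_shift + 1):
        # Correlate frame_a against frame_b shifted by s samples.
        score = sum(frame_a[i] * frame_b[i + s]
                    for i in range(n) if 0 <= i + s < n)
        if score > best_score:
            best, best_score = s, score
    return best
```

Dividing the detected shift by the inter-frame time gives the local velocity; doing this over a grid of windows yields the displacement and strain fields mentioned above.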

  11. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
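A global, sampling-based sensitivity measure of the sort contrasted above with traditional local methods can be sketched as a simple correlation ranking; this is an illustrative stand-in, not one of the specific methods tested in the report:

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def correlation_sensitivity(model, bounds, n=2000, seed=0):
    """Sample the whole input space uniformly and rank inputs by the
    absolute correlation of each input with the model output. Unlike
    one-at-a-time perturbation around a base point, this reflects
    behaviour across the full input ranges."""
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    ys = [model(x) for x in xs]
    return [abs(pearson([x[j] for x in xs], ys)) for j in range(len(bounds))]
```

Correlation-based rankings capture only linear (monotonic) effects; the interaction-aware global methods highlighted in the abstract (e.g. variance-based indices) go further by decomposing output variance across input combinations.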

  12. Pressure transient analysis for long homogeneous reservoirs using TDS technique

    Energy Technology Data Exchange (ETDEWEB)

    Escobar, Freddy Humberto [Universidad Surcolombiana, Av. Pastrana - Cra. 1, Neiva, Huila (Colombia); Hernandez, Yuly Andrea [Hocol S.A., Cra. 7 No 114-43, Floor 16, Bogota (Colombia); Hernandez, Claudia Marcela [Weatherford, Cra. 7 No 81-90, Neiva, Huila (Colombia)

    2007-08-15

    A significant number of well pressure tests are conducted in long, narrow reservoirs with closed and open extreme boundaries. It is desirable not only to identify these types of systems appropriately but also to develop an adequate and practical interpretation technique to determine their parameters and, when possible, their size. An accurate understanding of how the reservoir produces and of the magnitude of producible reserves can lead to competent decisions and adequate reservoir management. So far, studies found on the identification and determination of parameters for such systems rely on conventional techniques (semilog analysis) and on semilog and log-log type-curve matching of pressure versus time. Type-curve matching is basically a trial-and-error procedure which may provide inaccurate results, and the limited number of available type curves plays a negative role. In this paper, a detailed analysis of pressure derivative behavior for a vertical well in linear reservoirs with open and closed extreme boundaries is presented for the case of constant-rate production. We studied each flow regime independently, especially the linear flow regime, since it is the most characteristic 'fingerprint' of these systems. We found that when the well is located at one of the extremes of the reservoir, a single linear flow regime develops once radial flow and/or wellbore storage effects have ended. When the well is located at a given distance from both extreme boundaries, the pressure derivative permits the identification of two linear flows toward the well; this has been called the 'dual-linear flow regime'. It is characterized by an increase of the intercept of the 1/2-slope line from π^0.5 to π, with a consequent transition between these two straight lines. The identification of intersection points, lines, and characteristic slopes allows us to develop an interpretation technique without employing type-curve matching. This technique uses
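The TDS approach works on the log-log plot of the pressure derivative t·dΔP/dt, reading characteristic slopes (1/2 for linear flow) and intercepts directly instead of matching type curves. A minimal sketch of computing that derivative and a log-log slope follows; it is illustrative only and does not reproduce the authors' specific equations:

```python
import math

def pressure_derivative(times, pressures):
    """t * dDeltaP/dt via central differences in ln(t) -- the quantity
    plotted on the log-log diagnostic plot used by the TDS method."""
    out = []
    for i in range(1, len(times) - 1):
        dlnt = math.log(times[i + 1]) - math.log(times[i - 1])
        out.append((times[i], (pressures[i + 1] - pressures[i - 1]) / dlnt))
    return out

def loglog_slope(p1, p2):
    """Slope between two derivative points on log-log axes; linear flow
    shows up as a 1/2 slope."""
    (t1, d1), (t2, d2) = p1, p2
    return (math.log(d2) - math.log(d1)) / (math.log(t2) - math.log(t1))
```

For a synthetic linear-flow response ΔP ∝ √t the derivative also grows as √t, so the diagnostic slope comes out as 1/2, which is the 'fingerprint' the abstract refers to.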

  13. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method versus the linear interpolation scheme in the analysis of highly irregular time series. For the cross-correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods in the univariate case, but poorer results in the bivariate case. In particular the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
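The kernel-based estimator favoured in this comparison weights every observation pair by how close its time separation is to the target lag, so no interpolation onto a regular grid is needed. A naive O(N²) sketch for a zero-mean series follows (illustrative only, not the benchmarked implementation):

```python
import math

def kernel_acf(times, values, lag, h):
    """Gaussian-kernel estimate of the autocorrelation at a given lag for
    an irregularly sampled, (approximately) zero-mean series: each pair
    (i, j) contributes v_i * v_j weighted by how close t_j - t_i is to the
    target lag; h is the kernel bandwidth."""
    num = den = 0.0
    var = sum(v * v for v in values) / len(values)
    for i in range(len(times)):
        for j in range(len(times)):
            if i == j:
                continue
            w = math.exp(-0.5 * ((times[j] - times[i] - lag) / h) ** 2)
            num += w * values[i] * values[j]
            den += w
    return num / (den * var)
```

Because the weights depend only on actual inter-observation gaps, the estimator avoids the interpolation bias discussed above; bandwidth choice trades variance against lag resolution.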

  14. The effects of communication techniques on public relation activities: A sample of hospitality business

    Directory of Open Access Journals (Sweden)

    Şirvan Şen Demir

    2011-07-01

    Nowadays, the number of firms that give importance to public relations is increasing rapidly. Modern firms either establish a public relations department within their organization or outsource this activity to consultants in order to communicate with their target populations. Among firms in the tourism sector, hospitality companies are the ones that use public relations the most. The purpose of this study is to investigate the communication techniques used in public relations and the effects of these techniques on public relations activities. A literature review was conducted to build the research model, and a questionnaire was then developed from studies in the literature. Data were collected by the researchers in face-to-face interviews with 145 supervisors responsible for the public relations activities of their hotels, and were analyzed with SPSS statistical software. The structural and convergent validity of the data were established with exploratory factor analysis. Regression analysis was used to determine the effects of the independent variables on the dependent variables. As a result, the independent variables were found to have positive effects on the dependent variables.

  15. Accelerator human interface. 4. Object analysis by OMT technique

    Energy Technology Data Exchange (ETDEWEB)

    Abe, Isamu; Nakahara, Kazuo [National Lab. for High Energy Physics, Tsukuba, Ibaraki (Japan); Mutoh, Masakatsu; Shibasaki, Yoshinobu

    1995-07-01

    The objects of various classes in the accelerator domain were analyzed by the OMT technique, and a summary is reported. By changing from the conventional procedural approach to an object-oriented one and reconsidering accelerator control, it becomes possible to make a large impact on the development of software and accelerator control. The importance of establishing a fundamental object system, and its outline, are also described. The original objects of accelerators change little over time compared with the computer environment. Accordingly, by extracting the standard objects as far as possible, and making the objects able to cope with control systems and multi-platform environments as they evolve, the problem of control systems becoming outdated can be clearly solved. The establishment of optimal objects shows the possibility of presenting an optimal control system, and object analysis brings many merits. OMT was adopted here. The accelerator control domain is divided into a device class and a generic task class, and the latter is divided into operation, diagnosis, operation support, simulation, database, indication system and maintenance classes. The abstraction of data and procedures, the inheritance between devices, and the behavior of objects are described. (K.I.)

  16. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have proceeded in the past using mathematical analysis that was very capable of phenomenological prediction but poorly grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained, first using established equations of transport phenomena. Then, microscopic- and molecular-level modeling techniques are described using porous media theory and chemical kinetic theory, and applied to cerebrospinal fluid (CSF) dynamics. Using techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.
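Porous media theory enters such models through relations like Darcy's law, which links volumetric flow to the pressure gradient across a permeable medium. A unit-annotated sketch follows (generic Darcy's law, not the paper's specific CSF model):

```python
def darcy_flow(k, mu, area, dp, length):
    """Volumetric flow rate through a porous medium by Darcy's law:
        Q = k * A * dP / (mu * L)
    with permeability k [m^2], dynamic viscosity mu [Pa s],
    cross-section A [m^2], pressure drop dP [Pa], and length L [m].
    Returns Q in m^3/s."""
    return k * area * dp / (mu * length)
```

Tortuosity and compartmentalization, mentioned above, effectively reduce the permeability k seen by the fluid relative to an open channel of the same cross-section.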

  17. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    Science.gov (United States)

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded speckled intensity distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their cores' structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture from the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography over an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.

  18. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  19. An effective technique for isolating adult activated Schwann cells

    Institute of Scientific and Technical Information of China (English)

    Jifei Zhang; Lianhong Jin; Yuzhen Zhao

    2006-01-01

    BACKGROUND: Schwann cells (SCs) are the neuroglial cells of peripheral nerve and play a key role in repairing peripheral nerve injury. An effective way of obtaining adult SCs characterized by active proliferation and high purity in vitro therefore provides important support for clinical SC transplantation after nerve injury, and opens a new therapeutic approach to nerve injury. OBJECTIVE: To investigate an effective technique for isolating adult activated Schwann cells. DESIGN: Controlled observational study. SETTING: Mudanjiang Medical College. MATERIALS: The experiment was completed at the Department of Medical Genetics of Harbin Medical University from March 2003 to April 2005. Healthy female Wistar rats, aged 2 months and weighing 150-160 g, were randomly divided into 3 groups with 5 in each group. METHODS: The right sciatic nerves of 15 Wistar rats were exposed and transected at the mid-thigh under pentobarbital anesthesia (4 mg/kg, i.p.). Seven days later, the distal segments of the predegenerated nerves were removed and used to produce adult Schwann cell cultures. The distal segment of the predegenerated nerve, 20 mm in length, was resected. The nerve was cut into pieces 1 mm in length and incubated for 3 hours under CO2 at 37 ℃ with an enzyme mixture of 0.05% collagenase/dispase. Rats were divided into 3 groups: ① Group 1: The nerve fragments were explanted in poly-L-lysine- and laminin-coated dishes with BS medium from the 1st to the 6th day. On the 6th day, the fragments were moved into a new poly-L-lysine-laminin-coated dish and the BS medium was changed to BS with 10% FBS. The nerve fragments were replaced repeatedly in the same way in new dishes on the 12th and the 18th days. ② Group 2: For the first 3 days, the nerve fragments were fed with BS with 10% FBS. This medium was changed to BS medium on the third day. The nerve fragments were moved to another dish on day 6 and the BS medium was changed to BS with 25 mL/L FBS. Hereafter the culture method was the same as

  20. Stalked protozoa identification by image analysis and multivariable statistical techniques

    OpenAIRE

    Amaral, A.L.; Ginoris, Y. P.; Nicolau, Ana; M.A.Z. Coelho; Ferreira, E. C.

    2008-01-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups and several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determinin...

  1. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real Integral-Field Spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of two. Our analysis reveals that the algorithm prioritizes conservation of all the statistically-significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BATMAN is not to be used as a `black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially-resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
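The iterative merging criterion at the heart of the algorithm can be illustrated on a 1-D strip of bins rather than a full datacube: neighbouring elements are merged while their inverse-variance-weighted means agree within the combined errors. The function below is my own greedy simplification of that idea, not the BATMAN code:

```python
def merge_consistent(values, errors, z=1.0):
    """Greedy 1-D analogue of statistically-consistent merging: adjacent
    segments are merged while their weighted means differ by less than
    z combined standard errors. Returns (start, end, weighted_mean) tuples."""
    # Each segment: (start index, end index, weighted mean, total weight).
    segs = [(i, i, v, 1.0 / e ** 2) for i, (v, e) in enumerate(zip(values, errors))]
    merged = True
    while merged:
        merged = False
        out = []
        for seg in segs:
            if out:
                s0, e0, m0, w0 = out[-1]
                s1, e1, m1, w1 = seg
                # Consistent if the difference is within z sigma of the
                # combined standard error of the two means.
                if abs(m0 - m1) <= z * (1.0 / w0 + 1.0 / w1) ** 0.5:
                    w = w0 + w1
                    out[-1] = (s0, e1, (m0 * w0 + m1 * w1) / w, w)
                    merged = True
                    continue
            out.append(seg)
        segs = out
    return [(s, e, m) for s, e, m, _ in segs]
```

As in the paper's caveats, a gradual spatial gradient would keep passing the pairwise test and be absorbed into one segment, which is why the authors stress characterization rather than blind noise reduction.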

  2. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator", developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program when analyzing nanoparticles, and at the same time compares it to more conventional nanoparticle analysis techniques. The techniques we concentrate on here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information on the particles than the manual technique. However, particle shapes that are very different from spherical proved problematic for the novel program as well. Compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the average data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample.
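The agglomerate problem described above begins at the connected-component labelling step: touching pixels necessarily receive one label, so an agglomerate counts as a single particle unless it is split by further processing (e.g. watershed). A minimal 4-connected labelling sketch (generic, not the Nanoannotator algorithm):

```python
from collections import deque

def label_particles(mask):
    """4-connected component labelling of a binary image given as a list
    of lists of 0/1. Returns (component count, label image). Touching
    foreground pixels share a label, so agglomerates appear as one blob."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                current += 1
                queue = deque([(y, x)])
                labels[y][x] = current
                while queue:  # breadth-first flood fill of one component
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return current, labels
```

Per-label pixel counts then give area estimates; separating touching particles requires additional shape-based splitting, which is exactly where the paper reports the new program improving on traditional tools.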

  3. Techniques of DNA methylation analysis with nutritional applications.

    Science.gov (United States)

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from single nucleotide to global measurement depending on the study goal and scope. In addition, this study highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments concerning epigenetic technologies are showing promising results of DNA methylation levels at a single-base resolution and provide the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profile from microarray or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the best method fitting for their nutritional research interests.
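Array-based methylation pipelines of the kind reviewed above typically reduce each CpG site to a beta value computed from the methylated and unmethylated signal intensities. A sketch follows; the offset of 100 follows common Illumina array practice but is stated here as an assumption, not something taken from this review:

```python
def beta_value(meth, unmeth, offset=100):
    """Methylation level at a CpG site from methylated (M) and
    unmethylated (U) signal intensities:
        beta = M / (M + U + offset)
    which lies in [0, 1); the offset stabilises the ratio when both
    intensities are near zero."""
    return meth / (meth + unmeth + offset)
```

Beta values near 0 indicate an unmethylated site and values near 1 a fully methylated one; differential-methylation analyses then compare these values between groups of samples.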

  4. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and to the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and to how complexity changes as a function of different task constraints.
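The maximal Lyapunov exponent mentioned above quantifies the average exponential divergence of nearby trajectories. For movement data it is estimated from a reconstructed state space (e.g. with Rosenstein's method), but the underlying idea is easiest to see on a map whose exponent is known analytically; the logistic map sketch below is a generic illustration, not a gait-analysis procedure:

```python
import math

def lyapunov_logistic(r, x0=0.2, n=100000, discard=1000):
    """Maximal Lyapunov exponent of the logistic map x -> r*x*(1-x),
    computed as the orbit average of log|f'(x)| = log|r*(1-2x)|.
    A positive value means nearby trajectories diverge exponentially
    (chaos); a negative value means they converge onto a stable orbit."""
    x, total, count = x0, 0.0, 0
    for i in range(n):
        x = r * x * (1.0 - x)
        if i >= discard:  # skip the transient before averaging
            total += math.log(abs(r * (1.0 - 2.0 * x)))
            count += 1
    return total / count
```

At r = 4 the exponent is known to equal ln 2 ≈ 0.693 (chaotic), while at r = 3.2 the orbit settles onto a stable period-2 cycle and the exponent is negative; the same sign convention underlies the "local instability" interpretation in gait studies.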

  5. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.
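A hedged sketch of how such chained cuts might look in code (this is not the collaboration's pipeline; the event format, thresholds, ROI bounds, and veto window below are all invented for illustration):

```python
# Generic illustration of chaining background-rejection cuts over a list of
# hypothetical event records: a coincidence (multiplicity) cut, a
# pulse-shape cut, and a time-correlation veto after a tagged decay.

ROI = (2037.0, 2041.0)   # keV window around the 76Ge Q-value (2039 keV)

def passes_cuts(event, veto_until):
    if event["n_detectors_hit"] > 1:       # coincidence cut: 0vbb is single-site
        return False
    if event["avse"] < 0:                  # pulse-shape (A/E-style) parameter, assumed normalized
        return False
    if event["t"] < veto_until.get(event["channel"], -1.0):
        return False                       # time-correlation veto still active
    return True

def select_roi(events, veto_window=1.0):
    veto_until, selected = {}, []
    for ev in sorted(events, key=lambda e: e["t"]):
        if ev.get("tags_decay"):           # e.g. an alpha starting a decay chain
            veto_until[ev["channel"]] = ev["t"] + veto_window
            continue
        if passes_cuts(ev, veto_until) and ROI[0] <= ev["energy"] <= ROI[1]:
            selected.append(ev)
    return selected

events = [
    {"t": 0.0, "channel": 1, "energy": 2039.1, "n_detectors_hit": 1, "avse": 0.5},
    {"t": 1.0, "channel": 1, "energy": 2038.5, "n_detectors_hit": 2, "avse": 0.5},
    {"t": 2.0, "channel": 2, "energy": 2040.0, "n_detectors_hit": 1, "avse": -0.2},
    {"t": 3.0, "channel": 3, "energy": 5300.0, "n_detectors_hit": 1, "avse": 0.5,
     "tags_decay": True},
    {"t": 3.5, "channel": 3, "energy": 2039.0, "n_detectors_hit": 1, "avse": 0.5},
    {"t": 5.0, "channel": 3, "energy": 2039.0, "n_detectors_hit": 1, "avse": 0.5},
]
print(len(select_roi(events)))   # 2 events survive all cuts
```

Only the first event (clean single-site hit) and the last (after the veto window expires) survive; the others fail the coincidence, pulse-shape, or veto cuts.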

  6. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  7. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H. [Center for Experimental Nuclear Physics and Astrophysics, and Department of Physics, University of Washington, Seattle, WA (United States); Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L. [Pacific Northwest National Laboratory, Richland, WA (United States); Avignone, F. T. [Department of Physics and Astronomy, University of South Carolina, Columbia, SC (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States); Baldenegro-Barrera, C. X.; Bertrand, F. E. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR’s germanium detectors allows for significant reduction of gamma background.

  8. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  9. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this openness increases security risk. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction for making the Android operating system more secure.

  10. Adolescent baseball pitching technique: lower extremity biomechanical analysis.

    Science.gov (United States)

    Milewski, Matthew D; Õunpuu, Sylvia; Solomito, Matthew; Westwell, Melany; Nissen, Carl W

    2012-11-01

    Documentation of the lower extremity motion patterns of adolescent pitchers is an important part of understanding the pitching motion and the implication of lower extremity technique on upper extremity loads, injury and performance. The purpose of this study was to take the initial step in this process by documenting the biomechanics of the lower extremities during the pitching cycle in adolescent pitchers and to compare these findings with the published data for older pitchers. Three-dimensional motion analysis using a comprehensive lower extremity model was used to evaluate the fast ball pitch technique in adolescent pitchers. Thirty-two pitchers with a mean age of 12.4 years (range 10.5-14.7 years) and at least 2 years of experience were included in this study. The pitchers showed a mean of 49 ± 12° of knee flexion of the lead leg at foot contact. They tended to maintain this position through ball release, and then extended their knee during the follow through phase (ball release to maximal internal glenohumeral rotation). The lead leg hip rapidly progressed into adduction and flexion during the arm cocking phase with a range of motion of 40 ± 10° adduction and 30 ± 13° flexion. The lead hip mean peak adduction velocity was 434 ± 83°/s and flexion velocity was 456 ± 156°/s. Simultaneously, the trailing leg hip rapidly extended, approaching a mean peak extension of -8 ± 5° at 39% of the pitch cycle, which is close to passive range of motion constraints. Peak hip abduction of the trailing leg at foot contact was -31 ± 12°, which also approached passive range of motion constraints. Differences and similarities were also noted between the lower extremity kinematics of the adolescent and adult pitchers; however, a more comprehensive analysis using similar methods is needed for a complete comparison.

  11. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. 
If inferences are to be made concerning food texture from acoustical measures of mastication

  12. Measurement uncertainty on subsurface defects detection using active infrared thermographic technique

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yoon Jae; Kim [Kongju National University, Cheonan (Korea, Republic of); Choi, Won Jae [Center for Safety Measurements, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2015-10-15

    Active infrared thermography methods are known to possess better capabilities for detecting defects in materials than conventional passive thermal infrared imaging techniques. However, the reliability of the technique has been under scrutiny. This paper proposes the lock-in thermography technique for the detection of artificial subsurface defects and the estimation of their size and depth, together with a measurement of the associated uncertainty.
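The demodulation step at the heart of lock-in thermography can be sketched as follows: each pixel's temperature signal is correlated with sine/cosine references at the excitation frequency to recover amplitude and phase, the phase being the depth-sensitive quantity. (The signal and all parameters below are synthetic and invented for illustration.)

```python
import math

# Lock-in demodulation sketch: correlate a pixel's signal with quadrature
# references at the modulation frequency to recover amplitude and phase.

def lockin(signal, fs, f_mod):
    n = len(signal)
    i_acc = q_acc = 0.0
    for k, s in enumerate(signal):
        w = 2 * math.pi * f_mod * k / fs
        i_acc += s * math.cos(w)
        q_acc += s * math.sin(w)
    i, q = 2 * i_acc / n, 2 * q_acc / n
    return math.hypot(i, q), math.atan2(q, i)   # amplitude, phase (rad)

# Synthetic pixel signal: DC offset + modulated response with a known
# 0.8 rad phase lag; 250 samples = exactly 5 modulation periods.
fs, f_mod = 100.0, 2.0      # Hz
sig = [5.0 + 1.5 * math.cos(2 * math.pi * f_mod * k / fs - 0.8)
       for k in range(250)]
amp, phase = lockin(sig, fs, f_mod)
print(round(amp, 3), round(phase, 3))   # recovers ~1.5 and ~0.8
```

Averaging over whole modulation periods cancels the DC offset, so only the component at the excitation frequency survives.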

  13. A comparison between active and passive techniques for measurements of radon emanation factors

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Coto, I. [Dept. Fisica Aplicada, University of Huelva, Huelva (Spain)], E-mail: Israel.lopez@dfa.uhu.es; Mas, J.L. [Dept. de Fisica Aplicada I, E.U.P., University of Seville, Seville (Spain); San Miguel, E.G.; Bolivar, J.P. [Dept. Fisica Aplicada, University of Huelva, Huelva (Spain); Sengupta, D. [Department of Geology and Geophysics, I.I.T. Kharagpur, West Bengal (India)

    2009-05-15

    Some radon-related parameters have been determined through two different techniques (passive and active) in soil and phosphogypsum samples. Emanation factors determined through these techniques show good agreement for soil samples, while large discrepancies appear for phosphogypsum samples. In this paper, these discrepancies are analyzed and explained by taking into account uncontrolled radon leakage in the passive technique.

  14. Assessment of the associated particle prompt gamma neutron activation technique for total body nitrogen measurement in vivo

    Science.gov (United States)

    Total Body Nitrogen (TBN) can be used to estimate Total Body Protein (TBP), an important body composition component at the molecular level. A system using the associated particle technique in conjunction with prompt gamma neutron activation analysis has been developed for the measurement of TBN in ...

  15. Characterization of Phoenician pottery from Mothia by neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cesana, A.; Terrani, M. (Politecnico di Milano (Italy). Centro Studi Nucleari E. Fermi); Ciasca, A. (Rome Univ. (Italy)); Cuomo di Caprio, N. (Venice Univ. (Italy)); Tusa, V. (Soprintendenza Archeologica della Sicilia Occidentale, Palermo (Italy))

    1983-01-01

    The concentration of 7 elements (Na, Al, Mg, Ti, Ca, V, Mn) was determined by neutron activation analysis in 35 samples of pottery and 14 samples of clay. The samples were collected in ancient Mothia (Sicily) and in its neighbourhoods. Cluster analysis of the data showed that most of the samples are homogeneous and confirmed the archaeological evidence that they are mostly local ware. The detailed results of the analyses are reported and the technique used for cluster analysis is described.
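The grouping step can be illustrated with a minimal single-linkage clustering sketch over standardized concentrations (the data and threshold below are invented, and the paper's exact cluster-analysis procedure may differ):

```python
import math

# Sketch of grouping pottery samples by element concentrations:
# standardize each element column, then single-linkage clustering with a
# distance threshold. Data are invented, not the paper's measurements.

def standardize(rows):
    cols = list(zip(*rows))
    means = [sum(c) / len(c) for c in cols]
    sds = [math.sqrt(sum((x - m) ** 2 for x in c) / len(c)) or 1.0
           for c, m in zip(cols, means)]
    return [[(x - m) / s for x, m, s in zip(r, means, sds)] for r in rows]

def cluster(rows, threshold=1.5):
    z = standardize(rows)
    groups = [[i] for i in range(len(z))]
    def dmin(a, b):   # single-linkage distance between two groups
        return min(math.dist(z[i], z[j]) for i in a for j in b)
    merged = True
    while merged:
        merged = False
        for a in range(len(groups)):
            for b in range(a + 1, len(groups)):
                if dmin(groups[a], groups[b]) < threshold:
                    groups[a] += groups.pop(b)
                    merged = True
                    break
            if merged:
                break
    return groups

# Rows: (Na, Al, Mn) concentrations for 5 sherds; the last is an outlier.
samples = [[1.0, 8.0, 0.10], [1.1, 8.2, 0.11], [0.9, 7.9, 0.09],
           [1.0, 8.1, 0.10], [3.0, 3.0, 0.40]]
print(cluster(samples))   # the four similar sherds group; the outlier stays alone
```

Standardizing first prevents elements measured in large units from dominating the distance.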

  16. Ionospheric Behaviour Analysis over Thailand Using Radio Occultation Technique.

    Directory of Open Access Journals (Sweden)

    Ahmed Wasiu Akande

    2015-11-01

    Advances in space and atmospheric science call for accurate measurement techniques; hence the use of the radio occultation technique to investigate the electron density and Total Electron Content (TEC) present in the equatorial region, particularly over Thailand. In this research, radio occultation data obtained from UCAR/CDAAC were used to observe daily, monthly, seasonal, and whole-year 2013 ionospheric TEC and electron density variation due to changes and instability of solar activity over time. It was observed that TEC was high (i.e., the ionosphere was more disturbed) in May and spread over a wide range of altitudes, and that the summer season had the highest TEC value for the year 2013, meaning that GNSS measurements were more prone to error during this period. It was noted that ionospheric variations or fluctuations were greatest between 200 km and 450 km altitude. The results of the study show that ionospheric perturbation effects or irregularities depend on season and solar activity.

  17. Techniques for active embodiment of participants in virtual environments

    Energy Technology Data Exchange (ETDEWEB)

    Hightower, R.; Stansfield, S.

    1996-03-01

    This paper presents preliminary work in the development of an avatar driver. An avatar is the graphical embodiment of a user in a virtual world. In applications such as small team, close quarters training and mission planning and rehearsal, it is important that the user's avatar reproduce his or her motions naturally and with high fidelity. This paper presents a set of special purpose algorithms for driving the motion of the avatar with minimal information about the posture and position of the user. These algorithms utilize information about natural human motion and posture to produce solutions quickly and accurately without the need for complex general-purpose kinematics algorithms. Several examples illustrating the successful applications of these techniques are included.
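The paper's algorithms themselves are not reproduced here, but the general idea of replacing an iterative general-purpose kinematics solver with a closed-form special case can be sketched with a planar two-link limb (link lengths and targets are illustrative only):

```python
import math

# Closed-form two-link inverse kinematics in the plane (law of cosines):
# the kind of special-purpose shortcut that avoids iterative IK solvers.

def two_link_ik(x, y, l1, l2):
    """Return (shoulder, elbow) angles reaching target (x, y), or None."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None                       # target out of reach
    elbow = math.acos(c2)                 # elbow-down branch
    k1, k2 = l1 + l2 * math.cos(elbow), l2 * math.sin(elbow)
    shoulder = math.atan2(y, x) - math.atan2(k2, k1)
    return shoulder, elbow

def forward(shoulder, elbow, l1, l2):
    """Forward kinematics, used here to verify the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

s, e = two_link_ik(0.9, 0.5, 0.6, 0.5)
print(forward(s, e, 0.6, 0.5))           # recovers ~(0.9, 0.5)
print(two_link_ik(2.0, 0.0, 0.6, 0.5))   # None: beyond total arm length
```

Picking one branch (elbow-down) by convention is an example of using knowledge of natural posture to disambiguate the solution instead of searching.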

  18. Efficient geometric rectification techniques for spectral analysis algorithm

    Science.gov (United States)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near-real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented on iso-range and iso-Doppler lines, a curved grid format; this phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.
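One of the two one-dimensional resampling passes can be sketched with plain linear interpolation (the coordinate mappings below are invented placeholders, not the actual range-Doppler geometry; the second pass would apply the same routine along the other axis):

```python
# Sketch of separable geometric rectification: resample each row of a 2D
# image from its own (curved) sample coordinates onto a common regular
# grid using 1D linear interpolation. Pass 2 (not shown) applies the same
# routine column-wise.

def interp1(xs, ys, x):
    """Piecewise-linear interpolation; xs ascending, x within range."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    raise ValueError("x outside sample support")

def rectify(image, src_coords_per_row, out_coords):
    """Straighten each row from its own coordinates onto out_coords."""
    return [[interp1(src, row, x) for x in out_coords]
            for row, src in zip(image, src_coords_per_row)]

# Two rows sampled on slightly shifted, 'fan-shaped' coordinates:
image = [[0.0, 1.0, 4.0, 9.0], [0.0, 1.0, 4.0, 9.0]]
coords = [[0.0, 1.0, 2.0, 3.0], [0.2, 1.2, 2.2, 3.2]]
out = [1.0, 2.0, 3.0]
print(rectify(image, coords, out))
```

After the pass, both rows are expressed on the same `out` grid, so multiple looks can be overlaid directly.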

  19. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  20. Pattern recognition software and techniques for biological image analysis.

    Science.gov (United States)

    Shamir, Lior; Delaney, John D; Orlov, Nikita; Eckley, D Mark; Goldberg, Ilya G

    2010-11-24

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  1. The Effectiveness of Active and Traditional Teaching Techniques in the Orthopedic Assessment Laboratory

    Science.gov (United States)

    Nottingham, Sara; Verscheure, Susan

    2010-01-01

    Active learning is a teaching methodology with a focus on student-centered learning that engages students in the educational process. This study implemented active learning techniques in an orthopedic assessment laboratory and examined the effects of these teaching techniques. Mean scores from written exams, practical exams, and final course evaluations…

  3. Successful Application of Active Learning Techniques to Introductory Microbiology

    Directory of Open Access Journals (Sweden)

    Elizabeth A. Hoffman

    2009-12-01

    While the traditional lecture format may be a successful way to teach microbiology to both medical and nursing students, it was not an effective means of learning for many prenursing and preprofessional students enrolled in either of the introductory microbiology courses at Ashland Community College, an open enrollment institution. The structure of both Medical Microbiology and Principles of Microbiology was redesigned to allow students to address the material in an active manner. Daily quizzes, student group discussions, scrapbooks, lab project presentations and papers, and extra credit projects were all added in order to allow students maximum exposure to the course material in a manner compatible with various methods of learning. Student knowledge, course evaluations, and student success rates have all improved with the active learning format.

  4. Rotors on Active Magnetic Bearings: Modeling and Control Techniques

    OpenAIRE

    Tonoli, Andrea; Bonfitto, Angelo; Silvagni, Mario; Suarez, Lester D.

    2012-01-01

    In the last decades the deeper and more detailed understanding of rotating machinery dynamic behavior facilitated the study and the design of several devices aiming at friction reduction, vibration damping and control, rotational speed increase and mechanical design optimization. Among these devices a promising technology is represented by active magnetic actuators which found a great spread in rotordynamics and in high precision applications due to (a) the absence of all fatigue and tribolog...

  5. New trends in the development of "active correlations" technique

    Science.gov (United States)

    Tsyganov, Yu. S.

    2016-09-01

    With heavy-ion beams reaching extremely high intensities, new requirements for the detection system of the Dubna Gas-Filled Recoil Separator (DGFRS) will certainly be set. One of the challenges is how to apply the "active correlations" method [1-6] to suppress beam-associated background products without significant losses in the efficiency of the whole long-term experiment. Different scenarios and equations for the development of the method according to this requirement are considered in the present paper.
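The basic recoil-alpha correlation logic behind the method can be sketched as follows (energy windows, correlation time, and the event format are invented for illustration; the DGFRS implementation is considerably more involved):

```python
# Generic sketch of the "active correlations" idea: when an evaporation-
# residue-like signal is followed by an alpha-like signal in the same
# detector pixel within a correlation time, trigger a beam-off pause so
# the rest of the decay chain can be observed background-free.

ER_ENERGY = (5.0, 18.0)      # MeV, recoil-like window (assumed)
ALPHA_ENERGY = (8.5, 11.5)   # MeV, alpha-like window (assumed)
T_CORR = 10.0                # s, correlation time (assumed)

def beam_off_triggers(events):
    """events: (time_s, pixel, energy_MeV, beam_on). Return trigger times."""
    last_recoil = {}         # pixel -> time of most recent recoil candidate
    triggers = []
    for t, pixel, e, beam_on in events:
        # Alpha window checked first since it overlaps the broad recoil window.
        if ALPHA_ENERGY[0] <= e <= ALPHA_ENERGY[1]:
            t0 = last_recoil.get(pixel)
            if t0 is not None and t - t0 <= T_CORR:
                triggers.append(t)          # recoil-alpha chain: stop the beam
                del last_recoil[pixel]
        elif beam_on and ER_ENERGY[0] <= e <= ER_ENERGY[1]:
            last_recoil[pixel] = t
    return triggers

events = [(0.0, 7, 12.0, True),   # recoil candidate in pixel 7
          (4.2, 7, 9.1, True),    # alpha in same pixel within 10 s -> trigger
          (50.0, 3, 9.0, True)]   # alpha with no preceding recoil -> ignored
print(beam_off_triggers(events))  # [4.2]
```

Requiring the correlation in the same pixel is what suppresses beam-associated background: random coincidences across the whole detector are far more likely than within one pixel.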

  6. Aerial monitoring in active mud volcano by UAV technique

    Science.gov (United States)

    Pisciotta, Antonino; Capasso, Giorgio; Madonia, Paolo

    2016-04-01

    UAV photogrammetry opens various new applications in the close-range domain, combining aerial and terrestrial photogrammetry, but also introduces low-cost alternatives to classical manned aerial photogrammetry. Between 2014 and 2015, three aerial surveys were carried out. Using a quadrotor drone equipped with a compact camera, it was possible to generate high-resolution elevation models and orthoimages of the "Salinelle", an active mud volcano area located in the territory of Paternò (southern Italy). The main risks are related to the damage produced by paroxysmal events. Mud volcanoes show different cyclic phases of activity, including catastrophic events and periods of relative quiescence characterized by moderate activity. The ejected material is often a mud slurry of fine solids suspended in liquids, which may include water and hydrocarbon fluids; the bulk of the released gases is carbon dioxide, with some methane and nitrogen. The vents are usually pond-shaped and of variable dimension (from centimeters to meters in diameter). The scope of the presented work is the performance evaluation of a UAV system built to rapidly and autonomously acquire mobile three-dimensional (3D) mapping data in a volcanic monitoring scenario.

  7. Analysis of some herbal plants from India used in the control of diabetes mellitus by NAA and AAS techniques

    Energy Technology Data Exchange (ETDEWEB)

    Rajurkar, N.S.; Pardeshi, B.M. [Pune Univ., Chemistry Dept., Pune (India)

    1997-08-01

    Elemental analysis of some herbal plants used in the control of diabetes has been done by the techniques of Neutron Activation Analysis (NAA) and Atomic Absorption Spectroscopy (AAS). The elements Mn, Na, K, Cl, Al, Cu, Co, Pb, Ni, Cr, Cd, Fe, Ca, Zn and Hg are found to be present in different plants in various proportions. (Author).

  8. Emerging techniques for soil analysis via mid-infrared spectroscopy

    Science.gov (United States)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, and in particular: 1. Attenuated total reflectance (ATR) Attenuated total reflectance is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range as well as the absorbance of some soil constituents (e.g., calcium carbonate) interfere with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species. 2. Photo-acoustic spectroscopy Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are

  9. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  10. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    Science.gov (United States)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational, and seismic activity. One prominent approach to drift measurement is instrumentation-based, e.g. using an ionosonde. Drift estimation with an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices, and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even over Europe, is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to ionosondes, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real electron density profile is obtained on which ray tracing can be performed. These profiles can be constructed periodically, with a period as low as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of concern. We test our technique by comparing the results to drift measurements taken with the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
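The Doppler relation underlying the estimate, f_d = -(f0/c)·dP/dt, can be applied directly to path lengths computed from two consecutive snapshots (all numbers below are invented for illustration):

```python
# Doppler shift from the rate of change of the propagation path length:
#     f_d = -(f0 / c) * dP/dt
# dP/dt is approximated by finite differences between path lengths
# obtained by ray tracing on two consecutive electron-density snapshots.

C = 299_792_458.0            # speed of light, m/s

def doppler_from_paths(p1_m, p2_m, dt_s, f0_hz):
    """Doppler shift (Hz) between two snapshots dt_s apart."""
    dp_dt = (p2_m - p1_m) / dt_s
    return -(f0_hz / C) * dp_dt

# 30 s snapshot cadence, 10 MHz sounding frequency, path shortens by 150 m:
fd = doppler_from_paths(600_000.0, 599_850.0, 30.0, 10e6)
print(round(fd, 3))          # ~0.167 Hz; positive: the path is shortening
```

A shortening path (reflector moving toward the link) gives a positive Doppler shift, consistent with the sign convention in the formula.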

  11. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need optical properties different from those of the substrate. UV absorption detection allows nearly universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening the local distribution of specific biomolecules in a tissue or in screening multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto the CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on an enzyme spot gives the reaction rate. The same microarray can be used many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
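    The kinetic readout described above — reaction rate from the change in spot intensity over time — reduces to a linear fit in the simplest case. A minimal sketch with hypothetical intensity readings:

```python
import numpy as np

# Hypothetical intensity readings from one enzyme spot (arbitrary units)
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0])            # seconds
intensity = np.array([100.0, 108.9, 120.1, 130.2, 139.8])

# Initial reaction rate ~ slope of intensity vs. time (least-squares linear fit)
slope, intercept = np.polyfit(t, intensity, 1)
```

    In practice one would fit only the early, linear portion of the progress curve and calibrate the intensity scale to product concentration.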

  12. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: A technique that helps management reduce costs and improve quality is 'lean supply chain management', which focuses on the elimination of all waste at every stage of the supply chain and is derived from 'agile production'. This research aims to assess and rank the suppliers of an auto industry based upon the concept of 'production leanness'. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We examined the literature on leanness and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA) technique.

    AFRIKAANSE OPSOMMING (translated): Lean supply chain management is a technique that enables management to reduce costs and improve quality. It focuses on reducing waste at every stage of the supply chain and is derived from agile production. This research aims to assess suppliers in an auto industry using the concept of production leanness. The research focuses on suppliers of a company called Touse-Omron Naein. A literature study on leanness led to the classification of criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the PCA technique.
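    The ranking step can be sketched in a few lines of numpy: standardize the criterion scores, extract principal components, and weight the component scores by explained variance to form a composite index. This is a generic PCA-ranking sketch on hypothetical data, not the authors' exact procedure (a real ranking must also check component signs and interpretability).

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical scores of 8 suppliers on 5 leanness criteria (higher = leaner)
X = rng.uniform(1, 5, size=(8, 5))

# Standardize, then obtain principal components via SVD
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)        # variance share of each component
scores = Z @ Vt.T                      # supplier coordinates on the PCs

# Composite leanness index: PC scores weighted by explained variance
leanness = scores @ explained
ranking = np.argsort(leanness)[::-1]   # highest composite score first
```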

  13. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISARs), which measure target motion to high precision, have been an important diagnostic in shockwave physics for many years, but until recently this diagnostic has been limited to measuring motion at points or lines across a target. If a sufficiently fast movie-camera technology existed, it could be placed behind a traditional VISAR optical system to record a 2d image versus time. Since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Because we use pulsed illumination with a coherence length shorter than the VISAR interferometer delay (~0.1 ns), we must use the white-light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating and detecting) with nearly identical delays are used in series, one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.
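    The fringe-to-velocity conversion shared with a traditional VISAR can be sketched as follows; the wavelength, interferometer delay, and fringe count are hypothetical, and the dispersion correction term (1 + δ) is neglected for simplicity.

```python
# Simplified VISAR relation: velocity per fringe (VPF) = wavelength / (2 * tau),
# neglecting the dispersion correction term (1 + delta).
def visar_velocity(fringe_shift, wavelength_m, tau_s):
    """Target velocity (m/s) from an observed fringe shift."""
    vpf = wavelength_m / (2.0 * tau_s)   # m/s per fringe
    return fringe_shift * vpf

# Hypothetical values: 532 nm illumination, 0.1 ns interferometer delay,
# 1.5 fringes observed at one pixel of the 2d image.
v = visar_velocity(1.5, 532e-9, 0.1e-9)
```

    Applying this conversion pixel by pixel to the extracted fringe-shift map is what turns the snapshot interferogram into a velocity map.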

  14. Cooperative Experimental System Development - cooperative techniques beyond initial design and analysis

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1995-01-01

    …be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques—including the Scandinavian Cooperative Design (CD) approaches—seldom go beyond the early analysis/design activities of development projects. In contrast, the CESD approach is characterized by its focus on: active user involvement throughout the entire development process; prototyping experiments closely coupled to work-situations and use-scenarios; transforming results from early cooperative analysis/design to targeted object oriented design, specification, and realisation; and design for tailorability. The emerging CESD approach is based on several years of experience in applying cooperative analysis and design techniques in projects developing general, tailorable software products.

  15. Smart actuators: a novel technique for active damping

    Science.gov (United States)

    Muth, Michael; Moldovan, Klaus; Goetz, Bernt

    1995-05-01

    Sensors are important components of any automatic process. Their function is to measure physical variables and thus to enable automatic actions in a technical process, for example in a manufacturing sequence or a measurement. When selecting a sensor for a process, it is often overlooked that the actuators used in the process also have sensory properties. The reactions of actuators to the state of a process make it possible to extract relevant process information from the actuators themselves. By using the sensory properties of actuators, the cost of additional sensors can be saved. Even more important, under some circumstances it may not even be possible to place a dedicated sensor directly at the location of interest; in that case the information about the physical variable is only accessible by analyzing the return signal of the actuator. An example of such a smart actuator, combining active and sensory properties, is demonstrated in a simple experiment with a steel ball supported as a pendulum. The steel ball can be pushed off, and on swinging back it can be caught in a single pass without any bounce. The actuator uses the piezoelectric effect, which shows the underlying principle most clearly: application of the reversibility of physical effects. In this case mechanical energy can either be produced or absorbed. This experiment is meant as a demonstration model for students. It is also used for preliminary investigations in developing a fast, actively damped tipping mechanism (optical scanner).

  16. Experimental techniques for screening of antiosteoporotic activity in postmenopausal osteoporosis.

    Science.gov (United States)

    Satpathy, Swaha; Patra, Arjun; Ahirwar, Bharti

    2015-12-01

    Postmenopausal osteoporosis, a silent epidemic, has become a major health hazard, afflicting about 50% of postmenopausal women worldwide, and is thought to be one of the diseases with the highest incidence in the elderly. It is a chronic, progressive condition associated with micro-architectural deterioration of bone tissue that results in low bone mass and decreased bone strength, predisposing to an increased risk of fracture. Women are more likely to develop osteoporosis than men because of the reduction in estrogen during menopause, which leads to a decline in bone formation and an increase in bone resorption activity. Estrogen is able to suppress the production of proinflammatory cytokines such as interleukin (IL)-1, IL-6, IL-7 and tumor necrosis factor (TNF-α), which is why these cytokines are elevated in postmenopausal women. In this review article we have attempted to collate the methods and parameters most frequently used for screening of antiosteoporotic activity in postmenopausal osteoporosis. Among these, the ovariectomized animal model is the most appropriate for studying the efficacy of different drugs in preventing bone loss in postmenopausal osteoporosis.

  17. Differences in Pedaling Technique in Cycling: A Cluster Analysis.

    Science.gov (United States)

    Lanferdini, Fábio J; Bini, Rodrigo R; Figueiredo, Pedro; Diefenthaeler, Fernando; Mota, Carlos B; Arndt, Anton; Vaz, Marco A

    2016-10-01

    To employ cluster analysis to assess if cyclists would opt for different strategies in terms of neuromuscular patterns when pedaling at the power output of their second ventilatory threshold (POVT2) compared with cycling at their maximal power output (POMAX). Twenty athletes performed an incremental cycling test to determine their power output (POMAX and POVT2; first session), and pedal forces, muscle activation, muscle-tendon unit length, and vastus lateralis architecture (fascicle length, pennation angle, and muscle thickness) were recorded (second session) in POMAX and POVT2. Athletes were assigned to 2 clusters based on the behavior of outcome variables at POVT2 and POMAX using cluster analysis. Clusters 1 (n = 14) and 2 (n = 6) showed similar power output and oxygen uptake. Cluster 1 presented larger increases in pedal force and knee power than cluster 2, without differences for the index of effectiveness. Cluster 1 presented less variation in knee angle, muscle-tendon unit length, pennation angle, and tendon length than cluster 2. However, clusters 1 and 2 showed similar muscle thickness, fascicle length, and muscle activation. When cycling at POVT2 vs POMAX, cyclists could opt for keeping a constant knee power and pedal-force production, associated with an increase in tendon excursion and a constant fascicle length. Increases in power output lead to greater variations in knee angle, muscle-tendon unit length, tendon length, and pennation angle of vastus lateralis for a similar knee-extensor activation and smaller pedal-force changes in cyclists from cluster 2 than in cluster 1.
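    Assigning athletes to clusters from outcome variables, as done here, is typically a k-means-style procedure. A minimal numpy sketch on hypothetical two-feature data (the study's actual variables and clustering algorithm may differ):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal k-means: returns (labels, centroids)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids as cluster means (keep old centroid if empty)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Hypothetical standardized features per athlete:
# [pedal-force change, knee-angle variation], 14 + 6 cyclists as in the study.
rng = np.random.default_rng(1)
group_a = rng.normal([2.0, -2.0], 0.3, size=(14, 2))
group_b = rng.normal([-2.0, 2.0], 0.3, size=(6, 2))
X = np.vstack([group_a, group_b])
labels, centroids = kmeans(X, k=2)
```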

  18. Study of organohalogens in yogurt and apple by neutron activation analysis and related techniques

    Institute of Scientific and Technical Information of China (English)

    张鸿; 柴之芳; 孙慧斌

    2008-01-01

    Twenty brands of Chinese commercial yogurt specimens and nine different kinds of apple samples, collected randomly from supermarkets in Beijing and Shenzhen, China, were analyzed by instrumental neutron activation analysis (INAA) combined with gas chromatography (GC) and chemical separation methods for total halogens, extractable organohalogens (EOX), extractable persistent organohalogens (EPOX) and identified organochlorines. The INAA detection limits are 50 ng, 8 ng and 3.5 ng for Cl, Br and I, respectively. The extractable organochlorines (EOCl) accounted for 0.005% to 0.043% of the total chlorine in yogurt and 1.6% to 5.1% in apple. About 24% of the EOCl in yogurt remained undecomposed as extractable persistent organochlorines (EPOCl) after treatment with concentrated sulfuric acid, and 34% in apple. These results indicate that chlorine in the two selected foodstuffs mainly existed as inorganic species and non-extractable organochlorines, and that most EOCl in yogurt and apple were acid-labile or acid-soluble fractions. The ratios of identified organochlorines to total EPOCl were 0.7% to 13.1% and 0.5% to 6.2% in yogurt and apple samples, respectively, implying that a major portion of the EPOCl measured in yogurt and apple consists of unknown compounds that current gas chromatography techniques cannot yet identify.

  19. A comparison of approximation techniques for variance-based sensitivity analysis of biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-05-01

    Full Text Available Abstract Background Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we have introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species. Results We present four techniques, derivative approximation (DA, polynomial approximation (PA, Gauss-Hermite integration (GHI, and orthonormal Hermite approximation (OHA, for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. By using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in a slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner. 
It turns out that the computational cost of the
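    The quantity being approximated above — a variance-based (first-order) sensitivity index — and the contrast between Monte Carlo sampling and Gauss-Hermite integration can be sketched on a toy linear model with a known exact answer. This is a generic illustration, not the paper's DA/PA/GHI/OHA algorithms:

```python
import numpy as np

# First-order sensitivity of Y = a1*X1 + a2*X2 with X_i ~ N(0, 1):
# exact index S1 = a1^2 / (a1^2 + a2^2).
a1, a2 = 2.0, 1.0
model = lambda x1, x2: a1 * x1 + a2 * x2
s1_exact = a1**2 / (a1**2 + a2**2)   # = 0.8

# --- Monte Carlo ("pick-freeze") estimate: Cov(Y, Y') / Var(Y),
#     where Y' reuses X1 but resamples X2.
rng = np.random.default_rng(0)
n = 200_000
x1, x2, x2p = rng.standard_normal((3, n))
y, yp = model(x1, x2), model(x1, x2p)
s1_mc = np.cov(y, yp)[0, 1] / np.var(y, ddof=1)

# --- Gauss-Hermite integration: Var_{X1}( E_{X2}[Y | X1] ) by quadrature.
nodes, weights = np.polynomial.hermite.hermgauss(16)
z = np.sqrt(2.0) * nodes             # nodes rescaled for a standard normal
w = weights / np.sqrt(np.pi)         # weights normalized so they sum to 1
inner = np.array([np.sum(w * model(x1i, z)) for x1i in z])  # E[Y | X1 = x1i]
mean_inner = np.sum(w * inner)
s1_ghi = np.sum(w * (inner - mean_inner)**2) / (a1**2 + a2**2)
```

    For this linear model the quadrature result is exact, while the Monte Carlo estimate carries O(1/sqrt(n)) sampling error — the trade-off the paper quantifies for realistic biochemical networks.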

  20. Monolithic active pixel radiation detector with shielding techniques

    Energy Technology Data Exchange (ETDEWEB)

    Deptuch, Grzegorz W.

    2016-09-06

    A monolithic active pixel radiation detector including a method of fabricating thereof. The disclosed radiation detector can include a substrate comprising a silicon layer upon which electronics are configured. A plurality of channels can be formed on the silicon layer, wherein the plurality of channels are connected to sources of signals located in a bulk part of the substrate, and wherein the signals flow through electrically conducting vias established in an isolation oxide on the substrate. One or more nested wells can be configured from the substrate, wherein the nested wells assist in collecting charge carriers released in interaction with radiation and wherein the nested wells further separate the electronics from the sensing portion of the detector substrate. The detector can also be configured according to a thick SOA method of fabrication.

  2. Fractographic ceramic failure analysis using the replica technique

    Science.gov (United States)

    Scherrer, Susanne S.; Quinn, Janet B.; Quinn, George D.; Anselm Wiskott, H. W.

    2007-01-01

    Objectives To demonstrate the effectiveness of in vivo replicas of fractured ceramic surfaces for descriptive fractography as applied to the analysis of clinical failures. Methods The fracture surface topography of partially failed veneering ceramic of a Procera Alumina molar and an In Ceram Zirconia premolar were examined utilizing gold-coated epoxy poured replicas viewed using scanning electron microscopy. The replicas were inspected for fractographic features such as hackle, wake hackle, twist hackle, compression curl and arrest lines for determination of the direction of crack propagation and location of the origin. Results For both veneering ceramics, replicas provided an excellent reproduction of the fractured surfaces. Fine details including all characteristic fracture features produced by the interaction of the advancing crack with the material's microstructure could be recognized. The observed features are indicators of the local direction of crack propagation and were used to trace the crack's progression back to its initial starting zone (the origin). Drawbacks of replicas such as artifacts (air bubbles) or imperfections resulting from inadequate epoxy pouring were noted but not critical for the overall analysis of the fractured surfaces. Significance The replica technique proved to be easy to use and allowed an excellent reproduction of failed ceramic surfaces. It should be applied before attempting to remove any failed part remaining in situ as the fracture surface may be damaged during this procedure. These two case studies are intended as an introduction for the clinical researcher in using qualitative (descriptive) fractography as a tool for understanding fracture processes in brittle restorative materials and, secondarily, to draw conclusions as to possible design inadequacies in failed restorations. PMID:17270267

  3. Quantitative analysis of genomic element interactions by molecular colony technique.

    Science.gov (United States)

    Gavrilov, Alexey A; Chetverina, Helena V; Chermnykh, Elina S; Razin, Sergey V; Chetverin, Alexander B

    2014-03-01

    Distant genomic elements were found to interact within the folded eukaryotic genome. However, the experimental approach used (chromosome conformation capture, 3C) enables neither determination of the percentage of cells in which the interactions occur nor demonstration of simultaneous interaction of >2 genomic elements. Each of the above can be done using in-gel replication of interacting DNA segments, the technique reported here. Chromatin fragments released from formaldehyde-cross-linked cells by sodium dodecyl sulfate extraction and sonication are distributed in a polyacrylamide gel layer followed by amplification of selected test regions directly in the gel by multiplex polymerase chain reaction. The fragments that have been cross-linked and separate fragments give rise to multi- and monocomponent molecular colonies, respectively, which can be distinguished and counted. Using in-gel replication of interacting DNA segments, we demonstrate that in the material from mouse erythroid cells, the majority of fragments containing the promoters of active β-globin genes and their remote enhancers do not form complexes stable enough to survive sodium dodecyl sulfate extraction and sonication. This indicates that either these elements do not interact directly in the majority of cells at a given time moment, or the formed DNA-protein complex cannot be stabilized by formaldehyde cross-linking.

  4. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge about the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of funeral artifacts in 18 graves out of a total of 127 excavated. Although the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique using a Jobin Yvon Horiba XGT-5000 instrument, offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  5. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
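    The variance-reduction idea behind stratified source-sampling can be illustrated on a toy integration problem; this sketch is generic Monte Carlo stratification, not the eigenvalue methodology itself.

```python
import numpy as np

# Toy problem: estimate E[f(U)], U ~ Uniform(0, 1), f(u) = u^2 (exact value 1/3),
# with plain sampling vs. stratified sampling (one draw per stratum).
f = lambda u: u**2
n = 1000
rng = np.random.default_rng(0)

# Plain Monte Carlo: n independent uniform draws
plain = f(rng.random(n)).mean()

# Stratified: split [0, 1) into n equal strata and draw one point in each,
# guaranteeing the sample covers the domain evenly
edges = np.arange(n) / n
strat = f(edges + rng.random(n) / n).mean()

exact = 1.0 / 3.0
```

    The stratified estimate has error O(1/n^1.5) for smooth integrands versus O(1/sqrt(n)) for plain sampling; in the eigenvalue setting, stratifying the fission source plays the analogous role of preventing entire regions from being starved of source points.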

  6. Sentiment Analysis of Twitter tweets using supervised classification technique

    Directory of Open Access Journals (Sweden)

    Pranav Waykar

    2016-05-01

    Full Text Available Making use of social media to analyze the perceptions of the masses about a product, event or person has gained momentum in recent times. Out of a wide array of social networks, we chose Twitter for our analysis, as the opinions expressed there are concise and bear a distinctive polarity. Here, we collect the most recent tweets on the user's area of interest and analyze them, segregating the extracted tweets as positive, negative or neutral. We perform the classification in the following manner: we collect the tweets using the Twitter API; we then process the collected tweets to convert all letters to lowercase, eliminate special characters, etc., which makes the classification more efficient; finally, the processed tweets are classified using a supervised classification technique. We use a Naive Bayes classifier to segregate the tweets as positive, negative or neutral, training the classifier on a set of sample tweets. The percentage of tweets in each category is then computed and the result is represented graphically. The result can be used to gain insight into the views of Twitter users about the particular topic searched by the user. It can help corporate houses devise strategies based on the popularity of their product among the masses, and it may help consumers make informed choices based on the general sentiment expressed by Twitter users about a product.
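    The pipeline described — lowercase, strip special characters, then a Naive Bayes classifier trained on labelled samples — can be sketched without external libraries. The training tweets below are hypothetical; a real system would collect them via the Twitter API.

```python
import math
import re
from collections import Counter, defaultdict

def tokenize(text):
    # Preprocessing step: lowercase and drop special characters
    return re.sub(r"[^a-z\s]", "", text.lower()).split()

def train_nb(samples):
    """samples: list of (text, label). Returns a classify(text) function."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in samples:
        label_counts[label] += 1
        for w in tokenize(text):
            word_counts[label][w] += 1
            vocab.add(w)
    total = sum(label_counts.values())

    def classify(text):
        best, best_lp = None, -math.inf
        for label in label_counts:
            lp = math.log(label_counts[label] / total)      # class prior
            denom = sum(word_counts[label].values()) + len(vocab)
            for w in tokenize(text):
                lp += math.log((word_counts[label][w] + 1) / denom)  # Laplace smoothing
            if lp > best_lp:
                best, best_lp = label, lp
        return best
    return classify

# Hypothetical labelled training tweets
train = [("love this phone great battery", "positive"),
         ("great camera love it", "positive"),
         ("terrible service hate the update", "negative"),
         ("hate this battery terrible", "negative"),
         ("it is a phone", "neutral"),
         ("the update arrived today", "neutral")]
classify = train_nb(train)
label = classify("love the great camera")
```

    Counting the labels assigned to a batch of fetched tweets then gives the percentage per category that the paper plots.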

  7. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from a smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct conditions: 1. under normal printing conditions and 2. on recovery after a smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying the levels of inconsistency also in the thickness direction. A correlation technique was employed on the ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
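    The correlation step can be sketched directly: treat the ink-density readings at each torque level as series and compute pairwise Pearson coefficients, where values near 1 indicate a consistent blanket. The density readings below are hypothetical:

```python
import numpy as np

# Hypothetical ink-density readings along a print strip at three torque levels
density = {
    "low":    np.array([1.31, 1.28, 1.30, 1.33, 1.29, 1.32]),
    "medium": np.array([1.30, 1.27, 1.29, 1.31, 1.28, 1.31]),
    "high":   np.array([1.33, 1.30, 1.31, 1.35, 1.30, 1.34]),
}

# Pairwise Pearson correlation between torque levels
r_low_med = np.corrcoef(density["low"], density["medium"])[0, 1]
r_low_high = np.corrcoef(density["low"], density["high"])[0, 1]
```

    A drop in these coefficients after a smash, relative to the normal-printing baseline, would quantify how much the blanket's recovery degraded its consistency.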

  8. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    Science.gov (United States)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss, with each leg monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be bookkept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep-space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
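    A statistical power budget of this kind can be sketched as a Monte Carlo sum of per-mechanism losses in dB; the loss ranges, source power, and requirement below are hypothetical placeholders, not SIM values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical (min, max) loss in dB for each mechanism, sampled uniformly
loss_ranges_db = {
    "design efficiency":    (0.5, 1.0),
    "material attenuation": (0.2, 0.8),
    "misalignment":         (0.1, 1.2),
    "diffraction":          (0.3, 0.6),
    "coupling efficiency":  (0.5, 1.5),
}
total_db = sum(rng.uniform(lo, hi, n) for lo, hi in loss_ranges_db.values())

source_dbm = 0.0       # hypothetical 1 mW source
required_dbm = -5.0    # hypothetical minimum power at the gauge detector
received_dbm = source_dbm - total_db

# Fraction of trials meeting the requirement = numerical confidence level
confidence = np.mean(received_dbm >= required_dbm)
# Margin held at 95% confidence
margin_db = np.percentile(received_dbm, 5) - required_dbm
```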

  9. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process for evaluating the seismic safety of a nuclear power plant; in our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for seismic PRA, while the CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  10. An evaluation of wind turbine blade cross section analysis techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Paquette, Joshua A.; Griffith, Daniel Todd; Laird, Daniel L.; Resor, Brian Ray

    2010-03-01

    The blades of a modern wind turbine are critical components central to capturing and transmitting most of the load experienced by the system. They are complex structural items composed of many layers of fiber and resin composite material and typically, one or more shear webs. Large turbine blades being developed today are beyond the point of effective trial-and-error design of the past and design for reliability is always extremely important. Section analysis tools are used to reduce the three-dimensional continuum blade structure to a simpler beam representation for use in system response calculations to support full system design and certification. One model simplification approach is to analyze the two-dimensional blade cross sections to determine the properties for the beam. Another technique is to determine beam properties using static deflections of a full three-dimensional finite element model of a blade. This paper provides insight into discrepancies observed in outputs from each approach. Simple two-dimensional geometries and three-dimensional blade models are analyzed in this investigation. Finally, a subset of computational and experimental section properties for a full turbine blade are compared.
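    The static-deflection route to beam properties can be illustrated with the textbook cantilever formula; the load, length, and deflection below are hypothetical, and a real blade section requires the full coupled stiffness matrix rather than a single EI value.

```python
# Cantilever tip deflection under an end load: delta = F * L^3 / (3 * E * I),
# so an equivalent bending stiffness can be recovered as EI = F * L^3 / (3 * delta).
def bending_stiffness_from_deflection(force_n, length_m, tip_deflection_m):
    """Equivalent EI (N*m^2) of a cantilever from one static tip-load test."""
    return force_n * length_m**3 / (3.0 * tip_deflection_m)

# Hypothetical static test: 1 kN at the tip of a 10 m section, 25 mm deflection
EI = bending_stiffness_from_deflection(1000.0, 10.0, 0.025)
```

    In the three-dimensional finite-element approach, the "test" is a virtual static load case, and deflections at several stations recover spanwise stiffness distributions rather than one aggregate value.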

  11. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from a smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct conditions: (1) normal printing and (2) recovery after a smash. The experiment devised reveals variations in the tone reproduction properties of each blanket, signifying levels of inconsistency in the thickness direction as well. The correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. After a smash, however, the recovery of a blanket and its consistency were functions of manufacture and torque level. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  12. Activated mechanisms in proteins: a multiple-temperature activation-relaxation technique study

    Science.gov (United States)

    Malek, Rachid; Mousseau, Normand; Derreumaux, Philippe

    2001-03-01

The low-temperature dynamics of proteins is controlled by a complex activated dynamics taking place over long time scales compared with the period of thermal oscillations. In view of the range of relevant time scales, the numerical study of these processes remains a challenge, and numerous methods have been introduced to address this problem. We introduce here a combination of two algorithms, the activation-relaxation technique (ART) [1,2] coupled with the parallel tempering method, and use it to study the structure of the energy landscape around the native state of a 38-residue polypeptide. While ART rapidly samples the local energy landscape, parallel tempering, which sets up exchanges of configurations between simultaneous runs at multiple temperatures, generates a very efficient sampling of energy basins separated by high barriers [3]. Results show the nature of the barriers and local minima surrounding the native state of this 38-residue peptide, modeled with off-lattice OPEP-like interactions [4]. [1] G.T. Barkema and N. Mousseau, PRL 77, 4358 (1996). [2] N. Mousseau and G.T. Barkema, PRE 57, 2419 (1998). [3] E. Marinari and G. Parisi, Europhys. Lett. 19, 451 (1992). [4] Ph. Derreumaux, J. Chem. Phys. 111, 2301 (1999); PRB 85, 206 (2000).
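
    The replica-exchange step that the parallel tempering component relies on can be sketched as follows. This is the generic Metropolis-style acceptance rule (with Boltzmann's constant set to 1), not code from the cited work:

```python
import math
import random

def swap_accept(e_i, e_j, t_i, t_j):
    """Metropolis criterion for exchanging configurations between two
    replicas at temperatures t_i and t_j (k_B = 1) with energies e_i, e_j.
    The swap is accepted with probability min(1, exp(delta)), where
    delta = (1/t_i - 1/t_j) * (e_i - e_j)."""
    delta = (1.0 / t_i - 1.0 / t_j) * (e_i - e_j)
    return delta >= 0 or random.random() < math.exp(delta)

# a high-energy configuration at the cold replica is always swapped upward
print(swap_accept(5.0, 3.0, 1.0, 2.0))
```

    Swaps that lower the cold replica's energy are always accepted; unfavourable swaps survive with Boltzmann-weighted probability, which is what lets runs cross high barriers.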

  13. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
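
    As a rough illustration of the PCA step in such an MVA workflow (synthetic random data standing in for LIBS spectra, not the authors' pipeline), principal-component scores can be obtained from the SVD of the mean-centered spectral matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(18, 100))      # stand-in for 18 LIBS spectra x 100 channels

Xc = X - X.mean(axis=0)             # mean-center every spectral channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                      # principal-component scores per sample
loadings = Vt                       # spectral loadings per component
explained = s**2 / np.sum(s**2)     # fraction of variance per component

print(scores.shape)
```

    Plotting the first two or three score columns against each other is the usual way rock types are separated visually; PLS differs in that it rotates the decomposition toward covariance with known compositions.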

  14. Constrained optimization techniques for active control of aeroelastic response

    Science.gov (United States)

    Mukhopadhyay, Vivekananda

    1987-01-01

Active control of aeroelastic response is a complex problem in which the designer usually tries to satisfy many design criteria which are often conflicting in nature. To further complicate the design problem, the state space equations describing this type of control problem are usually of high order, involving a large number of states to represent the flexible structure and unsteady aerodynamics. Control laws based on the standard Linear-Quadratic-Gaussian method are of the same high order as the aeroelastic plant and may be difficult to implement in the flight computer. To overcome this disadvantage, a new approach was developed for designing low-order optimized robust control laws. In this approach, a nonlinear programming algorithm is used to search for the values of control law design variables that minimize a performance index while satisfying several inequality constraints that describe the design criteria on the stability robustness and responses. The method is applied to a gust load alleviation problem and a stability robustness improvement problem of a drone aircraft.

  15. Comparative Analysis of Vehicle Make and Model Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Faiza Ayub Syed

    2014-03-01

Full Text Available Vehicle Make and Model Recognition (VMMR) has emerged as a significant element of vision-based systems because of its application in access control systems, traffic control and monitoring systems, security systems and surveillance systems, etc. So far a number of techniques have been developed for vehicle recognition, each following a different methodology and classification approach. The evaluation results highlight the recognition technique with the highest accuracy level. In this paper we describe the working of various vehicle make and model recognition techniques and compare these techniques on the basis of methodology, principles, classification approach, classifier and level of recognition. After comparing these factors we conclude that Locally Normalized Harris Corner Strengths (LHNS) performs best compared to the other techniques. LHNS uses Bayes and k-NN classification approaches for vehicle classification, and it extracts information from the frontal view of vehicles for make and model recognition.

  16. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600 to 750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis, and a fluorometric discrimination technique for determining phytoplankton populations was developed. For laboratory simulated mixed samples prepared from the 43 algal species (the algae of one division accounting for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the division level were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for samples mixed from 32 red tide algal species (the dominant species accounting for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the genus level were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples in which the dominant species accounted for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the division and genus levels, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay, Qingdao, in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algae fluorescence auto-analyzer for
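
    A toy sketch of the general idea, a wavelet-smoothed reference library searched by correlation, might look like the following (one-level Haar transform and synthetic spectra; the paper's actual database and discrimination rules are far more elaborate):

```python
import numpy as np

def haar_approx(signal):
    """One-level Haar approximation coefficients (scaled pairwise averages)."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:
        s = s[:-1]
    return (s[0::2] + s[1::2]) / np.sqrt(2)

def best_match(unknown, database):
    """Return the library key whose reference spectrum correlates best
    with the unknown after Haar smoothing."""
    u = haar_approx(unknown)
    return max(database,
               key=lambda k: np.corrcoef(u, haar_approx(database[k]))[0, 1])

# two synthetic reference "spectra" and a noisy unknown resembling "A"
ref = {"A": np.sin(np.linspace(0, 3, 64)), "B": np.cos(np.linspace(0, 3, 64))}
noisy = ref["A"] + 0.05 * np.random.default_rng(1).normal(size=64)
print(best_match(noisy, ref))
```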

  17. A computerised morphometric technique for the analysis of intimal hyperplasia.

    OpenAIRE

    Tennant, M; McGeachie, J K

    1991-01-01

    The aim of this study was to design, develop and employ a method for the acquisition of a significant data base of thickness measurements. The integration of standard histological techniques (step serial sectioning), modern computer technology and a personally developed software package (specifically designed for thickness measurement) produced a novel technique suitable for the task. The technique allowed the elucidation of a larger data set from tissue samples. Thus a detailed and accurate ...

  18. COMPARATIVE ANALYSIS OF SATELLITE IMAGE PRE-PROCESSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Sree Sharmila

    2013-01-01

Full Text Available Satellite images are corrupted by noise during acquisition and transmission. Removing the noise by attenuating the high-frequency image components removes some important details as well. In order to retain the useful information and improve the visual appearance, effective denoising and resolution enhancement techniques are required. In this research, a Hybrid Directional Lifting (HDL) technique is proposed to retain the important details of the image and improve its visual appearance. A Discrete Wavelet Transform (DWT) based interpolation technique is developed to enhance the resolution of the denoised image. The performance of the proposed techniques is tested on Land Remote-Sensing Satellite (LANDSAT) images using the quantitative performance measure Peak Signal to Noise Ratio (PSNR) and the computation time. The PSNR of the HDL technique is 1.02 dB higher than that of the standard denoising technique, and the DWT-based interpolation technique adds a further 3.94 dB. The experimental results reveal that the newly developed image denoising and resolution enhancement techniques improve the image visual quality with rich textures.
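
    The PSNR figure of merit quoted above is straightforward to compute; a minimal sketch (8-bit peak value assumed):

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB between two images of equal shape."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

a = np.full((8, 8), 100.0)
b = a + 5.0                       # uniform error of 5 gray levels -> MSE = 25
print(round(psnr(a, b), 2))       # -> 34.15
```

    A gain of 3.94 dB, as reported for the DWT-based interpolation, corresponds to the mean squared error shrinking by a factor of about 2.5.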

  19. Biomechanical analysis of cross-country skiing techniques.

    Science.gov (United States)

    Smith, G A

    1992-09-01

    The development of new techniques for cross-country skiing based on skating movements has stimulated biomechanical research aimed at understanding the various movement patterns, the forces driving the motions, and the mechanical factors affecting performance. Research methods have evolved from two-dimensional kinematic descriptions of classic ski techniques to three-dimensional analyses involving measurement of the forces and energy relations of skating. While numerous skiing projects have been completed, most have focused on either the diagonal stride or the V1 skating technique on uphill terrain. Current understanding of skiing mechanics is not sufficiently complete to adequately assess and optimize an individual skier's technique.

  20. Effects on hamstring muscle extensibility, muscle activity, and balance of different stretching techniques.

    Science.gov (United States)

    Lim, Kyoung-Il; Nam, Hyung-Chun; Jung, Kyoung-Sim

    2014-02-01

    [Purpose] The purpose of this study was to investigate the effects of two different stretching techniques on range of motion (ROM), muscle activation, and balance. [Subjects] For the present study, 48 adults with hamstring muscle tightness were recruited and randomly divided into three groups: a static stretching group (n=16), a PNF stretching group (n=16), a control group (n=16). [Methods] Both of the stretching techniques were applied to the hamstring once. Active knee extension angle, muscle activation during maximum voluntary isometric contraction (MVC), and static balance were measured before and after the application of each stretching technique. [Results] Both the static stretching and the PNF stretching groups showed significant increases in knee extension angle compared to the control group. However, there were no significant differences in muscle activation or balance between the groups. [Conclusion] Static stretching and PNF stretching techniques improved ROM without decrease in muscle activation, but neither of them exerted statistically significant effects on balance.

  1. Effects on Hamstring Muscle Extensibility, Muscle Activity, and Balance of Different Stretching Techniques

    Science.gov (United States)

    Lim, Kyoung-Il; Nam, Hyung-Chun; Jung, Kyoung-Sim

    2014-01-01

    [Purpose] The purpose of this study was to investigate the effects of two different stretching techniques on range of motion (ROM), muscle activation, and balance. [Subjects] For the present study, 48 adults with hamstring muscle tightness were recruited and randomly divided into three groups: a static stretching group (n=16), a PNF stretching group (n=16), a control group (n=16). [Methods] Both of the stretching techniques were applied to the hamstring once. Active knee extension angle, muscle activation during maximum voluntary isometric contraction (MVC), and static balance were measured before and after the application of each stretching technique. [Results] Both the static stretching and the PNF stretching groups showed significant increases in knee extension angle compared to the control group. However, there were no significant differences in muscle activation or balance between the groups. [Conclusion] Static stretching and PNF stretching techniques improved ROM without decrease in muscle activation, but neither of them exerted statistically significant effects on balance. PMID:24648633

  2. HPLC-MS technique for radiopharmaceuticals analysis and quality control

    Science.gov (United States)

    Macášek, F.; Búriová, E.; Brúder, P.; Vera-Ruiz, H.

    2003-01-01

Potentialities of liquid chromatography with a mass spectrometric detector (MSD) were investigated with the objective of quality control of radiopharmaceuticals, with 2-deoxy-2-[18F]fluoro-D-glucose (FDG) as an example. A screening of suitable MSD analytical lines is presented. Mass-spectrometric monitoring of the acetonitrile-aqueous ammonium formate eluant via negatively charged FDG.HCO2- ions enables isotope analysis (specific activity) of the radiopharmaceutical at m/z 227 and 226. Kryptofix 222 provides an intense MSD signal of the positive ion associated with NH4+ at m/z 394. Expired FDG injection samples contain decomposition products, of which at least one is labelled with 18F and characterised by a signal of negative ions at m/z 207; it does not correspond to FDG fragments but to C5 decomposition products. A glucose chromatographic peak, characterised by the m/z 225 negative ion, is accompanied by a tail of a component giving a signal at m/z 227, which can belong to [18O]glucose; isobaric sorbitol signals were excluded, but FDG-glucose association occurs on co-elution in separations of model mixtures. The latter can lead to a convoluted chromatographic peak, but the absence of 18F makes this assignment inconsistent. Quantification and validation of the FDG component analysis is under way.

  3. A Review of Emerging Analytical Techniques for Objective Physical Activity Measurement in Humans.

    Science.gov (United States)

    Clark, Cain C T; Barnes, Claire M; Stratton, Gareth; McNarry, Melitta A; Mackintosh, Kelly A; Summers, Huw D

    2017-03-01

    Physical inactivity is one of the most prevalent risk factors for non-communicable diseases in the world. A fundamental barrier to enhancing physical activity levels and decreasing sedentary behavior is limited by our understanding of associated measurement and analytical techniques. The number of analytical techniques for physical activity measurement has grown significantly, and although emerging techniques may advance analyses, little consensus is presently available and further synthesis is therefore required. The objective of this review was to identify the accuracy of emerging analytical techniques used for physical activity measurement in humans. We conducted a search of electronic databases using Web of Science, PubMed, and Google Scholar. This review included studies written in English and published between January 2010 and December 2014 that assessed physical activity using emerging analytical techniques and reported technique accuracy. A total of 2064 papers were initially retrieved from three databases. After duplicates were removed and remaining articles screened, 50 full-text articles were reviewed, resulting in the inclusion of 11 articles that met the eligibility criteria. Despite the diverse nature and the range in accuracy associated with some of the analytic techniques, the rapid development of analytics has demonstrated that more sensitive information about physical activity may be attained. However, further refinement of these techniques is needed.

  4. Determination of whole-body nitrogen and radiation assessment using in vivo prompt gamma activation technique.

    Science.gov (United States)

    Chung, C; Wei, Y Y; Chen, Y Y

    1993-06-01

    Body nitrogen content in the phantom is measured by semiconducting and scintillation spectrometers using in vivo prompt gamma-ray activation analysis technique. The effective dose rate equivalents for sensitive organs and tissues inside the phantom are assessed by dosimetric measurement and neutron transport calculation. The bismuth germanate scintillator is found superior to the germanium semiconducting detector to quantitatively measure the photopeak of the 10.829 MeV prompt gamma-ray emitted from the 14N(n, gamma) reaction. Recommended scanning period for current setup using the BGO detector is 1 h on the modified mobile nuclear reactor. The effective dose equivalents from both neutrons and gamma-rays are estimated around 63 microSv per scan in the phantom test, making it a safe and reliable nuclear analytical method for in vivo body nitrogen measurement.

  5. Determination of whole-body nitrogen and radiation assessment using in vivo prompt gamma activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Chien Chung; Yuanyaw Wei; Yayu Chen (National Tsing Hua Univ., Hsinchu, Taiwan (China). Inst. of Nuclear Science)

    1993-06-01

Body nitrogen content in the phantom is measured by semiconducting and scintillation spectrometers using in vivo prompt gamma-ray activation analysis technique. The effective dose rate equivalents for sensitive organs and tissues inside the phantom are assessed by dosimetric measurement and neutron transport calculation. The bismuth germanate scintillator is found superior to the germanium semiconducting detector to quantitatively measure the photopeak of the 10.829 MeV prompt gamma-ray emitted from the 14N(n, gamma) reaction. Recommended scanning period for current setup using the BGO detector is 1 h on the modified mobile nuclear reactor. The effective dose equivalents from both neutrons and gamma-rays are estimated around 63 microSv per scan in the phantom test, making it a safe and reliable nuclear analytical method for in vivo body nitrogen measurement. (author).

  6. Structural analysis of irradiated crotoxin by spectroscopic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina C. de; Fucase, Tamara M.; Silva, Ed Carlos S. e; Chagas, Bruno B.; Buchi, Alisson T.; Viala, Vincent L.; Spencer, Patrick J.; Nascimento, Nanci do, E-mail: kcorleto@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Biotecnologia

    2013-07-01

Snake bites are a serious public health problem, especially in subtropical countries. In Brazil, the serum, the only effective treatment in case of snake bites, is produced in horses which, despite their large size, have a reduced lifespan due to the high toxicity of the antigen. Ionizing radiation has been successfully employed to attenuate the biological activity of animal toxins. Crotoxin, the main toxic compound from Crotalus durissus terrificus (Cdt), is a heterodimeric protein composed of two subunits: crotapotin and phospholipase A2. Previous data indicated that this protein, following the irradiation process, undergoes unfolding and/or aggregation, resulting in a much less toxic antigen. The exact mechanisms and structural modifications involved in the aggregation process are not yet clear. This work investigates the effects of ionizing radiation on crotoxin employing Infrared Spectroscopy, Circular Dichroism and Dynamic Light Scattering techniques. The infrared spectrum of lyophilized crotoxin showed peaks corresponding to the vibrational spectra of the secondary structure of crotoxin, including β-sheet, random coil, α-helix and β-turns. We calculated the areas of these spectral regions after baseline adjustment and normalization using the amide I band (1590-1700 cm⁻¹), obtaining the variation of the secondary structures of the toxin following irradiation. The Circular Dichroism spectra of native and irradiated crotoxin suggest a conformational change within the molecule after the irradiation process, apparently from an ordered conformation towards a random coil. The analysis by light scattering indicated that the irradiated crotoxin formed multimers with an average molecular radius 100-fold larger than that of the native toxin. (author)
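
    The band-area calculation described, integrating assigned regions of the amide I band and normalizing by the total, can be sketched as follows. The Gaussian band positions and widths below are illustrative stand-ins at commonly cited amide I assignments, not the study's fitted values:

```python
import numpy as np

# synthetic baseline-corrected amide I region (1590-1700 cm^-1)
wavenumber = np.linspace(1590, 1700, 221)

def band(center, width=8.0):
    """Gaussian band shape at a given center wavenumber."""
    return np.exp(-((wavenumber - center) / width) ** 2)

# illustrative band assignments, one per secondary-structure element
assignments = {"beta-sheet": 1630.0, "random coil": 1645.0,
               "alpha-helix": 1655.0, "beta-turn": 1680.0}
spectrum = sum(band(c) for c in assignments.values())

dx = wavenumber[1] - wavenumber[0]
total_area = spectrum.sum() * dx            # area of the full amide I band
fractions = {name: band(c).sum() * dx / total_area
             for name, c in assignments.items()}
print({k: round(v, 3) for k, v in fractions.items()})
```

    Comparing these fractional areas before and after irradiation is what quantifies the shift from ordered conformations toward random coil.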

  7. Advanced patch-clamp techniques and single-channel analysis

    NARCIS (Netherlands)

    Biskup, B; Elzenga, JTM; Homann, U; Thiel, G; Wissing, F; Maathuis, FJM

    1999-01-01

    Much of our knowledge of ion-transport mechanisms in plant cell membranes comes from experiments using voltage-clamp. This technique allows the measurement of ionic currents across the membrane, whilst the voltage is held under experimental control. The patch-clamp technique was developed to study t

  8. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  10. Error Analysis for the Airborne Direct Georeferencing Technique

    Science.gov (United States)

    Elsharkawy, Ahmed S.; Habib, Ayman F.

    2016-10-01

Direct georeferencing was shown to be an important alternative to standard indirect image orientation using classical or GPS-supported aerial triangulation. Since direct georeferencing without ground control relies on an extrapolation process only, particular focus has to be laid on the overall system calibration procedure. The accuracy performance of integrated GPS/inertial systems for direct georeferencing in airborne photogrammetric environments has been tested extensively in recent years. In this approach, the limiting factor is a correct overall system calibration, including the GPS/inertial component as well as the imaging sensor itself; remaining errors in the system calibration will significantly decrease the quality of object point determination. This research paper presents an error analysis for the airborne direct georeferencing technique, in which integrated GPS/IMU positioning and navigation systems are used in conjunction with aerial cameras for airborne mapping, compared with GPS/INS-supported AT, through the introduction of controlled errors into the EOP and boresight parameters and study of the effect of these errors on the final ground coordinates. The data set is a block of 32 images distributed over six flight lines; the interior orientation parameters (IOP) are known through a careful camera calibration procedure, and 37 ground control points are known through a terrestrial surveying procedure. The exact location of the camera station at the time of exposure, the exterior orientation parameters (EOP), is known through the GPS/INS integration process. The preliminary results show that, first, DG and GPS-supported AT have similar accuracy and, compared with the conventional aerial photography method, both technologies reduce the dependence on ground control (used only for quality control purposes); and second, in DG the critical factor is correcting the overall system calibration, including the GPS/inertial component as well as the imaging sensor itself.
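
    A first-order feel for the error budget: a residual attitude or boresight error of delta radians displaces the image ray by roughly H * delta at the ground, and GPS position error adds directly. A toy sketch with invented illustrative numbers, not the paper's block parameters:

```python
import math

def ground_error(flying_height, attitude_error_deg, gps_error):
    """First-order horizontal ground-point error for direct georeferencing.

    flying_height      : height above ground (m)
    attitude_error_deg : residual attitude/boresight error (degrees)
    gps_error          : GPS position error (m), added directly
    Small-angle approximation: the angular term is H * delta (rad)."""
    angular = flying_height * math.radians(attitude_error_deg)
    return angular + gps_error

# hypothetical: 1500 m flying height, 0.01 deg residual attitude error, 5 cm GPS
print(round(ground_error(1500.0, 0.01, 0.05), 3), "m")
```

    This linearized budget is why a small uncorrected boresight error dominates at typical photogrammetric flying heights, consistent with the paper's emphasis on overall system calibration.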

  11. Qualitative analysis of Orzooiyeh plain groundwater resources using GIS techniques

    Directory of Open Access Journals (Sweden)

    Mohsen Pourkhosravani

    2016-09-01

Full Text Available Background: Unsustainable development of human societies, especially in arid and semi-arid areas, is one of the most important environmental hazards; it requires the preservation of groundwater resources and the permanent study of their qualitative and quantitative changes through sampling. Accordingly, this research attempts to assess and analyze the spatial variation of quantitative and qualitative indicators of the Orzooiyeh groundwater resources in Kerman province using a geographic information system (GIS). Methods: This study surveys the spatial analysis of these indices using GIS techniques, in addition to evaluating the quality of the groundwater resources in the study area. For this purpose, data on quality indicators such as electrical conductivity, pH, sulphate, total dissolved solids (TDS), sodium, calcium, magnesium and chlorine from 28 selected wells sampled by the Kerman Regional Water Organization were used. Results: A comparison of the present research results with the national standard of Iran and with the World Health Organization (WHO) guidelines shows that, among the measured indices, electrical conductivity and TDS in the chosen samples are higher than the national standard of Iran and the WHO guideline, but the other indices are more favourable. Conclusion: The results showed that the electrical conductivity of 64.3% of the samples was at an optimal level, 71.4% were within the Iranian national standard limit, and only 3.6% met the WHO standard. The TDS index did not reach the optimal national level in any of the samples, and in 82.1% of the samples this index was at the national standard limit; only 32.1% of the samples met the WHO standard for this index.

  12. Seismic Hazard Analysis Using the Adaptive Kernel Density Estimation Technique for Chennai City

    Science.gov (United States)

    Ramanna, C. K.; Dodagoudar, G. R.

    2012-01-01

Conventional probabilistic seismic hazard analysis (PSHA) using the Cornell-McGuire approach requires identification of homogeneous source zones as the first step. This criterion brings along many issues and, hence, several alternative methods of hazard estimation have come up in the last few years, such as zoneless or zone-free methods and modelling of the Earth's crust using numerical methods with finite element analysis. Delineating a homogeneous source zone in regions of distributed and/or diffused seismicity is a rather difficult task. In this study, the zone-free method using the adaptive kernel technique is explored for hazard estimation in regions having distributed and diffused seismicity. Chennai city lies in such a region of low to moderate seismicity, so it has been used as a case study. The adaptive kernel technique is statistically superior to the fixed kernel technique primarily because the bandwidth of the kernel is varied spatially depending on the clustering or sparseness of the epicentres. Although the fixed kernel technique has proven to work well in general density estimation cases, it fails to perform in the case of multimodal and long-tail distributions. In such situations, the adaptive kernel technique serves the purpose and is more relevant in earthquake engineering, as the activity rate probability density surface is multimodal in nature. The peak ground acceleration (PGA) obtained from all three approaches (the Cornell-McGuire approach and the fixed and adaptive kernel techniques) for 10% probability of exceedance in 50 years is around 0.087 g. The uniform hazard spectra (UHS) are also provided for different structural periods.
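
    The adaptive kernel idea, narrow kernels where epicentres cluster and wide kernels where they are sparse, can be sketched in one dimension with Abramson-style local bandwidths. This is an illustration of the general technique only; the study itself estimates spatial activity rates:

```python
import numpy as np

def adaptive_kde(points, grid, h0=0.5, alpha=0.5):
    """Abramson-style adaptive Gaussian kernel density estimate in 1-D.

    A pilot fixed-bandwidth estimate at the data points sets local
    bandwidths: dense clusters get narrow kernels, sparse regions wide ones.
    h0 and alpha (sensitivity exponent) are illustrative defaults."""
    points = np.asarray(points, dtype=float)
    d = points[:, None] - points[None, :]
    pilot = np.mean(np.exp(-0.5 * (d / h0) ** 2), axis=1) / (h0 * np.sqrt(2 * np.pi))
    g = np.exp(np.mean(np.log(pilot)))       # geometric mean of pilot densities
    h = h0 * (pilot / g) ** (-alpha)         # local bandwidth per data point
    u = (grid[:, None] - points[None, :]) / h
    return np.mean(np.exp(-0.5 * u**2) / (h * np.sqrt(2 * np.pi)), axis=1)

pts = np.array([0.0, 0.1, 0.2, 5.0])         # a tight cluster plus an outlier
x = np.linspace(-2, 8, 201)
density = adaptive_kde(pts, x)
```

    The isolated point at 5.0 receives a larger bandwidth than the clustered points, exactly the behaviour that rescues long-tail, multimodal epicentre distributions.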

  13. Effects of nanosuspension and inclusion complex techniques on the in vitro protease inhibitory activity of naproxen

    OpenAIRE

Dharmalingam, Senthil Rajan; Chidambaram, Kumarappan; Ramamurthy, Srinivasan; Nadaraju, Shamala

    2014-01-01

This study investigated the effects of nanosuspension and inclusion complex techniques on the in vitro trypsin inhibitory activity of naproxen, a member of the propionic acid derivatives, which are a group of antipyretic, analgesic, and non-steroidal anti-inflammatory drugs. Nanosuspension and inclusion complex techniques were used to increase the solubility and anti-inflammatory efficacy of naproxen. The evaporative precipitation into aqueous solution (EPAS) technique and the kneading metho...

  14. Using Metadata Analysis and Base Analysis Techniques in a Data Quality Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

Full Text Available The information provided by an organization's application systems is vital for decision making. The quality of the data provided by a Data Warehouse (DW) is therefore critically important if an organization is to produce the best solutions and move forward. DWs are complex systems that have to deliver highly aggregated, high-quality data from heterogeneous sources to decision makers, and they involve extensive integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques compare target values with current values obtained from the systems. A prototype supporting the Base Analysis Technique was developed in PHP, and a sample schema from an Oracle database was used to study the differences between applying the framework and not applying it. The prototype was demonstrated to selected organizations to identify whether it helps to reduce DQ problems, and questionnaires were given to respondents. Results: The results show that users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: Implementation of the suggested framework in a real situation should be conducted to obtain more accurate results.
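
    A hypothetical sketch of the two kinds of check, comparing declared metadata against the live schema, and target measure values against current ones. All names and thresholds below are invented for illustration; the paper's prototype is a PHP application:

```python
# declared metadata vs. what the warehouse currently holds (invented values)
expected_meta = {"rows": 1000, "columns": ["id", "amount", "date"]}
actual_meta   = {"rows": 970,  "columns": ["id", "amount", "date"]}

def metadata_issues(expected, actual):
    """Metadata analysis: flag mismatches between declared metadata
    and the live schema/row counts."""
    issues = []
    if actual["rows"] < expected["rows"]:
        issues.append(f"row count {actual['rows']} below expected {expected['rows']}")
    if actual["columns"] != expected["columns"]:
        issues.append("column list differs from metadata")
    return issues

def base_issues(target_values, current_values, tolerance=0.05):
    """Base analysis: flag measures whose current value drifts from the
    target by more than the given relative tolerance."""
    return [k for k in target_values
            if abs(current_values[k] - target_values[k]) > tolerance * abs(target_values[k])]

print(metadata_issues(expected_meta, actual_meta))
print(base_issues({"avg_amount": 120.0}, {"avg_amount": 131.0}))
```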

  15. Development of active control technique for engine noise. Engine soon no active seigyo gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Uchida, H.; Nakao, N.; Butsuen, T. (Mazda Motor Corp., Hiroshima (Japan))

    1994-03-31

    As a measure to reduce engine noise in a car, the active noise control (ANC) technique, which cancels noise with a second noise of opposite phase, has been studied. The conventional filtered-x LMS control algorithm is generally applied to ANC, but the large amount of arithmetic required for the filtering is a practical problem. This paper proposes a new algorithm whose control performance and practicality are improved by exploiting the periodicity of engine noise and by introducing the idea of error scanning. This algorithm requires only 30-50% of the arithmetic operations of the LMS method above. Concerning the actual system structure, the arrangement and number of microphones were examined based on detailed measurements of the spatial distribution of noise in a car. As a result, a suitable arrangement of only three microphones was found that reduces noise over the whole interior space of the car. In experiments, a maximum noise reduction of 8 dB (A scale) was achieved at each seat position. 7 refs., 9 figs., 1 tab.
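    The filtered-x LMS baseline that the proposed method improves upon can be sketched as follows. This is a generic single-channel FxLMS simulation, not the paper's error-scanning algorithm; the path coefficients, tone frequency and step size are illustrative assumptions:

```python
import numpy as np

fs, n = 1000, 4000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 60 * t)           # reference: engine-order tone
P = np.array([0.0, 0.9, 0.4])            # primary path (illustrative FIR)
S = np.array([0.0, 0.7])                 # secondary path (assumed known)

d = np.convolve(x, P)[:n]                # noise at the error microphone
L, mu = 8, 0.02
w = np.zeros(L)                          # adaptive controller taps
xf = np.convolve(x, S)[:n]               # reference filtered by S-hat
e_hist = np.zeros(n)
y_buf = np.zeros(len(S))                 # recent controller outputs

for i in range(L, n):
    xv = x[i - L + 1:i + 1][::-1]
    y = w @ xv                           # anti-noise sample
    y_buf = np.roll(y_buf, 1)
    y_buf[0] = y
    e = d[i] + S @ y_buf                 # residual at the microphone
    w -= mu * e * xf[i - L + 1:i + 1][::-1]   # filtered-x LMS update
    e_hist[i] = e

before = np.mean(e_hist[L:500] ** 2)
after = np.mean(e_hist[-500:] ** 2)
print(before, after)                     # residual power drops after adaptation
```

    Each iteration costs on the order of L multiply-accumulates for the filtering and L for the update, which is the arithmetic load the paper's algorithm reduces.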

  16. ESTIMATION OF ACTIVATED ENERGY OF DESORPTION OF n-HEXANE ON ACTIVATED CARBONS BY TPD TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    LI Zhong; WANG Hongjuan; et al.

    2001-01-01

    In this paper, six kinds of activated carbons, namely Ag+-activated carbon, Cu2+-activated carbon, Fe3+-activated carbon, activated carbon, Ba2+-activated carbon and Ca2+-activated carbon, were prepared. A model for estimating the activation energy of desorption was established. Temperature-programmed desorption (TPD) experiments were conducted to measure the TPD curves of n-hexanol and then estimate the activation energy for desorption of n-hexanol on the activated carbons. Results showed that the activation energies for the desorption of n-hexanol on the Ag+-activated carbon, the Cu2+-activated carbon and the Fe3+-activated carbon were higher than those of n-hexanol on the activated carbon, the Ca2+-activated carbon and the Ba2+-activated carbon.

  17. ESTIMATION OF ACTIVATED ENERGY OF DESORPTION OF n-HEXANE ON ACTIVATED CARBONS BY TPD TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In this paper, six kinds of activated carbons, namely Ag+-activated carbon, Cu2+-activated carbon, Fe3+-activated carbon, activated carbon, Ba2+-activated carbon and Ca2+-activated carbon, were prepared. A model for estimating the activation energy of desorption was established. Temperature-programmed desorption (TPD) experiments were conducted to measure the TPD curves of n-hexanol and then estimate the activation energy for desorption of n-hexanol on the activated carbons. Results showed that the activation energies for the desorption of n-hexanol on the Ag+-activated carbon, the Cu2+-activated carbon and the Fe3+-activated carbon were higher than those of n-hexanol on the activated carbon, the Ca2+-activated carbon and the Ba2+-activated carbon.
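    The abstract does not give the estimation model itself; a standard way to obtain a desorption activation energy from a TPD peak is Redhead's first-order relation, sketched here with illustrative numbers (the pre-exponential factor of 1e13 s^-1 is a common assumption, not a value from the paper):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def redhead_activation_energy(Tp, beta, nu=1e13):
    """Redhead estimate of the desorption activation energy (first-order kinetics).

    Tp   : peak temperature of the TPD curve (K)
    beta : linear heating rate (K/s)
    nu   : pre-exponential factor (1/s), commonly assumed ~1e13

    E = R * Tp * (ln(nu * Tp / beta) - 3.64); the constant 3.64 is
    Redhead's approximation, valid for nu/beta roughly in 1e8..1e13 K^-1.
    """
    return R * Tp * (math.log(nu * Tp / beta) - 3.64)

# Illustrative numbers (not from the paper): a TPD peak at 420 K
# recorded with a heating rate of 10 K/min.
E = redhead_activation_energy(Tp=420.0, beta=10 / 60)
print(round(E / 1000, 1), "kJ/mol")
```

    A higher peak temperature at the same heating rate translates into a higher activation energy, which is the trend reported for the metal-ion-exchanged carbons.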

  18. Getting the Most Out of Dual-Listed Courses: Involving Undergraduate Students in Discussion Through Active Learning Techniques

    Science.gov (United States)

    Tasich, C. M.; Duncan, L. L.; Duncan, B. R.; Burkhardt, B. L.; Benneyworth, L. M.

    2015-12-01

    Dual-listed courses will persist in higher education because of resource limitations. The pedagogical differences between undergraduate and graduate STEM student groups and the underlying distinction in intellectual development levels between the two student groups complicate the inclusion of undergraduates in these courses. Active learning techniques are a possible remedy to the hardships undergraduate students experience in graduate-level courses. Through an analysis of both undergraduate and graduate student experiences while enrolled in a dual-listed course, we implemented a variety of learning techniques used to complement the learning of both student groups and enhance deep discussion. Here, we provide details concerning the implementation of four active learning techniques - role play, game, debate, and small group - that were used to help undergraduate students critically discuss primary literature. Student perceptions were gauged through an anonymous, end-of-course evaluation that contained basic questions comparing the course to other courses at the university and other salient aspects of the course. These were given as a Likert scale on which students rated a variety of statements (1 = strongly disagree, 3 = no opinion, and 5 = strongly agree). Undergraduates found active learning techniques to be preferable to traditional techniques with small-group discussions being rated the highest in both enjoyment and enhanced learning. The graduate student discussion leaders also found active learning techniques to improve discussion. In hindsight, students of all cultures may be better able to take advantage of such approaches and to critically read and discuss primary literature when written assignments are used to guide their reading. Applications of active learning techniques can not only address the gap between differing levels of students, but also serve as a complement to student engagement in any science course design.

  19. Complementary medicine for the management of chronic stress: superiority of active versus passive techniques.

    Science.gov (United States)

    Lucini, Daniela; Malacarne, Mara; Solaro, Nadia; Busin, Silvano; Pagani, Massimo

    2009-12-01

    Recent epidemiological data indicate that chronic stress is an important component of cardiovascular risk, implicitly suggesting that stress management might offer a useful complement to orthodox medical treatment and prevention of hypertension. In this context, information on mechanisms, such as subclinical increases in arterial pressure and sympathetic drive, is well documented. Conversely, evidence on methodologies and comparative efficacy needs to be improved. Accordingly, this study was planned to test the autonomic and subjective effects of two popular modalities of stress management. We studied 70 patients complaining of stress-related symptoms, avoiding any potential autonomic confounder, such as established hypertension or drug treatment. Patients were divided into three groups: group I (n = 30) followed a breathing-guided relaxation training (active); group II (n = 15) an oriental massage, shiatsu (passive); and group III (n = 25) a sham intervention. Subjective effects of stress were assessed by validated questionnaires, and autonomic nervous system regulation by spectral analysis of RR interval variability. Factor analysis was used to extract information simultaneously embedded in the subjective and functional data. Although the active group received a greater quantity of treatment than the passive group, the results showed that active relaxation, beyond slightly reducing arterial pressure, may be more effective in relieving symptoms of stress and inducing an improved profile of autonomic cardiovascular regulation than passive massage or a sham intervention. This active technique seems capable of beneficially addressing the individual psychological and physiopathological dimensions of stress simultaneously in clinical settings, with potentially beneficial effects on the cardiovascular risk profile.

  20. Exploring Undergraduates' Perceptions of the Use of Active Learning Techniques in Science Lectures

    Science.gov (United States)

    Welsh, Ashley J.

    2012-01-01

    This paper examines students' mixed perceptions of the use of active learning techniques in undergraduate science lectures. Written comments from over 250 students offered an in-depth view of why students perceive these techniques as helping or hindering their learning and experience. Fourth- and fifth-year students were more likely to view…

  1. Image analysis techniques associated with automatic data base generation.

    Science.gov (United States)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  2. In Vivo Imaging Techniques: A New Era for Histochemical Analysis

    Science.gov (United States)

    Busato, A.; Feruglio, P. Fumene; Parnigotto, P.P.; Marzola, P.; Sbarbati, A.

    2016-01-01

    In vivo imaging techniques can be integrated with classical histochemistry to create an actual histochemistry of water. In particular, Magnetic Resonance Imaging (MRI), an imaging technique primarily used as a diagnostic tool in clinical/preclinical research, has excellent anatomical resolution, unlimited penetration depth and intrinsic soft tissue contrast. Thanks to technological developments, MRI is capable of providing not only morphological information but also, more interestingly, functional, biophysical and molecular information. In this paper we describe the main features of several advanced imaging techniques, such as MRI microscopy, Magnetic Resonance Spectroscopy, functional MRI, Diffusion Tensor Imaging and contrast-enhanced MRI, as a useful support to classical histochemistry. PMID:28076937

  3. Multidimensional scaling technique for analysis of magnetic storms at Indian observatories

    Indian Academy of Sciences (India)

    M Sridharan; A M S Ramasamy

    2002-12-01

    Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of the geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
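    Classical (Torgerson) multidimensional scaling, the basic form of the technique, recovers a low-dimensional configuration of points from a matrix of pairwise dissimilarities. A minimal sketch with synthetic data (the points are invented, not observatory data):

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) multidimensional scaling.

    D : (n, n) matrix of pairwise dissimilarities.
    Returns an (n, k) configuration whose Euclidean distances
    approximate D.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]           # largest eigenvalues first
    L = np.sqrt(np.clip(vals[order], 0, None))
    return vecs[:, order] * L

# Synthetic check: recover a 2-D configuration from its distance matrix.
pts = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [3.0, 4.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D, k=2)
D_rec = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
print(np.allclose(D, D_rec))   # True: distances are reproduced
```

    For storm data, D would hold dissimilarities between H-variation profiles at different observatories, and the recovered axes expose the latitudinal ordering.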

  4. Digital image processing and analysis for activated sludge wastewater treatment.

    Science.gov (United States)

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    Activated sludge systems are generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters such as total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). These tests are conducted in the laboratory and take many hours to yield a final measurement. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of activated sludge but also to predict its future state. Characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge wastewater treatment. In the latter part, additional procedures such as z-stacking and image stitching are introduced for wastewater image preprocessing, which have not previously been used in the context of activated sludge. Different preprocessing and segmentation techniques are proposed, along with a survey of imaging procedures reported in the literature. Finally, the image-analysis-based morphological parameters and their correlation with the monitoring and prediction of activated sludge are discussed. Hence it is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
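    A minimal example of the segmentation step described above: Otsu thresholding of a grey-level image followed by one simple morphological parameter (floc area fraction). The synthetic image and the choice of parameter are illustrative, not taken from the chapter:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the grey level maximizing between-class variance."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                     # class-0 probability
    mu = np.cumsum(p * np.arange(256))       # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0
    return int(np.argmax(sigma_b))

# Synthetic dark-background / bright-floc image (illustrative).
rng = np.random.default_rng(1)
img = rng.normal(40, 8, (64, 64))                  # background
img[16:32, 16:48] = rng.normal(200, 8, (16, 32))   # one bright "floc"
img = np.clip(img, 0, 255).astype(np.uint8)

t = otsu_threshold(img)
mask = img > t
area_fraction = mask.mean()     # a basic morphological parameter
print(t, round(area_fraction, 3))
```

    Real pipelines would follow the thresholding with floc/filament separation and parameters such as equivalent diameter or filament length before correlating them with TSSol, SVI or COD.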

  5. Using high speed smartphone cameras and video analysis techniques to teach mechanical wave physics

    Science.gov (United States)

    Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano

    2017-07-01

    We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses along a spring and the period of transverse standing waves generated in the same spring. These experiments can be helpful in addressing several relevant concepts about the physics of mechanical waves and in overcoming some of the typical student misconceptions in this same field.
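    The pulse-speed measurement described above amounts to reading the pulse position off successive slow-motion frames and fitting a line to position versus time. A sketch with invented frame data (240 fps is a typical smartphone slow-motion rate):

```python
import numpy as np

# Positions of a pulse travelling along a spring, read off successive
# slow-motion frames (illustrative numbers, not the authors' data).
fps = 240.0                                   # smartphone slow-motion frame rate
frames = np.array([0, 8, 16, 24, 32])         # frame indices
x = np.array([0.05, 0.25, 0.44, 0.65, 0.85])  # pulse position (m)

t = frames / fps                              # frame index -> time (s)
speed, intercept = np.polyfit(t, x, 1)        # least-squares slope = speed
print(round(speed, 2), "m/s")
```

    Using a least-squares fit rather than a single pair of frames averages out the reading error in locating the pulse on each frame.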

  6. Applications of surface analysis techniques to photovoltaic research: Grain and grain boundary studies

    Science.gov (United States)

    Kazmerski, L. L.

    Complementary surface analysis techniques (AES, SIMS, XPS) are applied to photovoltaic devices in order to assess the limiting factors of grain and grain boundary chemistry to the performance of polycrystalline solar cells. Results of these compositional and chemical studies are directly correlated with electrical measurements (EBIC) and with resulting device performance. Examples of grain boundary passivation in polycrystalline Si and GaAs solar cells are cited. The quality of the intragrain material used in these devices is shown to be equally important to the grain boundary activity in determining overall photovoltaic performance.

  7. Association of two techniques of frontal sinus radiographic analysis for human identification

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira da SILVA

    2009-09-01

    Full Text Available Introduction: The analysis of images for human identification purposes is a routine activity in departments of forensic medicine, especially when it is necessary to identify burned bodies, skeletal remains or corpses in an advanced stage of decomposition. Case report: The feasibility and reliability of analysing the morphoradiographic image of the frontal sinus is shown, using a posteroanterior (PA) skull radiograph produced during life compared with another produced after death. Conclusion: The results obtained in the radiographic comparison, through the association of two different techniques of frontal sinus analysis, allowed a positive correlation of the identity of the disappeared person with the body in an advanced stage of decomposition.

  8. A new technique for fractal analysis applied to human, intracerebrally recorded, ictal electroencephalographic signals.

    Science.gov (United States)

    Bullmore, E; Brammer, M; Alarcon, G; Binnie, C

    1992-11-09

    Application of a new method of fractal analysis to human, intracerebrally recorded, ictal electroencephalographic (EEG) signals is reported. 'Frameshift-Richardson' (FR) analysis involves estimation of the fractal dimension (1 < FD < 2) of EEG data; it is suggested that this technique offers significant operational advantages over the use of algorithms for FD estimation requiring preliminary reconstruction of EEG data in phase space. FR analysis was found to reduce substantially the volume of EEG data without loss of diagnostically important information concerning onset, propagation and evolution of ictal EEG discharges. Arrhythmic EEG events were correlated with relatively increased FD; rhythmic EEG events with relatively decreased FD. It is proposed that development of this method may lead to: (i) enhanced definition and localisation of initial ictal changes in the EEG presumed due to multi-unit activity; and (ii) synoptic visualisation of long periods of EEG data.
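    The Frameshift-Richardson algorithm itself is not given in the abstract. As an illustration of how a waveform fractal dimension (1 < FD < 2) separates rhythmic from arrhythmic signals, here is the well-known Katz estimator, a different but related FD measure:

```python
import numpy as np

def katz_fd(x):
    """Katz fractal dimension of a 1-D waveform (amplitude-based variant).

    FD = log10(n) / (log10(n) + log10(d / L)), where L is the total
    curve length, d the maximum deviation from the first sample, and
    n the number of steps. A smooth monotone signal gives FD = 1;
    irregular signals give larger values.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - 1
    L = np.abs(np.diff(x)).sum()          # total curve length
    d = np.abs(x - x[0]).max()            # waveform extent
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

line = np.linspace(0, 1, 1000)                 # smooth, "rhythmic" extreme
rng = np.random.default_rng(0)
noise = np.cumsum(rng.standard_normal(1000))   # irregular, "arrhythmic"-like
print(round(katz_fd(line), 3), round(katz_fd(noise), 3))
```

    The qualitative behaviour matches the abstract's finding: the irregular trace yields a markedly higher FD than the smooth one.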

  9. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and preferred uses.

  10. RAPD analysis : a rapid technique for differentation of spoilage yeasts

    NARCIS (Netherlands)

    Baleiras Couto, M.M.; Vossen, J.M.B.M. van der; Hofstra, H.; Huis in 't Veld, J.H.J.

    1994-01-01

    Techniques for the identification of the spoilage yeasts Saccharomyces cerevisiae and members of the Zygosaccharomyces genus from food and beverage sources were evaluated. The use of identification systems based on physiological characteristics often resulted in incomplete identification or misidentification.

  11. Cepstrum Analysis: An Advanced Technique in Vibration Analysis of Defects in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    M. Satyam

    1994-01-01

    Full Text Available Conventional frequency analysis in machinery vibration is not adequate to accurately find defects in gears, bearings and blades where sidebands and harmonics are present, and such an approach is also dependent on the transmission path. On the other hand, cepstrum analysis accurately identifies harmonic and sideband families and is a better technique available for fault diagnosis in gears, bearings and turbine blades of ships and submarines. The cepstrum represents the global power content of a whole family of harmonics and sidebands, even when more than one family of sidebands is present at the same time. It is also insensitive to transmission path effects, since source and transmission path effects are additive and can be separated in the cepstrum. The concept, underlying theory, and the measurement and analysis involved in using the technique are briefly outlined. Two cases are presented to demonstrate the advantage of the cepstrum technique over spectrum analysis. An LP compressor was chosen to study transmission path effects, and a marine gearbox having two sets of sideband families was studied to diagnose the problematic sideband and its severity.
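    The real cepstrum is the inverse Fourier transform of the logarithm of the magnitude spectrum; a family of harmonics or sidebands spaced f0 apart collapses into a single peak at quefrency 1/f0. A sketch on a synthetic harmonic signal (the sampling rate and fundamental are illustrative, not gearbox data):

```python
import numpy as np

fs, N = 8000, 4096
t = np.arange(N) / fs
f0 = 100.0                                    # harmonic spacing (illustrative)
rng = np.random.default_rng(3)
# Harmonic-rich signal plus a little noise, as a sideband family would appear
x = sum(np.sin(2 * np.pi * f0 * k * t) / k for k in range(1, 6))
x = x + 0.01 * rng.standard_normal(N)

spectrum = np.abs(np.fft.rfft(x * np.hanning(N)))
cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))   # real cepstrum

# The whole family of harmonics shows up as one peak near quefrency 1/f0
q = np.arange(N) / fs                         # quefrency axis (s)
search = slice(20, N // 2)                    # skip the low-quefrency envelope
peak_q = q[search][np.argmax(cepstrum[search])]
print(round(1 / peak_q, 1))                   # approximately f0 in Hz
```

    Two sideband families with different spacings would produce two distinct cepstral peaks, which is how the problematic family in the marine gearbox can be singled out.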

  12. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    Science.gov (United States)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. Output of the standard implementations of these visualization techniques in test facilities are often limited only to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. These systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
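    Quantifying unsteady flow frequency content from high-speed imagery reduces, in the simplest case, to taking the FFT of a pixel (or region-averaged) intensity history sampled at the camera frame rate. A sketch on synthetic data (the frame rate and oscillation frequency are invented for the example):

```python
import numpy as np

# Synthetic stand-in for a high-speed schlieren sequence: the grey level
# of one pixel (or a small region average) sampled at the frame rate,
# with an oscillating shock passing at 500 Hz (illustrative values).
frame_rate = 10_000.0                          # frames per second
n_frames = 2048
t = np.arange(n_frames) / frame_rate
rng = np.random.default_rng(2)
intensity = 120 + 30 * np.sin(2 * np.pi * 500 * t) + rng.normal(0, 5, n_frames)

# FFT of the de-meaned intensity history gives the unsteady frequency content
spec = np.abs(np.fft.rfft(intensity - intensity.mean()))
freqs = np.fft.rfftfreq(n_frames, d=1 / frame_rate)
dominant = freqs[np.argmax(spec)]
print(dominant)
```

    The frequency resolution is frame_rate / n_frames, so longer image sequences resolve closely spaced oscillation frequencies better.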

  13. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry is widely applied for identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of obtained peptide-spectrum matches (PSMs needs to be evaluated also by algorithms, as a manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. 
Conclusion Our approach not only enhances the computational performance, and

  14. A novel technique for active vibration control, based on optimal tracking control

    Indian Academy of Sciences (India)

    BEHROUZ KHEIRI SARABI; MANU SHARMA; DAMANJEET KAUR

    2017-08-01

    In the last few decades, researchers have proposed many control techniques to suppress unwanted vibrations in a structure. In this work, a novel and simple technique is proposed for the active vibration control. In this technique, an optimal tracking control is employed to suppress vibrations in a structure by simultaneously tracking zero references for modes of vibration. To illustrate the technique, a two-degrees of freedom spring-mass-damper system is considered as a test system. The mathematical model of the system is derived and then converted into a state-space model. A linear quadratic tracking control law is then used to make the disturbed system track zero references.

  15. A novel technique for active vibration control, based on optimal tracking control

    Science.gov (United States)

    Kheiri Sarabi, Behrouz; Sharma, Manu; Kaur, Damanjeet

    2017-08-01

    In the last few decades, researchers have proposed many control techniques to suppress unwanted vibrations in a structure. In this work, a novel and simple technique is proposed for the active vibration control. In this technique, an optimal tracking control is employed to suppress vibrations in a structure by simultaneously tracking zero references for modes of vibration. To illustrate the technique, a two-degrees of freedom spring-mass-damper system is considered as a test system. The mathematical model of the system is derived and then converted into a state-space model. A linear quadratic tracking control law is then used to make the disturbed system track zero references.
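    A minimal sketch of the idea: with zero references, the linear quadratic tracking law reduces to a standard LQR regulator. Here a two-degrees-of-freedom spring-mass-damper model is discretized with a simple Euler step and the gain is obtained by Riccati iteration; all parameter values are illustrative, not the authors':

```python
import numpy as np

# Two-DOF spring-mass-damper (illustrative parameters); state
# z = [x1, x2, v1, v2], control force applied to mass 1.
m1 = m2 = 1.0; k1 = k2 = 100.0; c1 = c2 = 0.5
A_c = np.array([[0, 0, 1, 0],
                [0, 0, 0, 1],
                [-(k1 + k2) / m1, k2 / m1, -(c1 + c2) / m1, c2 / m1],
                [k2 / m2, -k2 / m2, c2 / m2, -c2 / m2]])
B_c = np.array([[0.0], [0.0], [1.0 / m1], [0.0]])

dt = 0.001                                   # simple Euler discretization
A = np.eye(4) + dt * A_c
B = dt * B_c
Q = np.eye(4); R = np.array([[0.01]])

# Infinite-horizon gain via backward Riccati iteration (DARE fixed point)
P = Q.copy()
for _ in range(5000):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

def simulate(z0, controlled, steps=5000):
    z = z0.copy()
    for _ in range(steps):
        u = -K @ z if controlled else np.zeros(1)  # track the zero reference
        z = A @ z + B @ u
    return z

z0 = np.array([0.1, -0.05, 0.0, 0.0])        # initial displacement disturbance
open_loop = np.linalg.norm(simulate(z0, controlled=False))
closed_loop = np.linalg.norm(simulate(z0, controlled=True))
print(closed_loop < open_loop)               # controller suppresses the vibration
```

    Tracking a nonzero reference would add a feedforward term driven by the reference trajectory; with the reference fixed at zero that term vanishes, which is what makes this formulation so simple.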

  16. Development of HANARO Activation Analysis System and Utilization Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Y. S.; Moon, J. H.; Cho, H. J. (and others)

    2007-06-15

    1. Establishment of evaluation system using a data for a neutron activation analysis : Improvement of NAA measurement system and its identification, Development of combined data evaluation code of NAA/PGAA, International technical cooperation project 2. Development of technique for a industrial application of high precision gamma nuclide spectroscopic analysis : Analytical quality control, Development of industrial application techniques and its identification 3. Industrial application research for a prompt gamma-ray activation analysis : Improvement of Compton suppression counting system (PGAA), Development of applied technology using a PGAA system 4. Establishment of NAA user supporting system and KOLAS management : Development and validation of KOLAS/ISO accreditation testing and identification method, Cooperation researches for a industrial application, Establishment of integrated user analytical supporting system, Accomplishment of sample irradiation facility.

  17. Reticle defect sizing of optical proximity correction defects using SEM imaging and image analysis techniques

    Science.gov (United States)

    Zurbrick, Larry S.; Wang, Lantian; Konicek, Paul; Laird, Ellen R.

    2000-07-01

    Sizing of programmed defects on optical proximity correction (OPC) features is addressed using high resolution scanning electron microscope (SEM) images and image analysis techniques. A comparison and analysis of different sizing methods is made. This paper addresses the issues of OPC defect definition and discusses the experimental measurement results obtained by SEM in combination with image analysis techniques.

  18. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    Full Text Available This scientific article presents a technique for the statistical analysis of a region's investment appeal with respect to foreign direct investment. The technique is defined, its analysis stages are described, and the mathematico-statistical tools are considered.

  19. Application of neutron activation techniques and x-ray energy dispersion spectrometry, in analysis of metallic traces adsorbed by chelex-100 resin; Ativacao das tecnicas de ativacao neutronica e espectrometria por dispersao de onda e de energia de raios X, na analise de tracos metalicos adsorvidos pela resina chelex-100

    Energy Technology Data Exchange (ETDEWEB)

    Fernandes, Jair C.; Amaral, Angela M.; Magalhaes, Jesus C.; Pereira, Jose S.J.; Silva, Juliana B. da; Auler, Lucia M.L.A. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil)]. E-mail: jcf@urano.cdtn.br

    2000-07-01

    In this work, the authors investigated the optimal adsorption conditions for several groups of metal ions (cations of heavy metals and transition metals, metallic and metalloid oxyanions, and cations of rare earths), present as traces (ppb) both separately and in mixtures of groups, on Chelex-100 resin. The experiments were carried out by batch techniques in a 40 mM ammonium acetate buffer solution, pH 5.52, containing 0.5 g of Chelex-100 resin. After magnetic stirring for two hours, the resins were dried and submitted to X-ray energy dispersion spectrometry, X-ray fluorescence spectrometry and neutron activation analysis. The results demonstrated that Chelex-100 resin quantitatively adsorbs the transition element and rare earth groups in both cases (separate and simultaneous adsorption).

  20. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source software, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  1. COMPARATIVE ANALYSIS OF TRAUMA REDUCTION TECHNIQUES IN LAPAROSCOPIC CHOLECYSTECTOMY

    Directory of Open Access Journals (Sweden)

    Anton Koychev

    2017-02-01

    Full Text Available Nowadays there is no operation in the field of abdominal surgery that cannot be performed laparoscopically. Both surgeons and patients have at their disposal an increasing number of laparoscopic techniques for performing surgical interventions. The prevalence of laparoscopic cholecystectomy is due to its undeniable advantages over traditional open surgery, namely its low invasiveness, the reduced frequency and severity of perioperative complications, the incomparably better cosmetic result, and the considerably better medico-social and medico-economic efficiency. Single-port laparoscopic techniques for performing laparoscopic cholecystectomy are an acceptable alternative to the classical conventional multi-port techniques. The safety of laparoscopic cholecystectomy requires precise identification of anatomical structures, precise observance of the diagnostic and treatment protocols, and criteria for the selection of patients to be treated surgically by these methods.

  2. Comparative study of Authorship Identification Techniques for Cyber Forensics Analysis

    Directory of Open Access Journals (Sweden)

    Smita Nirkhi

    2013-06-01

    Full Text Available Authorship identification techniques are used to identify the most probable author of online messages from a group of potential suspects and to find evidence supporting that conclusion. Cybercriminals misuse online communication to send blackmail or spam email and then attempt to hide their true identities to avoid detection. Authorship identification of online messages is a contemporary research issue for identity tracing in cyber forensics. It is a highly interdisciplinary area, as it draws on machine learning, information retrieval and natural language processing. In this paper, a study of recent techniques and automated approaches to attributing authorship of online messages is presented. The focus of this review is to summarize the existing authorship identification techniques used in the literature to identify the authors of online messages. It also discusses evaluation criteria and parameters for authorship attribution studies and lists open questions that will attract future work in this area.

  3. Molecular field analysis (MFA) and other QSAR techniques in development of phosphatase inhibitors.

    Science.gov (United States)

    Nair, Pramod C

    2011-01-01

    Phosphatases are well-known drug targets for diseases such as diabetes, obesity and other autoimmune diseases. Their role in cancer is due to unusual expression patterns in different types of cancer. However, there is strong evidence for selective targeting of phosphatases in cancer therapy. Several experimental and in silico techniques have been attempted for the design of phosphatase inhibitors, with a focus on diseases such as diabetes, inflammation and obesity; their utility for cancer therapy is limited and needs to be explored further. Quantitative Structure-Activity Relationship (QSAR) modeling is a well-established in silico ligand-based drug design technique, used by medicinal chemists for the prediction of ligand binding affinity and for lead design. These techniques have shown promise for the subsequent optimization of already existing lead compounds, with the aim of increased potency and improved pharmacological properties for a particular drug target. Furthermore, their utility in virtual screening and scaffold hopping has been highlighted in recent years. This review focuses on recent molecular field analysis (MFA) and QSAR techniques directed at the design and development of phosphatase inhibitors and their potential use in cancer therapy. In addition, this review addresses issues concerning the binding orientation and binding conformation of ligands for alignment-sensitive QSAR approaches.

  4. OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES: A RECENT SURVEY

    OpenAIRE

    Ms. Kalyani D. Gaikwad*, Prof. Sonawane V.R

    2016-01-01

    Sentiment analysis (also known as opinion mining) refers to the use of natural language processing, text analysis and computational linguistics to identify and extract subjective information in source materials. Sentiment analysis is widely applied to reviews and social media for a variety of applications, ranging from marketing to customer service. The difficulties of performing sentiment analysis in this domain can be overcome by leveraging on common-sense knowledge bases. Opinion Mining is...

  5. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behavior have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behavior. However, data mining methods have disadvantages as well as advantages; therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to show how suitable data mining methods can be applied to improve on conventional methods. Moreover, in an experiment, association rule mining is employed to mine rules for trusted customers using sales data from the supermarket industry.
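
    The association-rule step mentioned above can be sketched with a simplified support/confidence miner over market-basket data (an illustration of the standard technique, not the system used in the paper; the baskets and thresholds are made up):

```python
from itertools import combinations

def association_rules(transactions, min_support=0.3, min_confidence=0.6):
    """Mine one-to-one association rules (X -> Y) from sets of purchased items."""
    n = len(transactions)
    items = {i for t in transactions for i in t}

    def support(itemset):
        # Fraction of baskets containing every item in the itemset.
        return sum(1 for t in transactions if itemset <= t) / n

    rules = []
    for a, b in combinations(sorted(items), 2):
        for x, y in ((a, b), (b, a)):
            supp = support({x, y})
            if supp >= min_support:
                conf = supp / support({x})
                if conf >= min_confidence:
                    rules.append((x, y, round(supp, 2), round(conf, 2)))
    return rules
```

    For example, with baskets [{"bread", "milk"}, {"bread", "milk", "eggs"}, {"bread", "butter"}, {"milk", "eggs"}], the rule bread -> milk has support 0.5 and confidence 0.67.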

  6. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
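
    The information-distribution step the authors describe, changing crisp observations into fuzzy memberships on a grid of monitoring points, can be sketched as follows. The 1 - |x - u|/h weighting is the standard linear information-distribution form; the sample values below are illustrative, not the Shanghai data.

```python
def information_distribution(samples, bins, h):
    """Diffuse each crisp observation over discrete monitoring points.

    Each sample x contributes weight 1 - |x - u|/h to every monitoring point u
    within distance h, turning a small crisp sample into a smoother fuzzy
    frequency distribution that makes better use of scarce data.
    """
    weights = [0.0] * len(bins)
    for x in samples:
        for j, u in enumerate(bins):
            d = abs(x - u)
            if d < h:
                weights[j] += 1 - d / h
    return weights
```

    When the monitoring points are evenly spaced with step h and a sample falls inside the grid, its total distributed weight is exactly 1, so the fuzzy histogram conserves the sample size.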

  7. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  8. Krylov Subspace Method with Communication Avoiding Technique for Linear System Obtained from Electromagnetic Analysis

    National Research Council Canada - National Science Library

    IKUNO, Soichiro; CHEN, Gong; YAMAMOTO, Susumu; ITOH, Taku; ABE, Kuniyoshi; NAKAMURA, Hiroaki

    2016-01-01

    Krylov subspace method and the variable preconditioned Krylov subspace method with communication avoiding technique for a linear system obtained from electromagnetic analysis are numerically investigated. In the k...

  9. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I&C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity analysis guidelines.

  10. ACTIVATION-ENERGY SPECTRA FOR STRESS-INDUCED ORDERING IN AMORPHOUS MATERIALS CALCULATED USING FOURIER TECHNIQUES

    NARCIS (Netherlands)

    KASARDOVA, A; Ocelik, Vaclav; CSACH, K; MISKUF, J

    A method for calculating the activation energy spectrum from isothermal data using Fourier techniques is used for studying the deformation processes in amorphous metals. The influence of experimental error on the calculated spectrum is discussed. The activation energy spectrum derived from the

  11. Reconstructing muscle activation during normal walking: a comparison of symbolic and connectionist machine learning techniques

    NARCIS (Netherlands)

    Heller, Ben W.; Veltink, Peter H.; Rijkhoff, Nico J.M.; Rutten, Wim L.C.; Andrews, Brian J.

    1993-01-01

    One symbolic (rule-based inductive learning) and one connectionist (neural network) machine learning technique were used to reconstruct muscle activation patterns from kinematic data measured during normal human walking at several speeds. The activation patterns (or desired outputs) consisted of sur

  12. Reconstructing muscle activation during normal walking: a comparison of symbolic and connectionist machine learning techniques

    NARCIS (Netherlands)

    Heller, Ben W.; Veltink, Petrus H.; Rijkhoff, N.J.M.; Rijkhoff, Nico J.M.; Rutten, Wim; Andrews, Brian J.

    1993-01-01

    One symbolic (rule-based inductive learning) and one connectionist (neural network) machine learning technique were used to reconstruct muscle activation patterns from kinematic data measured during normal human walking at several speeds. The activation patterns (or desired outputs) consisted of

  13. Applying Modern Techniques and Carrying Out English Extracurriculars: On the Model United Nations Activity

    Institute of Scientific and Technical Information of China (English)

    XuXiaoyu; WangJian

    2004-01-01

    This paper is an introduction to the extracurricular activity of the Model United Nations at Northwestern Polytechnical University (NPU), and it focuses on the application of modern techniques in the activity and the pedagogical theories applied in it. An interview and questionnaire research will reveal the influence of the Model United Nations.

  14. ACTIVATION-ENERGY SPECTRA FOR STRESS-INDUCED ORDERING IN AMORPHOUS MATERIALS CALCULATED USING FOURIER TECHNIQUES

    NARCIS (Netherlands)

    KASARDOVA, A; Ocelik, Vaclav; CSACH, K; MISKUF, J

    1995-01-01

    A method for calculating the activation energy spectrum from isothermal data using Fourier techniques is used for studying the deformation processes in amorphous metals. The influence of experimental error on the calculated spectrum is discussed. The activation energy spectrum derived from the anela

  15. Traveling through potential energy surfaces of disordered materials: the activation-relaxation technique

    NARCIS (Netherlands)

    Mousseau, N.; Barkema, G.T.

    A detailed description of the activation-relaxation technique (ART) is presented. This method defines events in the configurational energy landscape of disordered materials such as amorphous semiconductors, glasses and polymers, in a two-step process: first, a configuration is activated from a local

  16. Applying data-mining techniques in honeypot analysis

    CSIR Research Space (South Africa)

    Veerasamy, N

    2006-07-01

    Full Text Available This paper proposes the use of data mining techniques to analyse the data recorded by a honeypot. This data can also be used to train Intrusion Detection Systems (IDS) in identifying attacks. Since the training is based on real data...

  17. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente;

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10(8) Pa using particle diameters of 1.7 mu m. This increases the efficiency, the resolution and the speed of the separation. Four aque...

  18. Analysis on Poe's Unique Techniques to Achieve Aestheticism

    Institute of Scientific and Technical Information of China (English)

    孔佳鸣

    2008-01-01

    Edgar Allan Poe was one of the most important poets in American poetic history for his unremitting pursuit of 'ideal beauty'. This essay shows, through various examples chosen from his poems, that his aestheticism was evident in his versification techniques. His poetic theory and practice set an immortal example for the development of English poetry.

  19. Tape Stripping Technique for Stratum Corneum Protein Analysis

    DEFF Research Database (Denmark)

    Clausen, Maja-Lisa; Slotved, H.-C.; Krogfelt, Karen Angeliki

    2016-01-01

    The aim of this study was to investigate the amount of protein in stratum corneum in atopic dermatitis (AD) patients and healthy controls, using tape stripping technique. Furthermore, to compare two different methods for protein assessment. Tape stripping was performed in AD patients and healthy ...

  20. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper describes an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.
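
    The idea of a normalized contrast evolution can be sketched as follows. This uses one common normalization, (T_defect - T_reference)/(T_reference - T_ambient), and two simple features (peak contrast and peak frame); it is an illustration of the general approach, not the paper's exact formulation, and all temperature values are made up.

```python
def contrast_evolution(defect, reference, ambient=20.0):
    """Normalized contrast at each video frame: (T_def - T_ref) / (T_ref - T_ambient)."""
    return [(d - r) / (r - ambient) for d, r in zip(defect, reference)]

def contrast_features(contrast):
    """Two simple measurement features of a contrast curve: peak value and peak frame."""
    peak = max(contrast)
    return peak, contrast.index(peak)
```

    Features such as the peak contrast and the time at which it occurs are the kind of quantities that can be matched against simulated evolutions from calibrated flat-bottom holes to estimate anomaly depth and width.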

  1. A technique for detecting antifungal activity of proteins separated by polyacrylamide gel electrophoresis.

    Science.gov (United States)

    De Bolle, M F; Goderis, I J; Terras, F R; Cammue, B P; Broekaert, W F

    1991-06-01

    A technique was developed for the detection of antifungal activity of proteins after discontinuous polyacrylamide gel electrophoresis under native conditions. The antifungal activity is detected as growth inhibition zones in a homogeneous fungal lawn, grown in an agar layer spread on top of the polyacrylamide gel. The position of proteins with antifungal activity can be determined on a diffusion blot prepared from the same gel. The technique is illustrated for three antifungal plant proteins, i.e. alpha-purothionin, Urtica dioica agglutinin, and tobacco chitinase.

  2. Hydrogeological activity of lineaments in Yaoundé Cameroon region using remote sensing and GIS techniques

    Directory of Open Access Journals (Sweden)

    William Teikeu Assatse

    2016-06-01

    Full Text Available Though the Yaoundé zone is characterized by abundant rains, access to safe drinking water has become difficult because of climate change and pollution caused by human activities. Lineament zones on the earth's surface are important elements in understanding the dynamics of subsurface fluid flow. However, good exposures of these features are often lacking in some areas around Yaoundé, which are characterized by thick alteration; during field surveys these conditions in many cases hinder the proper characterization of such features. Therefore, an approach that identifies the regional lineaments on remote-sensing images (Landsat Thematic Mapper and shaded digital terrain models), with its large-scale synoptic coverage, is promising. This paper aims to map the structural organization of the lineament network in the crystalline basement of Yaoundé from remote sensing data and to characterize it by statistical and geostatistical techniques. The results were validated on the basis of geological maps, hydrogeological maps and outcrop data. Statistical analysis of the lineament network shows a distribution along the N0–10, N20–30, N40–60 and N140–150 directions. The correlation between the productivity of high-yield wells and the closest lineament confirms that these lineaments are surface traces of regional discontinuities and act as main groundwater flow paths.

  3. Value of Earth Observations: Key principles and techniques of socioeconomic benefits analysis (Invited)

    Science.gov (United States)

    Friedl, L.; Macauley, M.; Bernknopf, R.

    2013-12-01

    Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.

  4. IN VITRO ANALYSIS OF MIGRATION ACTIVITY OF ENCEPHALYTOGENIC T CELLS

    Directory of Open Access Journals (Sweden)

    M. A. Nosov

    2010-01-01

    Full Text Available Experimental autoimmune encephalomyelitis in an adoptive transfer model is caused by injecting an animal with activated T cells specific for a CNS antigen, e.g., basic myelin protein. Development of autoimmune inflammation in such a model is connected with the changed functional state of encephalytogenic (EG) T cells in the course of disease progression, as reflected by changes in their activation, proliferation and motility levels. The present work describes an original technique allowing for in vitro analysis of encephalytogenic T cell motility, and for studying the effects of certain components of the extracellular matrix upon the migration and functional activities of EG T cells.

  5. MAG4 Versus Alternative Techniques for Forecasting Active-Region Flare Productivity

    Science.gov (United States)

    Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor

    2014-01-01

    MAG4 is a technique of forecasting an active region's rate of production of major flares in the coming few days from a free-magnetic-energy proxy. We present a statistical method of measuring the difference in performance between MAG4 and comparable alternative techniques that forecast an active region's major-flare productivity from alternative observed aspects of the active region. We demonstrate the method by measuring the difference in performance between the "Present MAG4" technique and each of three alternative techniques, called "McIntosh Active-Region Class," "Total Magnetic Flux," and "Next MAG4." We do this by using (1) the MAG4 database of magnetograms and major-flare histories of sunspot active regions, (2) the NOAA table of the major-flare productivity of each of 60 McIntosh active-region classes of sunspot active regions, and (3) five technique-performance metrics (Heidke Skill Score, True Skill Score, Percent Correct, Probability of Detection, and False Alarm Rate) evaluated from 2000 random two-by-two contingency tables obtained from the databases. We find that (1) Present MAG4 far outperforms both McIntosh Active-Region Class and Total Magnetic Flux, (2) Next MAG4 significantly outperforms Present MAG4, (3) the performance of Next MAG4 is insensitive to the forward and backward temporal windows used, in the range of one to a few days, and (4) forecasting from the free-energy proxy in combination with either any broad category of McIntosh active-region classes or any Mount Wilson active-region class gives no significant performance improvement over forecasting from the free-energy proxy alone (Present MAG4).
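
    All five technique-performance metrics named above are computable from a two-by-two contingency table (hits a, false alarms b, misses c, correct nulls d). The sketch below uses the standard forecast-verification formulas; note that "False Alarm Rate" has more than one convention in the literature, and the ratio form used here is an assumption, not necessarily the paper's definition.

```python
def skill_scores(hits, false_alarms, misses, correct_nulls):
    """Forecast-verification metrics from a 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_nulls
    n = a + b + c + d
    pod = a / (a + c)                 # Probability of Detection
    far = b / (a + b)                 # False alarm ratio (one common "FAR" convention)
    pc = (a + d) / n                  # Percent Correct
    tss = a / (a + c) - b / (b + d)   # True Skill Score (Hanssen-Kuipers discriminant)
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n  # chance-agreement count
    hss = (a + d - expected) / (n - expected)               # Heidke Skill Score
    return {"POD": pod, "FAR": far, "PC": pc, "TSS": tss, "HSS": hss}
```

    A perfect forecaster (no misses, no false alarms) scores 1.0 on POD, TSS and HSS, which is why these skill scores, rather than raw accuracy, are used to rank competing forecasting techniques.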

  6. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    Excerpts: HAZOP guide words are adapted for software and system interface analysis (Table 3), and an example system-of-systems architecture table is given (Table 4). The planning step of a software HAZOP establishes the analysis goals, definitions, worksheets, schedule and process; a guide word such as "subtle incorrect" denotes an output whose value is wrong but cannot be detected.

  7. Preliminary analysis techniques for ring and stringer stiffened cylindrical shells

    Science.gov (United States)

    Graham, J.

    1993-03-01

    This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.

  8. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  9. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  10. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent, present naturally in groundwater due to some minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may occur from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water by using high-tech instruments like the Atomic Absorption Spectrometer (hydride generation). Because arsenic concentrations as low as 1 ppb cannot easily be determined with the simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to arsenic concentrations of 1 ppb.
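
    The calibration step behind any such spectrophotometric method can be illustrated with a least-squares Beer-Lambert line fitted through standards of known concentration; the standard values below are invented for illustration, not the paper's data.

```python
def fit_calibration(concentrations, absorbances):
    """Least-squares line A = m*c + b through calibration standards (Beer-Lambert)."""
    n = len(concentrations)
    mean_c = sum(concentrations) / n
    mean_a = sum(absorbances) / n
    m = sum((c - mean_c) * (a - mean_a) for c, a in zip(concentrations, absorbances)) \
        / sum((c - mean_c) ** 2 for c in concentrations)
    b = mean_a - m * mean_c
    return m, b

def concentration(absorbance, m, b):
    """Invert the calibration line to read an unknown sample's concentration."""
    return (absorbance - b) / m
```

    With standards at 0, 5 and 10 ppb giving absorbances 0.01, 0.26 and 0.51, the fitted line is A = 0.05*c + 0.01, and an unknown with absorbance 0.11 reads as 2 ppb.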

  11. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of the existing requirement engineering techniques. The study also summarizes the best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. Finally, the study highlights the importance of security requirements: although they are part of the non-functional requirements, they are fundamental to secure software development.

  12. Opportunities for innovation in neutron activation analysis

    NARCIS (Netherlands)

    Bode, P.

    2011-01-01

    Neutron activation laboratories worldwide are at a turning point at which new staff has to be found for the retiring pioneers from the 1960s–1970s. A scientific career in a well-understood technique, often characterized as ‘mature’ may only be attractive to young scientists if still challenges for f

  13. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimers, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences in cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems-level.
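
    The core PoDA hypothesis, that on a risk-related pathway's SNPs cases look more like other cases than like controls, can be sketched with a toy within-class versus cross-class distance score. This is a simplified illustration of the idea, not the published PoDA statistic, and the genotype vectors are invented.

```python
def distinction_score(cases, controls):
    """Cross-class minus within-class mean distance for one pathway's SNP vectors.

    Each sample is a vector of genotypes (0/1/2 minor-allele counts) for the SNPs
    mapped to the pathway. A positive score means samples sit closer to their own
    class than to the other class, flagging the pathway as distinctive.
    """
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

    def mean_dist(group_a, group_b, skip_self):
        total, count = 0.0, 0
        for i, u in enumerate(group_a):
            for j, v in enumerate(group_b):
                if skip_self and i == j:
                    continue
                total += dist(u, v)
                count += 1
        return total / count

    within = (mean_dist(cases, cases, True) + mean_dist(controls, controls, True)) / 2
    across = mean_dist(cases, controls, False)
    return across - within
```

    Applied pathway by pathway, a score like this ranks pathways by how well their SNPs jointly separate cases from controls, without requiring each SNP to have an independent main effect.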

  14. Magnetic resonance elastography (MRE) in cancer: Technique, analysis, and applications

    Science.gov (United States)

    Pepin, Kay M.; Ehman, Richard L.; McGee, Kiaran P.

    2015-01-01

    Tissue mechanical properties are significantly altered with the development of cancer. Magnetic resonance elastography (MRE) is a noninvasive technique capable of quantifying tissue mechanical properties in vivo. This review describes the basic principles of MRE and introduces some of the many promising MRE methods that have been developed for the detection and characterization of cancer, evaluation of response to therapy, and investigation of the underlying mechanical mechanisms associated with malignancy. PMID:26592944

  15. Infrared Spectroscopy of Explosives Residues: Measurement Techniques and Spectral Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Mark C.; Bernacki, Bruce E.

    2015-03-11

    Infrared laser spectroscopy of explosives is a promising technique for standoff and non-contact detection applications. However, the interpretation of spectra obtained in typical standoff measurement configurations presents numerous challenges. Understanding the variability in observed spectra from explosives residues and particles is crucial for design and implementation of detection algorithms with high detection confidence and low false alarm probability. We discuss a series of infrared spectroscopic techniques applied toward measuring and interpreting the reflectance spectra obtained from explosives particles and residues. These techniques utilize the high spectral radiance, broad tuning range, rapid wavelength tuning, high scan reproducibility, and low noise of an external cavity quantum cascade laser (ECQCL) system developed at Pacific Northwest National Laboratory. The ECQCL source permits measurements in configurations which would be either impractical or overly time-consuming with broadband, incoherent infrared sources, and enables a combination of rapid measurement speed and high detection sensitivity. The spectroscopic methods employed include standoff hyperspectral reflectance imaging, quantitative measurements of diffuse reflectance spectra, reflection-absorption infrared spectroscopy, microscopic imaging and spectroscopy, and nano-scale imaging and spectroscopy. Measurements of explosives particles and residues reveal important factors affecting observed reflectance spectra, including measurement geometry, substrate on which the explosives are deposited, and morphological effects such as particle shape, size, orientation, and crystal structure.

  16. Analysis of Acoustic Emission Signals using Wavelet Transformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Full Text Available Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to find the occurrence of any crack growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to the above phenomena. Through various filtering/thresholding techniques, it was found that the original signals were getting filtered out along with the noise. The wavelet transformation technique is found to be more appropriate for analysing AE signals under such situations, and is used to de-noise the AE data. The de-noised signal is classified to identify a signature based on the type of phenomena. Defence Science Journal, 2008, 58(4), pp. 559-564, DOI: http://dx.doi.org/10.14429/dsj.58.1677
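
    The de-noising step can be illustrated with a one-level Haar wavelet soft-threshold filter, a minimal stand-in for the full multi-level wavelet analysis the paper applies; the signal values and threshold below are illustrative.

```python
import math

def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold de-noising (signal length must be even)."""
    s = 1 / math.sqrt(2)
    # Forward transform: pairwise averages (approximation) and differences (detail).
    approx = [(a + b) * s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[::2], signal[1::2])]
    # Soft-threshold the detail coefficients, where broadband noise concentrates.
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    # Inverse transform back to the time domain.
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out
```

    Large, genuine transients (e.g. crack-growth bursts) survive the thresholding because their detail coefficients exceed the threshold, while small noise-driven fluctuations are suppressed.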

  17. An ASIC Low Power Primer: Analysis, Techniques and Specification

    CERN Document Server

    Chadha, Rakesh

    2013-01-01

    This book provides an invaluable primer on the techniques utilized in the design of low power digital semiconductor devices. Readers will benefit from the hands-on approach, which starts from the ground up, explaining with basic examples what power is, how it is measured and how it impacts the design process of application-specific integrated circuits (ASICs). The authors use both the Unified Power Format (UPF) and Common Power Format (CPF) to describe in detail the power intent for an ASIC and then guide readers through a variety of architectural and implementation techniques that will help meet the power intent. From analyzing system power consumption, to techniques that can be employed in a low power design, to a detailed description of two alternate standards for capturing the power directives at various phases of the design, this book is filled with information that will give ASIC designers a competitive edge in low-power design. Starts from the ground up and explains what power is, how it is measur...

  18. A COMPARISON OF SOME STATISTICAL TECHNIQUES FOR ROAD ACCIDENT ANALYSIS

    NARCIS (Netherlands)

    OPPE, S INST ROAD SAFETY RES, SWOV

    1992-01-01

    At the TRRL/SWOV Workshop on Accident Analysis Methodology, held in Amsterdam in 1988, the need to establish a methodology for the analysis of road accidents was firmly stated by all participants. Data from different countries cannot be compared because there is no agreement on research methodology.

  19. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

    Full Text Available Sentiment analysis is concerned with extracting emotions and opinions from text; it is also referred to as opinion mining. Sentiment analysis identifies and justifies the sentiment of a person with respect to a given source of content. Social media contain a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this large volume of generated data is very useful for expressing the opinion of the mass. Twitter sentiment analysis is tricky compared to broader sentiment analysis because of slang words, misspellings, and repeated characters, and because each tweet is limited to 140 characters, so it is very important to identify the correct sentiment of each word. In this project we propose a highly accurate model of sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of a feature vector and classifiers such as support vector machines and Naïve Bayes, these tweets are classified as positive, negative, or neutral to give the sentiment of each tweet.
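A minimal sketch of one of the classifiers this record mentions (multinomial Naïve Bayes with Laplace smoothing), trained on a tiny invented corpus; the paper's actual feature vectors and training data are not reproduced here.

```python
import math
from collections import Counter

class NaiveBayes:
    """Multinomial Naive Bayes text classifier with add-one smoothing."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.priors = {c: math.log(labels.count(c) / len(labels)) for c in self.classes}
        self.counts = {c: Counter() for c in self.classes}
        self.vocab = set()
        for doc, c in zip(docs, labels):
            words = doc.lower().split()
            self.counts[c].update(words)
            self.vocab.update(words)

    def predict(self, doc):
        scores = {}
        for c in self.classes:
            # Laplace smoothing: +1 per word, vocabulary size added to denominator.
            total = sum(self.counts[c].values()) + len(self.vocab)
            s = self.priors[c]
            for w in doc.lower().split():
                s += math.log((self.counts[c][w] + 1) / total)
            scores[c] = s
        return max(scores, key=scores.get)

train = ["loved the movie great acting", "great plot amazing film",
         "terrible movie boring plot", "awful acting waste of time"]
y = ["positive", "positive", "negative", "negative"]
clf = NaiveBayes()
clf.fit(train, y)
print(clf.predict("great movie amazing acting"))  # prints: positive
```

In practice tweets would first be normalized (slang expansion, repeated-character collapsing) before being fed to the classifier, as the abstract notes.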

  20. Facilitating the analysis of immunological data with visual analytic techniques.

    Science.gov (United States)

    Shih, David C; Ho, Kevin C; Melnick, Kyle M; Rensink, Ronald A; Kollmann, Tobias R; Fortuno, Edgardo S

    2011-01-02

    Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and the flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores the analysis power to the hands of the analyst by allowing the analyst to engage in a real-time data exploration process. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.

  1. A Survey of Techniques for Security Architecture Analysis

    Science.gov (United States)

    2003-05-01

    Effects Analysis FPG Failure Propagation Graph FTA Fault Tree Analysis HAZOP Hazard and Operability studies IATF Information Assurance Technical...represent logical places, within an information system, where people can perform their work by means of software acting on their behalf. People who...Describes the resources used to support the DIE (Including, for example, hardware, software , communication networks, applications and qualified staff

  2. Novel thermal imaging analysis technique for detecting inflammation in thyroid eye disease.

    Science.gov (United States)

    Di Maria, Costanzo; Allen, John; Dickinson, Jane; Neoh, Christopher; Perros, Petros

    2014-12-01

    The disease phase in thyroid eye disease (TED) is commonly assessed by clinical investigation of cardinal signs of inflammation and using the clinical activity score (CAS). Although CAS is the current gold standard, the clinical assessment would benefit if a more objective tool were available. The aim of this work was to explore the clinical value of a novel thermal imaging analysis technique to objectively quantify the thermal characteristics of the eye and peri-orbital region and determine the disease phase in TED. This was a cross-sectional study comparing consecutive patients with active TED (CAS ≥ 3/7) attending a tertiary center with a group of consecutive patients with inactive TED (CAS < 3/7). Thermal images were acquired from 30 TED patients, 17 with active disease and 13 with inactive disease. Patients underwent standard ophthalmological clinical assessments and thermal imaging. Five novel thermal eye parameters (TEP) were developed to quantify the thermal characteristics of the eyes in terms of the highest level of inflammation (TEP1), overall level of inflammation (TEP2), right-left asymmetry in the level of inflammation (TEP3), maximum temperature variability across the eyes (TEP4), and right-left asymmetry in the temperature variability (TEP5). All five TEP were increased in active TED. TEP1 gave the largest accuracy (77%) at separating the two groups, with 65% sensitivity and 92% specificity. A statistical model combining all five parameters increased the overall accuracy, compared to using only one parameter, to 93% (94% sensitivity and 92% specificity). All five of the parameters were also found to be increased in patients with chemosis compared to those without. The potential diagnostic value of this novel thermal imaging analysis technique has been demonstrated. Further investigation on a larger group of patients is necessary to confirm these results.

  3. Combination of electrochemical, spectrometric and other analytical techniques for high throughput screening of pharmaceutically active compounds.

    Science.gov (United States)

    Suzen, Sibel; Ozkan, Sibel A

    2010-08-01

    Recently, the use of electrochemistry and its combination with spectroscopic and other analytical techniques has become one of the important approaches in drug discovery and research, as well as in quality control, drug stability, determination of physiological activity, and measurement of neurotransmitters. Many fundamental physiological processes depend on oxidation-reduction reactions in the body; therefore, it may be possible to find connections between electrochemical and biochemical reactions concerning electron-transfer pathways. The application of electrochemical techniques to redox-active drug development and studies is one of the recent interests in drug discovery. This review covers the latest developments related to the use of electrochemical techniques in drug research, with a view to evaluating possible combinations of spectrometric methods with electrochemical techniques.

  4. The composition-explicit distillation curve technique: Relating chemical analysis and physical properties of complex fluids.

    Science.gov (United States)

    Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L

    2010-04-16

    The analysis of complex fluids such as crude oils, fuels, vegetable oils and mixed waste streams poses significant challenges arising primarily from the multiplicity of components, the different properties of the components (polarity, polarizability, etc.) and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses, and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation of state development for the complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. By far, the most widely used analytical technique we have used with the ADC is gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oil streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids.

  5. A methodological approach for direct quantification of the activated sludge floc size distribution by using different techniques.

    Science.gov (United States)

    Govoreanu, R; Saveyn, H; Van der Meeren, P; Nopens, I; Vanrolleghem, P A

    2009-01-01

    The activated sludge floc size distribution (FSD) is investigated by using different measurement techniques in order to gain insight in FSD assessment as well as to detect the strengths and limitations of each technique. A second objective was to determine the experimental conditions that allow a representative and accurate measurement of activated sludge floc size distributions. Laser diffraction, Time Of Transition (TOT) and Dynamic Image Analysis (DIA) devices were connected in series. The sample dilution liquid, the dilution factor and hydraulic flow conditions avoiding flocculation proved to be important. All methods had certain advantages and limitations. The MastersizerS has a broader dynamic size range and provides accurate results at high concentrations. However, it suffers from an imprecise evaluation of small size flocs and is susceptible to particle shape effects. TOT suffers less from size overestimation for non-spherical particles. However, care should be taken with the settings of the transparency check. Being primarily a counting technique, DIA suffers from a limited size detection range but is an excellent technique for process visualization. All evaluated techniques turned out to be reliable methods to quantify the floc size distribution. Selection of a certain method depends on the purpose of the measurement.

  6. Effect of Preparation Techniques of Y-Mo/HZSM-5 on Its Activity in Methane Aromatization

    Institute of Scientific and Technical Information of China (English)

    Qiying Wang; Weiming Lin

    2004-01-01

    The production of benzene directly from methane aromatization under oxygen-free condition is currently a new focus in natural gas utilization. The influence of preparation techniques of the catalysts on their catalytic activities is studied in this paper. The influencing factors include the impregnating method, the calcination temperature, the promoter content and the acidity of the zeolite support. Optimum preparation techniques for the catalysts are obtained through this work.

  7. Thermal imaging for detection of SM45C subsurface defects using active infrared thermography techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yoon Jae; Ranjit, Shrestha; Kim, Won Tae [Kongju National University, Cheonan (Korea, Republic of)

    2015-06-15

    Active thermography techniques can inspect a broad area simultaneously. By evaluating the phase difference between a defective area and the healthy area, the technique indicates the location and size of the defect qualitatively. Previously, defect detection methods had been developed using a variety of materials and test specimens. In this study, the proposed lock-in technique is verified with artificial specimens that have subsurface defects of different sizes and depths. Finally, the defect detection capability was evaluated by comparing the phase image and the amplitude image according to the size and depth of the defects.
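Lock-in thermography extracts the phase and amplitude images by correlating each pixel's temperature signal with the modulation reference. A common minimal scheme is the four-point (four-bucket) method, sketched below under the assumption of frames sampled at quarter-period intervals; the record does not state which demodulation scheme the equipment actually used.

```python
import math

def lockin_phase_amplitude(frames):
    """Four-point lock-in demodulation.

    frames: four 2-D lists of pixel values sampled at 0, 90, 180 and 270
    degrees of one modulation period. For a pixel signal
    off + A*sin(w*t + phi), S1-S3 = 2A*sin(phi) and S2-S4 = 2A*cos(phi),
    so the DC offset cancels and phase/amplitude follow directly.
    """
    s1, s2, s3, s4 = frames
    rows, cols = len(s1), len(s1[0])
    phase = [[math.atan2(s1[r][c] - s3[r][c], s2[r][c] - s4[r][c])
              for c in range(cols)] for r in range(rows)]
    amplitude = [[math.hypot(s1[r][c] - s3[r][c], s2[r][c] - s4[r][c])
                  for c in range(cols)] for r in range(rows)]
    return phase, amplitude
```

The phase image is favored for defect detection because, as the offset cancellation shows, it is insensitive to uneven surface emissivity and illumination.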

  8. A Low Latency Electrocardiographic QRS Activity Recovery Technique for Use on the Upper Left Arm

    Directory of Open Access Journals (Sweden)

    William D. Lynn

    2014-07-01

    Full Text Available Empirical mode decomposition is used as a low latency method of recovering the cardiac ventricular activity QRS biopotential signals recorded from the upper arm. The recovery technique is tested and compared with the industry accepted technique of signal averaging using a database of “normal” rhythm traces from bipolar ECG leads along the left arm, recorded from patient volunteers at a cardiology day procedure clinic. The same partial recomposition technique is applied to recordings taken using an innovative dry electrode technology supplied by Plessey Semiconductors. In each case, signal to noise ratio (SNR is used as a metric for comparison.

  9. Activation analysis of meteorites. 3

    Energy Technology Data Exchange (ETDEWEB)

    Nagai, H.; Honda, M.; Sato, H. [Nihon Univ., College of Humanities and Sciences, Tokyo (Japan); Ebihara, M.; Oura, Y.; Setoguchi, M. [Tokyo Metropolitan Univ., Faculty of Science, Tokyo (Japan)

    2001-07-01

    A long-lived cosmogenic nuclide, 53Mn, in extra-terrestrial materials has been determined in the DR-1 hole of the JRR-3M reactor, applying the well-thermalized neutron flux. The neutron flux intensities vary with depth, whereas the fast/thermal ratios do not vary much. By this method, 53Mn contents in iron meteorites and metal phases in general could be routinely determined in many samples. The chemical separation method has been modified, and a convenient short-circuit method has been proposed to shorten the process: the activities of 54Mn are counted just after the irradiation, without further purification of manganese. (author)

  10. Automated image analysis techniques for cardiovascular magnetic resonance imaging

    NARCIS (Netherlands)

    Geest, Robertus Jacobus van der

    2011-01-01

    The introductory chapter provides an overview of various aspects related to quantitative analysis of cardiovascular MR (CMR) imaging studies. Subsequently, the thesis describes several automated methods for quantitative assessment of left ventricular function from CMR imaging studies. Several novel

  11. Cross-impact analysis experimentation using two techniques to ...

    African Journals Online (AJOL)

    coherency. This paper describes cross-impact analysis experimentation in which a Monte ..... [4] is used to accomplish this computational task. Using this method ..... 202–222 in Baldwin MM (Ed), Portraits of complexity: Applications of systems.

  12. Antioxidant activity of Galium mollugo L. extracts obtained by different recovery techniques

    Directory of Open Access Journals (Sweden)

    Milić Petar S.

    2013-01-01

    Full Text Available The yield of extractive substances, the antioxidant activity, and the total phenolic and total flavonoid contents of aqueous-ethanolic extracts obtained from aerial parts of Galium mollugo L. by different extraction techniques (maceration, reflux and ultrasonic extraction) are reported. The antioxidant activity of the extracts was tested by measuring their ability to scavenge the stable DPPH free radical, while the total phenolic and total flavonoid contents were determined according to the Folin-Ciocalteu procedure and a colorimetric method, respectively. Duncan's multiple range test was used to evaluate whether there were significant differences among the yields of extractive substances, total phenolics, total flavonoids and EC50 values for the extracts obtained by the different extraction techniques. The extracts obtained by reflux extraction contained higher amounts of extractive substances, as well as phenolic and flavonoid compounds, and showed better antioxidant activity than those obtained by the two other recovery techniques.

  13. Analysis of the changes in keratoplasty indications and preferred techniques.

    Directory of Open Access Journals (Sweden)

    Stefan J Lang

    Full Text Available Recently, novel techniques introduced to the field of corneal surgery, e.g. Descemet membrane endothelial keratoplasty (DMEK) and corneal crosslinking, have extended the therapeutic options. Additionally, contact lens fitting has developed new alternatives. We herein investigated whether these techniques have affected the volume and spectrum of indications for keratoplasty in a center more specialized in treating Fuchs' dystrophy (center 1) and a second center more specialized in treating keratoconus (center 2). We retrospectively reviewed the waiting lists for indication, transplantation technique, and the patients' travel distances to the hospital at both centers. We reviewed a total of 3778 procedures. Fuchs' dystrophy increased at center 1 from 17% (42) to 44% (150) and from 13% (27) to 23% (62) at center 2. In center 1, DMEK increased from zero percent in 2010 to 51% in 2013. In center 2, DMEK was not performed until 2013. The percentage of patients with keratoconus slightly decreased from 15% (36) in 2009 to 12% (40) in 2013 in center 1. The respective percentages in center 2 were 28% (57) and 19% (51). In both centers, the patients' travel distances increased. The results from center 1 suggest that DMEK might increase the total number of keratoplasties. The increase in travel distance suggests that this cannot be fully attributed to recruiting less advanced patients from the hospital's proximity; the increase is rather due to more referrals from other regions. The decrease of keratoconus patients in both centers is surprising and may be attributed to optimized contact lens fitting or even to the effect of the corneal crosslinking procedure.

  14. Improved Tandem Measurement Techniques for Aerosol Particle Analysis

    Science.gov (United States)

    Rawat, Vivek Kumar

    Non-spherical, chemically inhomogeneous (complex) nanoparticles are encountered in a number of natural and engineered environments, including combustion systems (which produces highly non-spherical aggregates), reactors used in gas-phase materials synthesis of doped or multicomponent materials, and in ambient air. These nanoparticles are often highly diverse in size, composition and shape, and hence require determination of property distribution functions for accurate characterization. This thesis focuses on development of tandem mobility-mass measurement techniques coupled with appropriate data inversion routines to facilitate measurement of two dimensional size-mass distribution functions while correcting for the non-idealities of the instruments. Chapter 1 provides the detailed background and motivation for the studies performed in this thesis. In chapter 2, the development of an inversion routine is described which is employed to determine two dimensional size-mass distribution functions from Differential Mobility Analyzer-Aerosol Particle Mass analyzer tandem measurements. Chapter 3 demonstrates the application of the two dimensional distribution function to compute cumulative mass distribution function and also evaluates the validity of this technique by comparing the calculated total mass concentrations to measured values for a variety of aerosols. In Chapter 4, this tandem measurement technique with the inversion routine is employed to analyze colloidal suspensions. Chapter 5 focuses on application of a transverse modulation ion mobility spectrometer coupled with a mass spectrometer to study the effect of vapor dopants on the mobility shifts of sub 2 nm peptide ion clusters. These mobility shifts are then compared to models based on vapor uptake theories. Finally, in Chapter 6, a conclusion of all the studies performed in this thesis is provided and future avenues of research are discussed.

  15. Cooking techniques improve the levels of bioactive compounds and antioxidant activity in kale and red cabbage.

    Science.gov (United States)

    Murador, Daniella Carisa; Mercadante, Adriana Zerlotti; de Rosso, Veridiana Vera

    2016-04-01

    The aim of this study is to investigate the effects of different home cooking techniques (boiling, steaming, and stir-frying) on kale and red cabbage: on the levels of bioactive compounds (carotenoids, anthocyanins and phenolic compounds) determined by high-performance liquid chromatography coupled with photodiode array and mass spectrometry detectors (HPLC-DAD-MS(n)), and on the antioxidant activity evaluated by ABTS, ORAC and cellular antioxidant activity (CAA) assays. The steaming technique resulted in a significant increase in phenolic content in kale (86.1%). In kale, steaming also resulted in significant increases in antioxidant activity levels in all of the evaluation methods. In the red cabbage, boiling resulted in a significant increase in antioxidant activity using the ABTS assay but a significant decrease using the ORAC assay. According to the CAA assay, the stir-fried sample displayed the highest levels of antioxidant activity.

  16. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made in recent years in the field of electrophysiological data analysis. Most of the work has been done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) ...

  17. Automated Techniques for Rapid Analysis of Momentum Exchange Devices

    Science.gov (United States)

    2013-12-01

    Contiguousness At this point, it is necessary to introduce the concept of contiguousness. In this thesis, a state space analysis representation is... concept of contiguousness was established to ensure that the results of the analysis would allow for the CMGs to reach every state in the defined...forces at the attachment points of the RWs and CMGs throughout a spacecraft maneuver. Current pedagogy on this topic focuses on the transfer of

  18. Combined Technique Analysis of Punic Make-up Materials

    Energy Technology Data Exchange (ETDEWEB)

    Huq,A.; Stephens, P.; Ayed, N.; Binous, H.; Burgio, L.; Clark, R.; Pantos, E.

    2006-01-01

    Ten archaeological Punic make-up samples from Tunisia dating from the 4th to the 1st centuries BC were analyzed by several techniques including Raman microscopy and synchrotron X-ray diffraction in order to determine their compositions. Eight samples were red and found to contain either quartz and cinnabar or quartz and haematite. The remaining two samples were pink, the main diffracting phase in them being quartz. Examination of these two samples by optical microscopy and by illumination under a UV lamp suggest that the pink dye is madder. These findings reveal the identities of the materials used by Carthaginians for cosmetic and/or ritual make-up purposes.

  19. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  20. Diagnostic Application of Absolute Neutron Activation Analysis in Hematology

    Energy Technology Data Exchange (ETDEWEB)

    Zamboni, C.B.; Oliveira, L.C.; Dalaqua, L. Jr.

    2004-10-03

    The Absolute Neutron Activation Analysis (ANAA) technique was used to determine the concentrations of Cl and Na in the blood of a healthy group (male and female blood donors) selected from blood banks in Sao Paulo city, to provide information which can help in the diagnosis of patients. This study permitted a discussion of the advantages and limitations of using this nuclear methodology in hematological examinations.
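Absolute (standard-free) NAA computes element content directly from the activation equation rather than by comparison with irradiated standards. A minimal sketch of that inversion follows; all numerical parameter values in the test are illustrative assumptions, not values from this record.

```python
import math

N_A = 6.02214076e23  # Avogadro's number, atoms/mol

def element_mass_grams(count_rate, eff, gamma_intensity, sigma_cm2, flux,
                       half_life_s, t_irr_s, t_decay_s, molar_mass, abundance):
    """Invert the activation equation A = n * sigma * flux * S * D for the
    number of target atoms n, where S = 1 - exp(-lam*t_irr) is the saturation
    factor and D = exp(-lam*t_decay) accounts for decay before counting."""
    lam = math.log(2) / half_life_s
    activity = count_rate / (eff * gamma_intensity)   # disintegrations/s
    S = 1 - math.exp(-lam * t_irr_s)
    D = math.exp(-lam * t_decay_s)
    n_atoms = activity / (sigma_cm2 * flux * S * D)
    return n_atoms * molar_mass / (N_A * abundance)
```

In absolute NAA the detector efficiency, neutron flux and cross-section must all be known accurately, which is the main source of the limitations the abstract alludes to.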

  1. Ratiometric analysis of fura red by flow cytometry: a technique for monitoring intracellular calcium flux in primary cell subsets.

    Directory of Open Access Journals (Sweden)

    Emily R Wendt

    Full Text Available Calcium flux is a rapid and sensitive measure of cell activation whose utility could be enhanced with better techniques for data extraction. We describe a technique to monitor calcium flux by flow cytometry, measuring the Fura Red calcium dye by ratiometric analysis. This technique has several advantages: 1) using a single calcium dye provides an additional channel for surface marker characterization; 2) it allows robust detection of calcium flux by minority cell populations within a heterogeneous population of primary T cells and monocytes; 3) it can measure total calcium flux and, additionally, the proportion of responding cells; 4) it can be applied to studying the effects of drug treatment, simultaneously stimulating and monitoring untreated and drug-treated cells. Using chemokine receptor activation as an example, we highlight the utility of this assay, demonstrating that only cells expressing a specific chemokine receptor are activated by the cognate chemokine ligand. Furthermore, we describe a technique for simultaneously stimulating and monitoring calcium flux in vehicle- and drug-treated cells, demonstrating the effects of the Gαi inhibitor pertussis toxin (PTX) on chemokine-stimulated calcium flux. The described real-time calcium flux assay provides a robust platform for characterizing cell activation within primary cells, and offers a more accurate technique for studying the effect of drug treatment on receptor activation in a heterogeneous population of primary cells.
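A minimal sketch of the ratiometric computation, assuming the two detection channels are the violet-excited and blue-excited Fura Red emissions (the actual laser and filter configuration depends on the instrument, and the threshold for calling a cell a responder is an illustrative assumption):

```python
def flux_ratio(ch_violet, ch_blue, baseline_n=10):
    """Ratiometric trace for one cell: with Ca2+ binding the violet-excited
    Fura Red emission rises while the blue-excited emission falls, so the
    violet/blue ratio reports calcium independent of dye loading.
    The trace is normalized to its pre-stimulation baseline."""
    ratios = [v / b for v, b in zip(ch_violet, ch_blue)]
    baseline = sum(ratios[:baseline_n]) / baseline_n
    return [r / baseline for r in ratios]

def fraction_responding(traces, threshold=1.5):
    """Cells whose normalized ratio ever exceeds threshold count as responders."""
    return sum(max(t) > threshold for t in traces) / len(traces)
```

Using a ratio of two channels from a single dye is what frees up the extra fluorescence channel for surface-marker staining, as the abstract's first advantage notes.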

  2. Improved analysis techniques for cylindrical and spherical double probes

    Energy Technology Data Exchange (ETDEWEB)

    Beal, Brian; Brown, Daniel; Bromaghim, Daron [Air Force Research Laboratory, 1 Ara Rd., Edwards Air Force Base, California 93524 (United States); Johnson, Lee [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Dr., Pasadena, California 91109 (United States); Blakely, Joseph [ERC Inc., 1 Ara Rd., Edwards Air Force Base, California 93524 (United States)

    2012-07-15

    A versatile double Langmuir probe technique has been developed by incorporating analytical fits to Laframboise's numerical results for ion current collection by biased electrodes of various sizes relative to the local electron Debye length. Application of these fits to the double probe circuit has produced a set of coupled equations that express the potential of each electrode relative to the plasma potential as well as the resulting probe current as a function of applied probe voltage. These equations can be readily solved via standard numerical techniques in order to determine electron temperature and plasma density from probe current and voltage measurements. Because this method self-consistently accounts for the effects of sheath expansion, it can be readily applied to plasmas with a wide range of densities and low ion temperature (Ti/Te ≪ 1) without requiring probe dimensions to be asymptotically large or small with respect to the electron Debye length. The presented approach has been successfully applied to experimental measurements obtained in the plume of a low-power Hall thruster, which produced a quasineutral, flowing xenon plasma during operation at 200 W on xenon. The measured plasma densities and electron temperatures were in the range of 1×10^12 to 1×10^17 m^-3 and 0.5-5.0 eV, respectively. The estimated measurement uncertainty is +6%/-34% in density and ±30% in electron temperature.
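As an illustration, the classic symmetric double-probe characteristic I = Isat·tanh(V/2Te) can be fitted with elementary numerical techniques. Note that this simplified form neglects the sheath-expansion corrections that are the record's contribution, and the grid-search fit below is a sketch, not the authors' method; density would then follow from Isat given the probe area and ion mass.

```python
import math

def fit_double_probe(volts, currents, te_range=(0.1, 10.0), steps=2000):
    """Least-squares fit of I = Isat * tanh(V / (2*Te)), with V in volts and
    Te in eV. For each trial Te the optimal Isat is linear in the data:
    Isat = sum(I*t) / sum(t*t), where t = tanh(V / (2*Te))."""
    best = None
    for i in range(steps):
        te = te_range[0] + (te_range[1] - te_range[0]) * i / (steps - 1)
        t = [math.tanh(v / (2.0 * te)) for v in volts]
        stt = sum(x * x for x in t)
        if stt == 0.0:
            continue
        isat = sum(c * x for c, x in zip(currents, t)) / stt
        err = sum((c - isat * x) ** 2 for c, x in zip(currents, t))
        if best is None or err < best[0]:
            best = (err, te, isat)
    _, te, isat = best
    return te, isat
```

The coarse grid over Te exploits the fact that, for fixed Te, the remaining parameter enters linearly, so no general nonlinear solver is needed for this sketch.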

  3. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume high amounts of electrical power; thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proven to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  4. Comparative Analysis of Data Mining Techniques for Malaysian Rainfall Prediction

    Directory of Open Access Journals (Sweden)

    Suhaila Zainudin

    2016-12-01

    Full Text Available Climate change prediction analyses the behaviour of weather over a specific time. Rainfall forecasting is a climate prediction task where specific features such as humidity and wind are used to predict rainfall at specific locations. Rainfall prediction can be achieved using the classification task in data mining. Different techniques lead to different performances depending on the rainfall data representation, including representations for long-term (monthly) patterns and short-term (daily) patterns. Selecting an appropriate technique for a specific duration of rainfall is a challenging task. This study analyses multiple classifiers, namely Naïve Bayes, Support Vector Machine, Decision Tree, Neural Network and Random Forest, for rainfall prediction using Malaysian data. The dataset has been collected from multiple stations in Selangor, Malaysia. Several pre-processing tasks have been applied in order to resolve missing values and eliminate noise. The experimental results show that with small training data (10% of 1581 instances) Random Forest correctly classified 1043 instances. This is the strength of an ensemble of trees in Random Forest, where a group of classifiers can jointly beat a single classifier.
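For illustration only, a bagged ensemble of decision stumps in plain Python captures the voting idea behind Random Forest. The real algorithm grows full trees with random feature subsets at each split, and the study used standard implementations on weather features such as humidity and wind, which are mimicked here with synthetic data.

```python
import random

def stump_fit(X, y, feat_idx):
    # Best threshold and direction on one feature, minimizing 0/1 error.
    best = None
    for t in sorted({x[feat_idx] for x in X}):
        for sign in (1, -1):
            pred = [1 if sign * (x[feat_idx] - t) > 0 else 0 for x in X]
            err = sum(p != yi for p, yi in zip(pred, y))
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return feat_idx, t, sign

def forest_fit(X, y, n_trees=25, seed=7):
    # Each "tree" is a stump trained on a bootstrap sample of the data,
    # using one randomly chosen feature (a crude stand-in for feature bagging).
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        feat = rng.randrange(len(X[0]))
        trees.append(stump_fit([X[i] for i in idx], [y[i] for i in idx], feat))
    return trees

def forest_predict(trees, x):
    # Majority vote over the ensemble.
    votes = sum(1 if sign * (x[f] - t) > 0 else 0 for f, t, sign in trees)
    return 1 if votes * 2 >= len(trees) else 0
```

Even with weak individual learners, the majority vote corrects the errors of the noisy members, which is the "group of classifiers can jointly beat a single classifier" effect the abstract describes.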

  5. Thermal Response Analysis of Phospholipid Bilayers Using Ellipsometric Techniques.

    Science.gov (United States)

    González-Henríquez, Carmen M; Villegas-Opazo, Vanessa A; Sagredo-Oyarce, Dallits H; Sarabia-Vallejos, Mauricio A; Terraza, Claudio A

    2017-08-18

    Biomimetic planar artificial membranes have been widely studied due to their multiple applications in several research fields. Their humectation and thermal response are crucial for reaching stability; these characteristics are related to the molecular organization inside the bilayer, which is affected by the aliphatic chain length, saturations, and molecule polarity, among others. Bilayer stability becomes a fundamental factor when technological devices, such as biosensors, are developed based on those systems. Thermal studies were performed for different types of phosphatidylcholine (PC) molecules: two pure PC bilayers and four binary PC mixtures. These analyses were carried out through the detection of slight changes in their optical and structural parameters via ellipsometry and Surface Plasmon Resonance (SPR) techniques. Phospholipid bilayers were prepared by the Langmuir-Blodgett technique and deposited over a hydrophilic silicon wafer. Their molecular inclination degree, mobility, and the stability of the different phases were detected and analyzed through bilayer thickness changes and their optical phase-amplitude response. Results show that certain binary lipid mixtures, with differences in their aliphatic chain lengths, present a co-existence of two thermal responses due to non-ideal mixing.

  6. Microscopy Techniques for Analysis of Nickel Metal Hydride Batteries Constituents.

    Science.gov (United States)

    Carpenter, Graham J C; Wronski, Zbigniew

    2015-12-01

    With the need for improvements in the performance of rechargeable batteries has come the necessity to better characterize cell electrodes and their component materials. Electron microscopy has been shown to reveal many important features of microstructure that are becoming increasingly important for understanding the behavior of the components during the many charge/discharge cycles required in modern applications. The aim of this paper is to present an overview of how the full suite of techniques available using transmission electron microscopy (TEM) and scanning transmission electron microscopy was applied to the case of materials for the positive electrode in nickel metal hydride rechargeable battery electrodes. Embedding and sectioning of battery-grade powders with an ultramicrotome was used to produce specimens that could be readily characterized by TEM. Complete electrodes were embedded after drying, and also after dehydration from the original wet state, for examination by optical microscopy and using focused ion beam techniques. Results of these studies are summarized to illustrate the significance of the microstructural information obtained.

  7. Design and Performance Analysis of Various Adders and Multipliers Using GDI Technique

    Directory of Open Access Journals (Sweden)

    Simran kaur

    2015-10-01

    Full Text Available With the active development of portable electronic devices, the need for low power dissipation, high speed and compact implementation has given rise to several research directions. There are several design techniques used for circuit configuration in VLSI systems, but very few design techniques give the required extensibility. This paper describes the implementation of various adders and multipliers. The design approach proposed in the article is based on the GDI (Gate Diffusion Input) technique. The paper also includes a comparative analysis of this low-power method against the CMOS design style with respect to power consumption, area complexity and delay. In this paper, new GDI-based cell designs are proposed and are found to be efficient in terms of power consumption and area in comparison with existing CMOS-based cells. Power and delay have been calculated using the Cadence Virtuoso tool at 45 nm CMOS technology. The results obtained show better power and delay performance of the proposed designs at a 1.3 V supply voltage.

  8. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...

  9. Modern Theory of Gratings Resonant Scattering: Analysis Techniques and Phenomena

    CERN Document Server

    Sirenko, Yuriy K

    2010-01-01

    Diffraction gratings are one of the most popular objects of analysis in electromagnetic theory. The requirements of applied optics and microwave engineering lead to many new problems and challenges for the theory of diffraction gratings, which force us to search for new methods and tools for their resolution. In Modern Theory of Gratings, the authors present results of the electromagnetic theory of diffraction gratings that will constitute the base of further development of this theory, which meet the challenges provided by modern requirements of fundamental and applied science. This volume covers: spectral theory of gratings (Chapter 1) giving reliable grounds for physical analysis of space-frequency and space-time transformations of the electromagnetic field in open periodic resonators and waveguides; authentic analytic regularization procedures (Chapter 2) that, in contradistinction to the traditional frequency-domain approaches, fit perfectly for the analysis of resonant wave scattering processes; paramet...

  10. Techniques of EMG signal analysis: detection, processing, classification and applications

    Science.gov (United States)

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point up some of the hardware implementations using EMG focusing on applications related to prosthetic hand control, grasp recognition, and human computer interaction. A comparison study is also given to show performance of various EMG signal analysis methods. This paper provides researchers a good understanding of EMG signal and its analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694

  11. Analysis of Self-Excited Combustion Instabilities Using Decomposition Techniques

    Science.gov (United States)

    2016-07-05

    For POD, the data are decomposed via the singular value decomposition (SVD), while for DMD the data are reduced using the Arnoldi algorithm. POD decomposes data based on optimality to obtain a set of best representations. The DMD analysis is performed with the same domains that were used for the POD analysis, yielding frequency spectra of the pressure and heat-release fluctuations. For N rows of temporal data and m columns of spatial data, the POD matrix A will be of size N × m, and its SVD is A = UΣVᵀ, where U is an orthonormal matrix of left singular vectors.
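The decomposition A = UΣVᵀ of a snapshot matrix can be illustrated with a minimal method-of-snapshots computation in plain Python. The spatial mode and time coefficients below are invented rank-one data, and power iteration on AᵀA stands in for a full SVD library call:

```python
import math

# Invented spatial mode phi and time coefficients c give a rank-one
# snapshot matrix A[i][j] = c[j] * phi[i]: one coherent structure
# oscillating in time.
phi = [1.0, 2.0, -1.0, 0.5]
c = [0.5, -1.0, 2.0]
N, m = len(phi), len(c)
A = [[cj * pi for cj in c] for pi in phi]  # N x m

# Method of snapshots: leading eigenvector of the m x m matrix A^T A,
# found here by power iteration.
AtA = [[sum(A[i][p] * A[i][q] for i in range(N)) for q in range(m)]
       for p in range(m)]
v = [1.0] * m
for _ in range(100):
    w = [sum(AtA[p][q] * v[q] for q in range(m)) for p in range(m)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

# Leading singular value and left singular vector (the first POD mode).
Av = [sum(A[i][q] * v[q] for q in range(m)) for i in range(N)]
sigma = math.sqrt(sum(x * x for x in Av))
u = [x / sigma for x in Av]

# For rank-one data, the rank-1 reconstruction u * sigma * v^T is exact.
err = max(abs(u[i] * sigma * v[q] - A[i][q])
          for i in range(N) for q in range(m))
```

For real multi-mode data the same machinery yields the full mode hierarchy, with Σ's diagonal ranking each mode's energy content.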

  12. Finite Element Modeling Techniques for Analysis of VIIP

    Science.gov (United States)

    Feola, Andrew J.; Raykin, J.; Gleason, R.; Mulugeta, Lealem; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.; Ethier, C. Ross

    2015-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build a FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP.

  13. Evaluation of Wellness Detection Techniques using Complex Activities Association for Smart Home Ambient

    Directory of Open Access Journals (Sweden)

    Farhan Sabir Ujager

    2016-08-01

    Full Text Available Wireless Sensor Network based smart homes have the potential to meet the growing challenges of independent living of elderly people. However, wellness detection of elderly people in smart homes is still a challenging research domain. Many researchers have proposed several techniques; however, the majority of these techniques do not provide a comprehensive solution, because complex activities cannot be determined easily and comprehensive wellness is difficult to diagnose. In this study's critical review, it has been observed that strong associations lie among the vital wellness determination parameters. In this paper, an association-rules-based model is proposed for simple and complex (overlapped) activity recognition and a comprehensive wellness detection mechanism, after analyzing existing techniques. It considers vital wellness detection parameters (temporal association of sub-activity location and sub-activity, time gaps between two adjacent activities, and temporal association of inter- and intra-activities). Activity recognition and wellness detection are performed on the basis of extracted temporal association rules and an expert knowledgebase. A learning component is an important module of the proposed model, to accommodate changing trends in the frequent-pattern behavior of an elderly person and to recommend that a caregiver/expert adjust the expert knowledgebase according to the found abnormalities.
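The extracted association rules that such a model relies on reduce, at their core, to support and confidence computations over an observation log. A minimal Python sketch, with an invented location/activity log rather than data from the study:

```python
# Support and confidence of one candidate association rule over a toy
# activity log (each row is one observed set of co-occurring items).
log = [
    {"kitchen", "cooking"},
    {"kitchen", "cooking"},
    {"kitchen", "eating"},
    {"bedroom", "sleeping"},
]

def support(itemset):
    # Fraction of observations containing every item in the itemset.
    return sum(itemset <= row for row in log) / len(log)

# Rule "kitchen -> cooking": confidence = support(both) / support(antecedent)
confidence = support({"kitchen", "cooking"}) / support({"kitchen"})
```

Temporal rules add time-gap constraints on top of this, but frequent patterns are still mined by thresholding exactly these two statistics.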

  14. Analysis of ultrasonic techniques for monitoring milk coagulation during cheesemaking

    Science.gov (United States)

    Budelli, E.; Pérez, N.; Lema, P.; Negreira, C.

    2012-12-01

    Experimental determination of time of flight and attenuation has been proposed in the literature as an alternative for monitoring the evolution of milk coagulation during cheese manufacturing. However, only laboratory-scale procedures have been described. In this work, the use of ultrasonic time of flight and attenuation to determine cutting time, and the feasibility of applying them at industrial scale, were analyzed. Limitations on implementing these techniques at industrial scale are shown experimentally. The main limitation of the use of time of flight is its strong dependence on temperature. Attenuation monitoring is affected by a thin layer of milk skin covering the transducer, which modifies the signal in a non-repetitive way. The results of this work can be used to develop alternative ultrasonic systems suitable for application in the dairy industry.

  15. Radio & Optical Interferometry: Basic Observing Techniques and Data Analysis

    CERN Document Server

    Monnier, John D

    2012-01-01

    Astronomers usually need the highest angular resolution possible, but the blurring effect of diffraction imposes a fundamental limit on the image quality from any single telescope. Interferometry allows light collected at widely separated telescopes to be combined in order to synthesize an aperture much larger than an individual telescope, thereby improving angular resolution by orders of magnitude. Radio and millimeter wave astronomers depend on interferometry to achieve image quality on par with conventional visible and infrared telescopes. Interferometers at visible and infrared wavelengths extend angular resolution below the milli-arcsecond level to open up unique research areas in imaging stellar surfaces and circumstellar environments. In this chapter the basic principles of interferometry are reviewed with an emphasis on the common features for radio and optical observing. While many techniques are common to interferometers of all wavelengths, crucial differences are identified that will help new practi...

  16. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    Metabolic engineering has been defined as the purposeful modification of intermediary metabolism using recombinant DNA techniques. With this definition metabolic engineering includes: (1) inserting new pathways in microorganisms with the aim of producing novel metabolites, e.g., production of polyketides by Streptomyces; (2) production of heterologous peptides, e.g., production of human insulin, erythropoietin, and tPA; and (3) improvement of both new and existing processes, e.g., production of antibiotics and industrial enzymes. Despite the prospect of obtaining major improvement through metabolic engineering, this approach is, however, not expected to completely replace the classical approach to strain improvement: random mutagenesis followed by screening. Identification of the optimal genetic changes for improvement... Metabolic engineering is a multidisciplinary approach, which involves...

  17. Development of a Rapid Soil Water Content Detection Technique Using Active Infrared Thermal Methods for In-Field Applications

    Directory of Open Access Journals (Sweden)

    Federico Pallottino

    2011-10-01

    Full Text Available The aim of this study was to investigate the suitability of active infrared thermography and thermometry, in combination with multivariate statistical partial least squares analysis, as rapid soil water content detection techniques both in the laboratory and in the field. Such techniques allow fast soil water content measurements, helpful in both the agricultural and environmental fields. These techniques, based on the theory of heat dissipation, were tested by directly measuring the dynamic temperature variation of samples after heating. For the assessment of dynamic temperature variations, data were collected during three intervals (3, 6 and 10 s). To account for the difference in specific heat between water and soil, the analyses used slopes to linearly describe the temperature trends. For all analyses, the best model was achieved for the 10 s slope. Three different approaches were considered, two in the laboratory and one in the field. The first laboratory approach was centred on active infrared thermography, considered the measured temperature variation as the independent variable, and reported r = 0.74. The second laboratory approach was focused on active infrared thermometry, added irradiation as an independent variable, and reported r = 0.76. The in-field experiment was performed by active infrared thermometry, heating bare soil by solar irradiance after exposure due to primary tillage. Some meteorological parameters were inserted as independent variables in the prediction model, which presented r = 0.61. In order to obtain more general and wider in-field estimations, a Partial Least Squares Discriminant Analysis on three classes of percentage soil water content was performed, obtaining a high rate of correct classification in the test (88.89%). The prediction error values were lower in the field than in the laboratory analyses. Both techniques could be used in conjunction with a Geographic Information System for obtaining detailed information.
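The slope feature described above, the linear temperature trend over a 3, 6 or 10 s window after heating, can be sketched with an ordinary least-squares fit in Python. The cooling curve is synthetic, and OLS here only stands in for the slope-extraction step, not for the PLS model itself:

```python
# Least-squares slope of a synthetic cooling curve over a 10 s window.
# The true decay rate is -0.8 deg/s with a small alternating perturbation.
times = list(range(11))                                      # 0..10 s
temps = [35.0 - 0.8 * t + 0.05 * (-1) ** t for t in times]   # synthetic

n = len(times)
mt = sum(times) / n
mT = sum(temps) / n
# OLS slope = covariance(t, T) / variance(t)
slope = (sum((t - mt) * (T - mT) for t, T in zip(times, temps))
         / sum((t - mt) ** 2 for t in times))
```

One such slope per sample (and per heating interval) becomes a predictor variable; the PLS step then regresses soil water content on these slopes together with the other independent variables.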

  18. The Analysis of a Phobic Child: Some Problems of Theory and Technique in Child Analysis.

    Science.gov (United States)

    Bornstein, Berta

    2014-01-01

    This paper attempts to clarify some theoretical and technical aspects of child analysis by correlating the course of treatment, the structure of the neurosis, and the technique employed in the case of a phobic boy who was in analysis over a period of three years. The case was chosen for presentation: (1) because of the discrepancy between the clinical simplicity of the symptom and the complicated ego structure behind it; (2) because of the unusual clearness with which the patient brought to the fore the variegated patterns of his libidinal demands; (3) because of the patient's attempts at transitory solutions, oscillations between perversions and symptoms, and processes of new symptom formation; (4) because the vicissitudes and stabilization of character traits could be clearly traced; (5) and finally, because of the rare opportunity to witness during treatment the change from grappling with reality by means of pathological mechanisms, to dealing with reality in a relatively conflict-free fashion.

  19. Sentiment analysis of Arabic tweets using text mining techniques

    Science.gov (United States)

    Al-Horaibi, Lamia; Khan, Muhammad Badruddin

    2016-07-01

    Sentiment analysis has become a flourishing field of text mining and natural language processing. Sentiment analysis aims to determine whether a text is written to express positive, negative, or neutral emotions about a certain domain. Most sentiment analysis researchers focus on English texts, with very limited resources available for other complex languages, such as Arabic. In this study, the target was to develop an initial model that performs satisfactorily and measures Arabic Twitter sentiment using a machine learning approach, with Naïve Bayes and Decision Tree as classification algorithms. The dataset used contains more than 2,000 Arabic tweets collected from Twitter. We performed several experiments to check the performance of the two classifiers using different combinations of text-processing functions. We found that the available facilities for Arabic text processing need to be built from scratch or improved to develop accurate classifiers. The small functionalities developed by us in a Python language environment helped improve the results and proved that sentiment analysis in the Arabic domain needs a lot of work on the lexicon side.
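The Naïve Bayes classification step can be sketched as a small multinomial model with Laplace smoothing. The toy tweets below are English stand-ins (the study's data are Arabic), and the tokenizer is a plain whitespace split rather than the study's text-processing functions:

```python
import math
from collections import Counter

# Tiny labelled corpus (invented English stand-ins for tweets).
train = [
    ("good great service", "pos"),
    ("great weather today", "pos"),
    ("bad terrible service", "neg"),
    ("terrible bad day", "neg"),
]

# Per-class word counts and document counts.
counts = {"pos": Counter(), "neg": Counter()}
docs = Counter()
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())

vocab = {w for c in counts.values() for w in c}

def classify(text):
    # Multinomial Naive Bayes: log prior + smoothed log likelihoods.
    best, best_lp = None, float("-inf")
    for label, c in counts.items():
        total = sum(c.values())
        lp = math.log(docs[label] / sum(docs.values()))
        for w in text.split():
            lp += math.log((c[w] + 1) / (total + len(vocab)))  # Laplace
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

For morphologically rich languages like Arabic, the whitespace split above is exactly the weak link the abstract points at: stemming and lexicon normalization happen before these counts are taken.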

  20. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... of the offeror's cost trends, on the basis of current and historical cost or pricing data; (C... the FAR looseleaf edition), Cost Accounting Standards. (v) Review to determine whether any cost data... required. (2) Price analysis shall be used when certified cost or pricing data are not required...

  1. Novel microstructures and technologies applied in chemical analysis techniques

    NARCIS (Netherlands)

    Spiering, Vincent L.; Spiering, V.L.; van der Moolen, Johannes N.; Burger, Gert-Jan; Burger, G.J.; van den Berg, Albert

    1997-01-01

    Novel glass and silicon microstructures and their application in chemical analysis are presented. The micro technologies comprise (deep) dry etching, thin layer growth and anodic bonding. With this combination it is possible to create high resolution electrically isolating silicon dioxide structures

  2. Neutron Activation Analysis of Water - A Review

    Science.gov (United States)

    Buchanan, John D.

    1971-01-01

    Recent developments in this field are emphasized. After a brief review of basic principles, topics discussed include sources of neutrons, pre-irradiation physical and chemical treatment of samples, neutron capture and gamma-ray analysis, and selected applications. Applications of neutron activation analysis of water have increased rapidly within the last few years and may be expected to increase in the future.

  3. Determination of Volatile Organic Compounds in the Atmosphere Using Two Complementary Analysis Techniques.

    Science.gov (United States)

    Alonso, L; Durana, N; Navazo, M; García, J A; Ilardia, J L

    1999-08-01

    During a preliminary field campaign of volatile organic compound (VOC) measurements carried out in an urban area, two complementary analysis techniques were applied to establish the technical and scientific bases for a strategy to monitor and control VOCs and photochemical oxidants in the Autonomous Community of the Basque Country. Integrated sampling was conducted using Tenax sorbent tubes with laboratory analysis by gas chromatography, and grab sampling with in situ analysis was conducted using a portable gas chromatograph. With the first technique, monocyclic aromatic hydrocarbons appeared as the compounds with the highest mean concentrations. The second technique allowed the systematic analysis of eight chlorinated and aromatic hydrocarbons. Results of comparing both techniques, as well as the additional information obtained with the second technique, are included.

  4. Nonlinear techniques for forecasting solar activity directly from its time series

    Science.gov (United States)

    Ashrafi, S.; Roszman, L.; Cooley, J.

    1993-01-01

    This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of the system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss the extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
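The reconstructed phase space mentioned here is built by delay-coordinate embedding of the scalar flux series. A minimal Python sketch, where the embedding dimension, delay, and sine-wave stand-in for the solar flux series are illustrative choices rather than the paper's values:

```python
import math

# Scalar time series standing in for a solar flux record.
series = [math.sin(0.3 * n) for n in range(20)]

# Delay-coordinate embedding: each point in the reconstructed phase space
# is (x[n], x[n + tau], ..., x[n + (dim-1)*tau]).
dim, tau = 3, 2
vectors = [tuple(series[n + k * tau] for k in range(dim))
           for n in range(len(series) - (dim - 1) * tau)]
```

A predictor is then built by mapping each embedded vector to the next series value, and invariants such as Lyapunov exponents are estimated from how neighboring vectors separate over time.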

  5. Biomechanical analysis of the technique of choreographic movements (for example, "grand battement jeté")

    Directory of Open Access Journals (Sweden)

    Batieieva N.P.

    2015-04-01

    Full Text Available Purpose: biomechanical analysis of the execution of the choreographic movement "grand battement jeté". Material: the study involved students (n = 7) of the department of classical choreography of the faculty of choreography. Results: a biomechanical analysis of the choreographic movement "grand battement jeté" (a classic exercise) was performed, and the kinematic characteristics (path, velocity, acceleration, force) of the centers of mass (CM) of the performer's body segments (foot, shin, thigh) were obtained. A biokinematic (phase) model was built. The energy characteristics, mechanical work and kinetic energy of the leg segments, were obtained for the choreographic movement "grand battement jeté". Conclusions: it was found that the ability of an athlete and coach-choreographer to analyze the biomechanics of movement has a positive effect on the improvement of choreographic training of qualified athletes in gymnastics (sport, art), figure skating and dance sports.

  6. Transient analysis techniques in performing impact and crash dynamic studies

    Science.gov (United States)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that grew out of the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  7. Changes in muscle activation following balance and technique training and a season of Australian football.

    Science.gov (United States)

    Donnelly, C J; Elliott, B C; Doyle, T L A; Finch, C F; Dempsey, A R; Lloyd, D G

    2015-05-01

    To determine if balance and technique training implemented as an adjunct to 1001 male Australian football players' training influenced the activation/strength of the muscles crossing the knee during pre-planned and unplanned sidestepping. Randomized Controlled Trial. Each Australian football player participated in either 28 weeks of balance and technique training or 'sham' training. Twenty-eight Australian football players (balance and technique training, n=12; 'sham' training, n=16) completed biomechanical testing pre- to post-training. Peak knee moments and directed co-contraction ratios in three degrees of freedom, as well as total muscle activation, were calculated during pre-planned and unplanned sidestepping. No significant differences in muscle activation/strength were observed between the 'sham' training and balance and technique training groups. Following a season of Australian football, knee extensor (p=0.023) and semimembranosus (p=0.006) muscle activation increased during both pre-planned and unplanned sidestepping. Following a season of Australian football, total muscle activation was 30% lower and peak valgus knee moments 80% greater (p=0.022) during unplanned sidestepping when compared with pre-planned sidestepping. When implemented in a community-level training environment, balance and technique training was not effective in changing the activation of the muscles crossing the knee during sidestepping. Following a season of Australian football, players are better able to support both frontal and sagittal plane knee moments. When compared to pre-planned sidestepping, Australian football players may be at increased risk of anterior cruciate ligament injury during unplanned sidestepping in the latter half of an Australian football season. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  8. Applying Failure Modes, Effects, And Criticality Analysis And Human Reliability Analysis Techniques To Improve Safety Design Of Work Process In Singapore Armed Forces

    Science.gov (United States)

    2016-09-01

    ...interactions from a purely mechanical standpoint. However, the SAF does not apply FMECA to work processes that are typical of SAF training and operations. FMECA takes considerable effort to complete; therefore, it should be applied to work processes or activities that are generally static in nature.

  9. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  10. Elemental analysis of silver coins by PIXE technique

    Energy Technology Data Exchange (ETDEWEB)

    Tripathy, B.B. [Department of Physics, Silicon Institute of Technology, Patia, Bhubaneswar 751 024 (India); Rautray, Tapash R. [Department of Dental Biomaterials, School of Dentistry, Kyungpook National University, 2-188-1 Samduk -dong, Jung-gu, Daegu 700 412 (Korea, Republic of); ARASMIN, G. Udayagiri, Kandhamal, Orissa 762 100 (India)], E-mail: tapash.rautray@gmail.com; Rautray, A.C. [ARASMIN, G. Udayagiri, Kandhamal, Orissa 762 100 (India); Vijayan, V. [Praveen Institute of Radiation Technology, Flat No. 9A, Avvai Street, New Perungalathur, Chennai 600 063 (India)

    2010-03-15

    Elemental analysis of nine Indian silver coins from the period of British rule was carried out by proton induced X-ray emission spectroscopy. Eight elements, namely Cr, Fe, Ni, Cu, Zn, As, Ag, and Pb, were determined in the present study. Ag and Cu were found to be the major elements, Zn was the only minor element, and all other elements were present at the trace level. The variation in elemental concentration may be due to the use of different ores for making the coins.

  11. Analysis of active islanding detection methods for grid-connected microinverters for renewable energy processing

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo, C.L. [Grupo de Sistemas Electronicos Industriales del Departamento de Ingenieria Electronica, Universidad Politecnica de Valencia, Camino de Vera S/N, C.P. 46022, Valencia (Spain); Departamento de Ingenieria Electronica, Universidad Distrital Francisco Jose de Caldas, Carrera 7 N 40-53 Piso 5, Bogota (Colombia); Velasco, D.; Figueres, E.; Garcera, G. [Grupo de Sistemas Electronicos Industriales del Departamento de Ingenieria Electronica, Universidad Politecnica de Valencia, Camino de Vera S/N, C.P. 46022, Valencia (Spain)

    2010-11-15

    This paper presents the analysis and comparison of the main active techniques for islanding detection used in grid-connected microinverters for power processing of renewable energy sources. These techniques can be classified into two classes: techniques introducing positive feedback in the control of the inverter and techniques based on harmonics injection. Accurate PSIM simulations have been carried out in order to perform a comparative analysis of the techniques under study and to establish their advantages and disadvantages according to IEEE standards. (author)

  12. Comparative Analysis of Automatic Vehicle Classification Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Kanwal Yousaf

    2012-09-01

    Full Text Available Vehicle classification has emerged as a significant field of study because of its importance in a variety of applications such as surveillance, security systems, traffic congestion avoidance and accident prevention. So far numerous algorithms have been implemented for classifying vehicles, each following a different procedure for detecting vehicles in videos. By evaluating some of the commonly used techniques, we highlight the most beneficial methodology for classifying vehicles. In this paper we describe the working of several video-based vehicle classification algorithms and compare them on the basis of different performance metrics, such as classifiers, classification methodology or principles, and vehicle detection ratio. After comparing these parameters we conclude that the Hybrid Dynamic Bayesian Network (HDBN) classification algorithm is far better than the other algorithms due to its ability to estimate the simplest features of vehicles from different videos. HDBN detects vehicles by following the important stages of feature extraction, selection and classification. It extracts rear-view information of vehicles rather than other information, such as the distance between the wheels and the height of the wheel.

  13. Chromatographic finger print analysis of Naringi crenulata by HPTLC technique

    Institute of Scientific and Technical Information of China (English)

    Subramanian Sampathkumar; Ramakrishnan N

    2011-01-01

    Objective: To establish the fingerprint profile of Naringi crenulata (N. crenulata) (Roxb.) Nicols. using the high performance thin layer chromatography (HPTLC) technique. Methods: Preliminary phytochemical screening was done and HPTLC studies were carried out. A CAMAG HPTLC system equipped with a Linomat V applicator, TLC scanner 3, Reprostar 3 and WIN CATS-4 software was used. Results: The preliminary phytochemical studies confirmed the presence of protein, lipid, carbohydrate, reducing sugar, phenol, tannin, flavonoid, saponin, triterpenoid, alkaloid, anthraquinone and quinone. HPTLC fingerprinting of the ethanolic extract of stem revealed 10 spots with Rf values in the range of 0.08 to 0.65; bark showed 8 peaks with Rf values in the range of 0.07 to 0.63; and the ethanol extract of leaf revealed 8 peaks with Rf values in the range of 0.09 to 0.49. The purity of the sample was confirmed by comparing the absorption spectra at the start, middle and end positions of the band. Conclusions: It can be concluded that HPTLC fingerprinting of N. crenulata may be useful in differentiating the species from adulterants and can act as a biochemical marker for this medicinally important plant in the pharmaceutical industry and plant systematics studies.

  14. Skills and Vacancy Analysis with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Izabela A. Wowczko

    2015-11-01

    Full Text Available Through recognizing the importance of a qualified workforce, skills research has become one of the focal points in economics, sociology, and education. Great effort is dedicated to analyzing labor demand and supply, and actions are taken at many levels to match one with the other. In this work we concentrate on skills needs, a dynamic variable dependent on many aspects such as geography, time, or the type of industry. Historically, skills in demand were easy to evaluate since transitions in that area were fairly slow, gradual, and easy to adjust to. In contrast, current changes are occurring rapidly and might take an unexpected turn. Therefore, we introduce a relatively simple yet effective method of monitoring skills needs straight from the source—as expressed by potential employers in their job advertisements. We employ open source tools such as RapidMiner and R as well as easily accessible online vacancy data. We demonstrate selected techniques, namely classification with k-NN and information extraction from a textual dataset, to determine effective ways of discovering knowledge from a given collection of vacancies.
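The classification step described above can be sketched with open-source tools of the same kind the paper names (it uses RapidMiner and R; scikit-learn stands in here as an assumption). The job ads, labels and query below are invented for illustration only:

```python
# Sketch of k-NN classification of vacancy text, assuming scikit-learn.
# All ads and labels are invented examples, not the paper's dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

ads = [
    "Java developer with Spring and SQL experience",
    "Senior Python engineer, machine learning, pandas",
    "Registered nurse for intensive care unit",
    "Staff nurse, patient care, night shifts",
]
labels = ["IT", "IT", "Healthcare", "Healthcare"]

# TF-IDF turns each ad into a weighted term vector; k-NN votes among
# the closest training ads.
model = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=3))
model.fit(ads, labels)

print(model.predict(["Python developer with SQL skills"])[0])
```

With TF-IDF vectors, the nearest neighbours of the query are dominated by the IT ads, so the majority vote returns "IT".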

  15. Manure management and greenhouse gas mitigation techniques : a comparative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Langmead, C.

    2003-09-03

    Alberta is the second largest agricultural producer in Canada, ranking just behind Ontario. Approximately 62 per cent of the province's farm cash receipts are attributable to the livestock industry. Farmers today maintain large numbers of a single animal type. The drivers for more advanced manure management systems include: the trend towards confined feeding operations (CFO) is creating large, concentrated quantities of manure; public perception of CFO; implementation of provincial legislation regulating the expansion and construction of CFO; ratification of the Kyoto Protocol raised interest in the development of improved manure management systems capable of reducing greenhouse gas (GHG) emissions; and rising energy costs. The highest methane emissions factors are found with liquid manure management systems. They contribute more than 80 per cent of the total methane emissions from livestock manure in Alberta. The author identified and analyzed three manure management techniques to mitigate GHG emissions. They were: bio-digesters, gasification systems, and composting. Three recommendations were made to establish a strategy to support emissions offsets and maximize the reduction of methane emissions from the livestock industry. The implementation of bio-digesters, especially for the swine industry, was recommended. It was suggested that a gasification pilot project for poultry manure should be pursued by Climate Change Central. Public outreach programs promoting composting of cattle manure for beef feedlots and older style dairy barns should also be established. 19 refs., 11 tabs., 3 figs.

  16. Modelling, analysis and validation of microwave techniques for the characterisation of metallic nanoparticles

    Science.gov (United States)

    Sulaimalebbe, Aslam

    In the last decade, the study of nanoparticle (NP) systems has become a large and interesting research area due to their novel properties and functionalities, which differ from those of the bulk materials, and also their potential applications in different fields. It is vital to understand the behaviour and properties of nano-materials with the aim of implementing nanotechnology, controlling their behaviour and designing new material systems with superior performance. Physical characterisation of NPs falls into two main categories, property and structure analysis, where the properties of the NPs cannot be studied without knowledge of their size and structure. The direct measurement of the electrical properties of metal NPs presents a key challenge and necessitates the use of innovative experimental techniques. There have been numerous reports of two/four point resistance measurements of NP films and also of the electrical conductivity of NP films using the interdigitated microarray (IDA) electrode. However, microwave techniques such as the open-ended coaxial probe (OCP) and the microwave dielectric resonator (DR) are much more accurate and effective for the electrical characterisation of metallic NPs than traditional techniques, because they are inexpensive, convenient, non-destructive, contactless, hazardless (i.e. at low power) and require no special sample preparation. This research is the first attempt to determine the microwave properties of Pt and Au NP films, which are appealing materials for nano-scale electronics, using the aforementioned microwave techniques. Ease of synthesis, relatively low cost, unique catalytic activities and control over size and shape were the main considerations in choosing Pt and Au NPs for the present study. 
The initial phase of this research was to implement and validate the aperture admittance model for the OCP measurement through experiments and 3D full-wave simulation using the commercially available Ansoft

  17. Multi-scale statistical analysis of coronal solar activity

    Science.gov (United States)

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-01

    Multi-filter images of the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.
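The POD step at the heart of the record above can be sketched with an SVD, assuming NumPy; the "temperature maps" here are synthetic stand-ins (snapshots of a 1-D field stored as columns of a matrix), not coronal data:

```python
# Minimal POD sketch: subtract the mean field, take the SVD of the snapshot
# matrix; columns of U are spatial modes, singular values measure the energy
# captured by each mode. The snapshot data below is synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
x = np.linspace(0, 1, 100)
# Two coherent structures plus weak noise, mimicking an evolving field
snapshots = (np.outer(np.sin(np.pi * x), np.cos(2 * np.pi * t))
             + 0.3 * np.outer(np.sin(3 * np.pi * x), np.sin(4 * np.pi * t))
             + 0.01 * rng.standard_normal((100, 50)))

mean_field = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)
energy = s**2 / np.sum(s**2)
print("energy in first two modes: %.3f" % energy[:2].sum())
```

Because the synthetic field is built from two coherent structures, almost all of the variance concentrates in the first two POD modes; comparing such energy spectra before and after an event is the kind of multi-scale diagnostic the abstract describes.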

  18. An Active Damping Technique for Small DC-Link Capacitor Based Drive System

    DEFF Research Database (Denmark)

    Maheshwari, Ram Krishan; Munk-Nielsen, Stig; Lu, Kaiyuan

    2013-01-01

    A small dc-link capacitor based drive system shows instability when it is operated with a large input line inductance at operating points with high power. This paper presents a simple, new active damping technique that can effectively stabilize the drive system at unstable operating points, offering...

  19. Status of the Usage of Active Learning and Teaching Method and Techniques by Social Studies Teachers

    Science.gov (United States)

    Akman, Özkan

    2016-01-01

    The purpose of this study was to determine the active learning and teaching methods and techniques which are employed by the social studies teachers working in state schools of Turkey. This usage status was assessed using different variables. This was a case study, wherein the research was limited to 241 social studies teachers. These teachers…

  20. Regulation on the Appraisal Activities of The Products,Techniques and Applications Projects of BIRTV

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    I General Provisions 1. The Appraisal Activities of the Products, Techniques and Applications Projects of BIRTV are held for the purpose of strengthening the technical advisory work and pilot role, and providing positive guidance and effective assistance in the field of new technology

  1. Digital methods of photopeak integration in activation analysis.

    Science.gov (United States)

    Baedecker, P. A.

    1971-01-01

    A study of the precision attainable by several methods of gamma-ray photopeak integration has been carried out. The 'total peak area' method, the methods proposed by Covell, Sterlinski, and Quittner, and some modifications of these methods have been considered. A modification by Wasson of the total peak area method is considered to be the most advantageous due to its simplicity and the relatively high precision obtainable with this technique. A computer routine for the analysis of spectral data from nondestructive activation analysis experiments employing a Ge(Li) detector-spectrometer system is described.
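The "total peak area" method discussed above can be sketched as follows: sum the counts across the photopeak and subtract a linear baseline estimated from channels on either side of the peak. The Gaussian-plus-continuum spectrum and all channel numbers below are synthetic illustrations, not data from the paper:

```python
# Sketch of total-peak-area photopeak integration on a synthetic spectrum.
import numpy as np

channels = np.arange(400, 461)
background = 200.0 - 0.5 * (channels - 400)       # slowly varying continuum
peak = 5000.0 * np.exp(-0.5 * ((channels - 430) / 4.0) ** 2)
counts = background + peak

lo, hi = 410, 450                                  # integration limits (channels)
region = (channels >= lo) & (channels <= hi)
n = region.sum()

# Baseline from the mean of a few channels on each side of the peak,
# assumed free of peak counts
left = counts[(channels >= lo - 5) & (channels < lo)].mean()
right = counts[(channels > hi) & (channels <= hi + 5)].mean()
net_area = counts[region].sum() - n * (left + right) / 2.0

true_area = peak.sum()
print("net area %.0f vs true %.0f" % (net_area, true_area))
```

On this synthetic spectrum the linear-baseline subtraction recovers the true peak area to well under one percent; in practice the precision is limited by counting statistics in the baseline channels, which is exactly what the cited comparison of integration methods quantifies.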

  2. Preliminary Analysis of ULPC Light Curves Using Fourier Decomposition Technique

    CERN Document Server

    Ngeow, Chow-Choong; Kanbur, Shashi; Barrett, Brittany; Lin, Bin

    2013-01-01

    Recent work on Ultra Long Period Cepheids (ULPCs) has suggested their usefulness as a distance indicator, but has not commented on their relationship to other types of variable stars. In this work, we use Fourier analysis to quantify the structure of ULPC light curves and compare them to Classical Cepheids and Mira variables. Our preliminary results suggest that the low-order Fourier parameters of ULPCs show a continuous trend defined by Classical Cepheids after the resonance around 10 days. However, their Fourier parameters also overlap with those of Miras, which makes the classification of long-period variable stars difficult based on light-curve information alone.
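The low-order Fourier parameters mentioned above are conventionally obtained by least-squares fitting a truncated Fourier series to the folded light curve and forming amplitude ratios such as R21 = A2/A1. A sketch on a synthetic light curve (not ULPC data; period, amplitudes and phases are invented), assuming NumPy:

```python
# Fit m(t) = A0 + sum_k [a_k sin(2*pi*k*t/P) + b_k cos(2*pi*k*t/P)] by linear
# least squares, then form amplitudes A_k = sqrt(a_k^2 + b_k^2).
import numpy as np

P = 100.0                                   # period in days (illustrative)
t = np.linspace(0.0, P, 200, endpoint=False)
mag = 12.0 + 0.30 * np.sin(2*np.pi*t/P + 0.5) + 0.09 * np.sin(4*np.pi*t/P + 1.2)

order = 4
cols = [np.ones_like(t)]
for k in range(1, order + 1):
    cols += [np.sin(2*np.pi*k*t/P), np.cos(2*np.pi*k*t/P)]
coef, *_ = np.linalg.lstsq(np.column_stack(cols), mag, rcond=None)

# Amplitude of harmonic k from its sin/cos coefficients
A = [np.hypot(coef[2*k - 1], coef[2*k]) for k in range(1, order + 1)]
print("R21 = A2/A1 = %.3f" % (A[1] / A[0]))
```

On this noiseless even sampling the fit recovers the injected amplitudes exactly, so R21 comes out at 0.3; it is the loci of such parameters versus period that the record above compares across ULPCs, Classical Cepheids and Miras.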

  3. [Neuroimaging in psychiatry: multivariate analysis techniques for diagnosis and prognosis].

    Science.gov (United States)

    Kambeitz, J; Koutsouleris, N

    2014-06-01

    Multiple studies successfully applied multivariate analysis to neuroimaging data demonstrating the potential utility of neuroimaging for clinical diagnostic and prognostic purposes. Summary of the current state of research regarding the application of neuroimaging in the field of psychiatry. Literature review of current studies. Results of current studies indicate the potential application of neuroimaging data across various diagnoses, such as depression, schizophrenia, bipolar disorder and dementia. Potential applications include disease classification, differential diagnosis and prediction of disease course. The results of the studies are heterogeneous although some studies report promising findings. Further multicentre studies are needed with clearly specified patient populations to systematically investigate the potential utility of neuroimaging for the clinical routine.

  4. Behavior Change Techniques in Popular Alcohol Reduction Apps: Content Analysis

    Science.gov (United States)

    Garnett, Claire; Brown, James; West, Robert; Michie, Susan

    2015-01-01

    Background Mobile phone apps have the potential to reduce excessive alcohol consumption cost-effectively. Although hundreds of alcohol-related apps are available, there is little information about the behavior change techniques (BCTs) they contain, or the extent to which they are based on evidence or theory and how this relates to their popularity and user ratings. Objective Our aim was to assess the proportion of popular alcohol-related apps available in the United Kingdom that focus on alcohol reduction, identify the BCTs they contain, and explore whether BCTs or the mention of theory or evidence is associated with app popularity and user ratings. Methods We searched the iTunes and Google Play stores with the terms “alcohol” and “drink”, and the first 800 results were classified into alcohol reduction, entertainment, or blood alcohol content measurement. Of those classified as alcohol reduction, all free apps and the top 10 paid apps were coded for BCTs and for reference to evidence or theory. Measures of popularity and user ratings were extracted. Results Of the 800 apps identified, 662 were unique. Of these, 13.7% (91/662) were classified as alcohol reduction (95% CI 11.3-16.6), 53.9% (357/662) entertainment (95% CI 50.1-57.7), 18.9% (125/662) blood alcohol content measurement (95% CI 16.1-22.0) and 13.4% (89/662) other (95% CI 11.1-16.3). The 51 free alcohol reduction apps and the top 10 paid apps contained a mean of 3.6 BCTs (SD 3.4), with approximately 12% (7/61) not including any BCTs. The BCTs used most often were “facilitate self-recording” (54%, 33/61), “provide information on consequences of excessive alcohol use and drinking cessation” (43%, 26/61), “provide feedback on performance” (41%, 25/61), “give options for additional and later support” (25%, 15/61) and “offer/direct towards appropriate written materials” (23%, 14/61). These apps also rarely included any of the 22 BCTs frequently used in other health behavior change

  5. Analysis of Machine Learning Techniques for Heart Failure Readmissions.

    Science.gov (United States)

    Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M

    2016-11-01

    The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set, and a validation set comprised the remaining patients, validated using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistic 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
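The model comparison described above can be illustrated on synthetic data (the trial's patient records are not reproduced here): fit logistic regression and a random forest on the same features and compare C statistics, i.e. ROC AUC, on held-out cases, assuming scikit-learn:

```python
# Sketch of an LR vs. random-forest C-statistic comparison on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           random_state=0)
# Mirror the paper's 50/50 derivation/validation split
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

aucs = {}
for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(random_state=0))]:
    model.fit(X_tr, y_tr)
    aucs[name] = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print("%s C statistic: %.3f" % (name, aucs[name]))
```

The synthetic numbers say nothing about heart-failure readmission itself; the point is the workflow: identical derivation/validation split, probability predictions, and discrimination compared via the C statistic, as in the study.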

  6. Intelligent acoustic data fusion technique for information security analysis

    Science.gov (United States)

    Jiang, Ying; Tang, Yize; Lu, Wenda; Wang, Zhongfeng; Wang, Zepeng; Zhang, Luming

    2017-08-01

    Tone is an essential component of word formation in all tonal languages, and it plays an important role in the transmission of information in speech communication. The study of tone characteristics can therefore be applied to the security analysis of acoustic signals by means of language identification, etc. In speech processing, fundamental frequency (F0) is often viewed as representing tone by researchers of speech synthesis. However, regular F0 values may lead to low naturalness in synthesized speech. Moreover, F0 and tone are not equivalent linguistically; F0 is just a representation of a tone. Therefore, the Electroglottography (EGG) signal was collected for a deeper study of tone characteristics. In this paper, focusing on the Northern Kam language, which has nine tonal contours and five level tone types, we first collected EGG and speech signals from six native male speakers of the Northern Kam language, and then obtained the clustering distributions of the tone curves. After summarizing the main characteristics of the tones of Northern Kam, we analyzed the relationship between EGG and speech signal parameters, and laid the foundation for further security analysis of acoustic signals.

  7. Analysis of compressive fracture in rock using statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.

  8. Structural Analysis of Composite Laminates using Analytical and Numerical Techniques

    Directory of Open Access Journals (Sweden)

    Sanghi Divya

    2016-01-01

    Full Text Available A laminated composite material consists of different layers of matrix and fibres. Its properties can vary greatly with each layer's or ply's orientation, material property and the number of layers itself. The present paper focuses on a novel approach of incorporating an analytical method to arrive at a preliminary ply layup order of a composite laminate, which acts as feeder data for the further detailed analysis done on FEA tools. The equations used in our MATLAB code are based on analytical study and supply results that are remarkably close to the final optimized layup found through extensive FEA analysis, with a high degree of probability. This reduces significant computing time and saves considerable FEA processing, obtaining efficient results quickly. The results output by our method also provide the user with the conditions that predict the successive failure sequence of the composite plies, a result option which is not even available in popular FEM tools. The predicted results are further verified by testing the laminates in the laboratory, and the results are found to be in good agreement.
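The ply-by-ply stiffness bookkeeping that such a feeder code performs is classical lamination theory. A sketch with illustrative material values (a generic carbon/epoxy; the properties, layup and ply thickness are assumptions, not the paper's data):

```python
# Classical lamination theory sketch: rotate the ply stiffness Q to the
# laminate axes and assemble the A (extensional) and B (coupling) matrices.
import numpy as np

E1, E2, G12, nu12 = 140e9, 10e9, 5e9, 0.3      # assumed ply properties (Pa)
nu21 = nu12 * E2 / E1
d = 1 - nu12 * nu21
Q = np.array([[E1 / d, nu12 * E2 / d, 0],
              [nu12 * E2 / d, E2 / d, 0],
              [0, 0, G12]])

def Qbar(theta):
    """Ply stiffness rotated to the laminate axes (angle in degrees)."""
    c, s = np.cos(np.radians(theta)), np.sin(np.radians(theta))
    T = np.array([[c*c, s*s, 2*c*s],
                  [s*s, c*c, -2*c*s],
                  [-c*s, c*s, c*c - s*s]])
    R = np.diag([1.0, 1.0, 2.0])               # Reuter matrix
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

layup = [0, 45, -45, 90, 90, -45, 45, 0]       # symmetric layup (degrees)
h = 0.125e-3                                   # ply thickness (m)
z = np.linspace(-len(layup) * h / 2, len(layup) * h / 2, len(layup) + 1)

A = sum(Qbar(th) * (z[k+1] - z[k]) for k, th in enumerate(layup))
B = sum(Qbar(th) * (z[k+1]**2 - z[k]**2) / 2 for k, th in enumerate(layup))
print("B matrix ~ 0 for a symmetric laminate:", np.allclose(B, 0, atol=1e-3))
```

Iterating such an evaluation over candidate stacking sequences is the kind of quick analytical screening the abstract describes before committing a layup to full FEA.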

  9. [THE COMPARATIVE ANALYSIS OF TECHNIQUES OF IDENTIFICATION OF CORYNEBACTERIUM NON DIPHTHERIAE].

    Science.gov (United States)

    Kharseeva, G G; Voronina, N A; Mironov, A Yu; Alutina, E L

    2015-12-01

    A comparative analysis was carried out of the effectiveness of three techniques for the identification of Corynebacterium non diphtheriae: bacteriological, molecular genetic (sequencing of 16S rRNA) and mass-spectrometric (MALDI-ToF MS). The analysis covered 49 strains of Corynebacterium non diphtheriae (C. pseudodiphtheriticum, C. amycolatum, C. propinquum, C. falsenii) and 2 strains of Corynebacterium diphtheriae isolated in various pathologies from the urogenital tract and upper respiratory ways. The corynebacteria were identified using the bacteriological technique, sequencing of 16S rRNA and the mass-spectrometric technique (MALDI-ToF MS). Full concordance of the results of species identification was observed in 26 (51%) strains of Corynebacterium non diphtheriae when all three techniques were used; in 43 (84.3%) strains when the bacteriological technique was compared with sequencing of 16S rRNA; and in 29 (57%) strains for mass-spectrometric analysis versus sequencing of 16S rRNA. The bacteriological technique is effective for the identification of Corynebacterium diphtheriae. For the precise establishment of the species of corynebacteria with variable biochemical characteristics, the molecular genetic technique of analysis should be applied. The mass-spectrometric technique (MALDI-ToF MS) requires further updating of its databases to identify a larger spectrum of representatives of the genus Corynebacterium.

  10. A comparative study between standard dry needling technique and rapid dry needling technique on active gluteus medius muscle trigger points

    OpenAIRE

    2012-01-01

    M.Tech. This study aimed to determine the difference between the standard dry needling technique and the rapid dry needling technique with regard to which technique would provide quicker relief of symptoms, as measured by an increase in participants' pressure tolerance and range of motion and a decrease in subjective pain. Subjectively it was seen that both groups had a statistical decrease in the participants' perceived pain with the Oswestry Disability Index, the McGill's Pain Questio...

  11. Neural networks and dynamical system techniques for volcanic tremor analysis

    Directory of Open Access Journals (Sweden)

    R. Carniel

    1996-06-01

    Full Text Available A volcano can be seen as a dynamical system, the number of state variables being its dimension N. The state is usually confined on a manifold with a lower dimension f, manifold which is characteristic of a persistent «structural configuration». A change in this manifold may be a hint that something is happening to the dynamics of the volcano, possibly leading to a paroxysmal phase. In this work the original state space of the volcano dynamical system is substituted by a pseudo state space reconstructed by the method of time-delayed coordinates, with suitably chosen lag time and embedding dimension, from experimental time series of seismic activity, i.e. volcanic tremor recorded at Stromboli volcano. The monitoring is done by a neural network which first learns the dynamics of the persistent tremor and then tries to detect structural changes in its behaviour.
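The delay-coordinate reconstruction described above can be sketched in a few lines, assuming NumPy; the signal, lag time and embedding dimension below are illustrative choices, not those used for the Stromboli tremor records:

```python
# Sketch of the method of time-delayed coordinates: a scalar series x(t) is
# embedded as vectors [x(t), x(t + tau), ..., x(t + (m-1)*tau)], giving a
# pseudo state space in which the dynamics can be monitored.
import numpy as np

def delay_embed(x, m, tau):
    """Return the m-dimensional delay embedding of a 1-D series."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

t = np.linspace(0, 40 * np.pi, 4000)
x = np.sin(t) + 0.5 * np.sin(0.3 * t)       # synthetic stand-in for tremor

states = delay_embed(x, m=3, tau=25)        # embedding dimension 3, lag 25
print(states.shape)
```

Each row of `states` is one reconstructed state vector; in the paper's scheme a neural network is trained on such vectors from the persistent tremor and then flags departures from the learned manifold.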

  12. Denial of Service Attack Techniques: Analysis, Implementation and Comparison

    Directory of Open Access Journals (Sweden)

    Khaled Elleithy

    2005-02-01

    Full Text Available A denial of service attack (DOS is any type of attack on a networking structure to disable a server from servicing its clients. Attacks range from sending millions of requests to a server in an attempt to slow it down, flooding a server with large packets of invalid data, to sending requests with an invalid or spoofed IP address. In this paper we show the implementation and analysis of three main types of attack: Ping of Death, TCP SYN Flood, and Distributed DOS. The Ping of Death attack will be simulated against a Microsoft Windows 95 computer. The TCP SYN Flood attack will be simulated against a Microsoft Windows 2000 IIS FTP Server. Distributed DOS will be demonstrated by simulating a distribution zombie program that will carry the Ping of Death attack. This paper will demonstrate the potential damage from DOS attacks and analyze the ramifications of the damage.

  13. A Generalized Lanczos-QR Technique for Structural Analysis

    DEFF Research Database (Denmark)

    Vissing, S.

    Within the field of solid mechanics, such as structural dynamics and linearized as well as non-linear stability, the eigenvalue problem plays an important role. In the class of finite element and finite difference discretized problems, these engineering problems are characterized by large matrix systems with very special properties. Due to the finite discretization the matrices are sparse, and a relatively large number of problems also have real and symmetric matrices. The matrix equation for an undamped vibration contains two matrices describing tangent stiffness and mass distributions. Alternatively, in a stability analysis, tangent stiffness and geometric stiffness matrices are introduced into an eigenvalue problem used to determine possible bifurcation points. The common basis for these types of problems is that the matrix equation describing the problem contains two real, symmetric matrices.

  14. Nonlinear systems techniques for dynamical analysis and control

    CERN Document Server

    Lefeber, Erjen; Arteaga, Ines

    2017-01-01

    This treatment of modern topics related to the control of nonlinear systems is a collection of contributions celebrating the work of Professor Henk Nijmeijer and honoring his 60th birthday. It addresses several topics that have been the core of Professor Nijmeijer’s work, namely: the control of nonlinear systems, geometric control theory, synchronization, coordinated control, convergent systems and the control of underactuated systems. The book presents recent advances in these areas, contributed by leading international researchers in systems and control. In addition to the theoretical questions treated in the text, particular attention is paid to a number of applications including (mobile) robotics, marine vehicles, neural dynamics and mechanical systems generally. This volume provides a broad picture of the analysis and control of nonlinear systems for scientists and engineers with an interest in the interdisciplinary field of systems and control theory. The reader will benefit from the expert participan...

  15. Multidimensional Analysis of Quenching: Comparison of Inverse Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, K.J.

    1998-11-18

    Understanding the surface heat transfer during quenching can be beneficial. Analysis to estimate the surface heat transfer from internal temperature measurements is referred to as the inverse heat conduction problem (IHCP). Function specification and gradient adjoint methods, which use a gradient search method coupled with an adjoint operator, are widely used methods to solve the IHCP. In this paper the two methods are presented for the multidimensional case. The focus is not a rigorous comparison of numerical results. Instead, after formulating the multidimensional solutions, issues associated with the numerical implementation and practical application of the methods are discussed. In addition, an experiment that measured the surface heat flux and temperatures for a transient experimental case is analyzed. Transient temperatures are used to estimate the surface heat flux, which is compared to the measured values. The estimated surface fluxes are comparable for the two methods.

  16. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems...... in some detail. Finally we address the problem of where to put the dot and the lines: when all information is ‘on the table’, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose...... was to develop a software tool for maintenance supervision of components in a nuclear power plant....

  17. Connectomic analysis of brain networks: novel techniques and future directions

    Directory of Open Access Journals (Sweden)

    Leonie Cazemier

    2016-11-01

    Full Text Available Brain networks, localized or brain-wide, exist only at the cellular level, i.e. between specific pre- and postsynaptic neurons, which are connected through functionally diverse synapses located at specific points of their cell membranes. Connectomics is the emerging subfield of neuroanatomy explicitly aimed at elucidating the wiring of brain networks with cellular resolution and quantified accuracy. Such data are indispensable for realistic modeling of brain circuitry and function. A connectomic analysis therefore needs to identify and measure the soma, dendrites, axonal path and branching patterns, together with the synapses and gap junctions, of the neurons involved in any given brain circuit or network. However, because of the submicron caliber, 3D complexity and high packing density of most such structures, as well as the fact that axons frequently extend over long distances to make synapses in remote brain regions, creating connectomic maps is technically challenging and requires multi-scale approaches. Such approaches involve the combination of the most sensitive cell labeling and analysis methods available, as well as the development of new ones able to resolve individual cells and synapses with increasingly high throughput. In this review, we provide an overview of recently introduced high-resolution methods which researchers wanting to enter the field of connectomics may consider. It includes several molecular labeling tools, some of which specifically label synapses, and covers a number of novel imaging tools such as brain clearing protocols and microscopy approaches. Apart from describing the tools, we also provide an assessment of their qualities. The criteria we use assess the qualities that tools need in order to contribute to deciphering the key levels of circuit organization. 
We conclude with a brief future outlook for neuroanatomic research, computational methods and network modeling, where we also point out several outstanding

  18. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  19. Prediction of activity type in preschool children using machine learning techniques.

    Science.gov (United States)

    Hagenbuchner, Markus; Cliff, Dylan P; Trost, Stewart G; Van Tuc, Nguyen; Peoples, Gregory E

    2015-07-01

    Recent research has shown that machine learning techniques can accurately predict activity classes from accelerometer data in adolescents and adults. The purpose of this study is to develop and test machine learning models for predicting activity type in preschool-aged children. Participants completed 12 standardised activity trials (TV, reading, tablet game, quiet play, art, treasure hunt, cleaning up, active game, obstacle course, bicycle riding) over two laboratory visits. Eleven children aged 3-6 years (mean age=4.8±0.87; 55% girls) completed the activity trials while wearing an ActiGraph GT3X+ accelerometer on the right hip. Activities were categorised into five activity classes: sedentary activities, light activities, moderate to vigorous activities, walking, and running. A standard feed-forward Artificial Neural Network and a Deep Learning Ensemble Network were trained on features in the accelerometer data used in previous investigations (10th, 25th, 50th, 75th and 90th percentiles and the lag-one autocorrelation). Overall recognition accuracy for the standard feed forward Artificial Neural Network was 69.7%. Recognition accuracy for sedentary activities, light activities and games, moderate-to-vigorous activities, walking, and running was 82%, 79%, 64%, 36% and 46%, respectively. In comparison, overall recognition accuracy for the Deep Learning Ensemble Network was 82.6%. For sedentary activities, light activities and games, moderate-to-vigorous activities, walking, and running recognition accuracy was 84%, 91%, 79%, 73% and 73%, respectively. Ensemble machine learning approaches such as Deep Learning Ensemble Network can accurately predict activity type from accelerometer data in preschool children. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
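The feature set named above (the 10th to 90th percentiles plus the lag-one autocorrelation) feeding a feed-forward network can be sketched as follows. The low- and high-variance windows below are synthetic stand-ins for real GT3X+ accelerometer data, and the two-class setup is a simplification of the study's five activity classes:

```python
# Sketch of percentile + lag-one-autocorrelation features feeding a
# feed-forward neural network, assuming scikit-learn. Data is synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def features(window):
    """10th/25th/50th/75th/90th percentiles and lag-one autocorrelation."""
    p = np.percentile(window, [10, 25, 50, 75, 90])
    lag1 = np.corrcoef(window[:-1], window[1:])[0, 1]
    return np.append(p, lag1)

# Two synthetic "activity classes": low-variance (sedentary-like) and
# high-variance (vigorous-like) acceleration windows
X, y = [], []
for label, scale in [(0, 0.1), (1, 1.0)]:
    for _ in range(100):
        X.append(features(rng.normal(0.0, scale, size=200)))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print("training accuracy: %.2f" % clf.score(X, y))
```

The percentile spread separates the two synthetic classes almost perfectly; the study's harder problem is distinguishing classes (walking vs. running) whose feature distributions overlap, which is where the deep ensemble gained its accuracy.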

  20. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    Energy Technology Data Exchange (ETDEWEB)

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh; Wang, Shaobu; Mackey, Patrick S.; Hines, Paul; Huang, Zhenyu

    2013-11-30

    Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.
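    As one illustration of the spectral techniques the abstract refers to, here is a minimal two-way spectral partition of a small graph via the Fiedler vector of the graph Laplacian. The graph and weights are toy stand-ins, not actual grid admittance data:

```python
import numpy as np

def spectral_bipartition(adjacency):
    """Split a graph into two clusters using the Fiedler vector of the
    graph Laplacian. `adjacency` is a symmetric weight matrix
    (e.g. line admittances in a power-grid application)."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency
    eigvals, eigvecs = np.linalg.eigh(laplacian)   # ascending eigenvalues
    fiedler = eigvecs[:, 1]            # eigenvector of 2nd-smallest eigenvalue
    return (fiedler >= 0).astype(int)  # sign pattern gives the 2-way cut

# Two triangles joined by a single weak tie: nodes 0-2 vs 3-5.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
labels = spectral_bipartition(A)
print(labels)
```

    For k clusters one would instead embed the nodes in the first k eigenvectors and run k-means, which is the standard spectral clustering pipeline.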

  1. Melodic pattern discovery by structural analysis via wavelets and clustering techniques

    DEFF Research Database (Denmark)

    Velarde, Gissel; Meredith, David

    We present an automatic method to support melodic pattern discovery by structural analysis of symbolic representations by means of wavelet analysis and clustering techniques. In previous work, we used the method to recognize the parent works of melodic segments, or to classify tunes into tune...... to support human or computer assisted music analysis and teaching....

  2. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    Science.gov (United States)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who already have mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived of at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly-learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at
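    Of the site-characterization methods taught at the Institute, HVSR is the most compact to illustrate. Below is a hedged sketch of the basic ratio computation on synthetic data; the windowing and combination choices are ours, not the Institute's procedure:

```python
import numpy as np

def hvsr(north, east, vertical, fs):
    """Horizontal-to-Vertical Spectral Ratio: amplitude spectrum of the
    combined horizontal components divided by the vertical's."""
    spec = lambda x: np.abs(np.fft.rfft(x * np.hanning(len(x))))
    h = np.sqrt(spec(north) ** 2 + spec(east) ** 2)  # combined horizontal
    v = spec(vertical)
    freqs = np.fft.rfftfreq(len(vertical), d=1.0 / fs)
    return freqs, h / np.maximum(v, 1e-12)  # guard against zero division

# Synthetic 10 s record at 100 Hz with a 2 Hz horizontal site resonance.
fs, t = 100, np.arange(0, 10, 0.01)
rng = np.random.default_rng(3)
n = np.sin(2 * np.pi * 2 * t) + 0.1 * rng.normal(size=t.size)
e = np.sin(2 * np.pi * 2 * t) + 0.1 * rng.normal(size=t.size)
z = 0.1 * rng.normal(size=t.size)
freqs, ratio = hvsr(n, e, z, fs)
idx = 20  # the 2 Hz bin (0.1 Hz resolution over a 10 s record)
print(round(freqs[idx], 6), ratio[idx] > 10)  # 2.0 True
```

    In practice the spectra are averaged over many windows and smoothed before taking the ratio; the peak frequency estimates the site's fundamental resonance.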

  3. Performance Evaluation a Developed Energy Harvesting Interface Circuit in Active Technique

    Directory of Open Access Journals (Sweden)

    Ramizi Mohamed

    2014-10-01

    Full Text Available This study presents the performance evaluation of a developed energy harvesting interface circuit in the active technique. An energy harvesting interface circuit for micro-power applications, driven by the equivalent voltage of piezoelectric materials, has been developed and simulated. Circuit designs and simulation results are presented for a conventional diode rectifier with a voltage doubler in the passive technique. Most existing techniques are passive-based energy harvesting circuits, and the power harvesting capability of the passive technique is generally very low. To increase the harvested energy, the active technique and components such as MOSFETs, thyristors and transistors were chosen for the proposed energy harvesting interface circuit. In this study, both the conventional passive circuit and the developed active-technique energy harvester were simulated. The developed interface circuit consists of a piezoelectric element with a vibration input source, an AC-DC thyristor doubler rectifier circuit and a DC-DC boost converter using a thyristor with a storage device. In the developed circuit, thyristors were chosen instead of the diodes used in conventional circuits, because the diode forward voltage (0.7 V) is higher than the incoming input voltage (0.2 V). Finally, the complete energy harvester was designed and simulated using PSPICE software. The proposed circuit generates a boosted DC voltage of up to 2 V. The overall efficiency of the developed circuit is 70% in the software simulation, compared with 20% for the conventional circuit. It is concluded that the developed circuit's output voltage can be used to power autonomous devices.

  4. NOS/NGS activities to support development of radio interferometric surveying techniques

    Science.gov (United States)

    Carter, W. E.; Dracup, J. F.; Hothem, L. D.; Robertson, D. S.; Strange, W. E.

    1980-01-01

    National Geodetic Survey activities towards the development of operational geodetic survey systems based on radio interferometry are reviewed. Information about the field procedures, data reduction and analysis, and the results obtained to date is presented.

  5. "LEMPEL-ZIV-WELCH & HUFFMAN" - THE LOSSLESS COMPRESSION TECHNIQUES (IMPLEMENTATION ANALYSIS AND COMPARISON THEREOF)

    OpenAIRE

    Kapil Kapoor; Abhay Sharma

    2016-01-01

    This paper is about the Implementation Analysis and Comparison of Lossless Compression Techniques viz. Lempel-Ziv-Welch and Huffman. LZW technique assigns fixed length code words. It requires no prior information about the probability of occurrence of symbols to be encoded. Basic idea in Huffman technique is that different gray levels occur with different probability (non-uniform histogram). It uses shorter code words for the more common gray levels and longer code words for the l...
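    The LZW behaviour the abstract describes (fixed-format dictionary codes, no prior symbol statistics) is captured by the textbook encoder below. This is a minimal sketch, not the authors' implementation:

```python
def lzw_compress(data: str):
    """Textbook LZW: emits dictionary codes and needs no prior
    knowledge of symbol probabilities; the dictionary is built
    on the fly as the input is scanned."""
    table = {chr(i): i for i in range(256)}   # single-byte seed entries
    w, out = "", []
    for c in data:
        wc = w + c
        if wc in table:
            w = wc                  # extend the current match
        else:
            out.append(table[w])    # emit code for the longest match
            table[wc] = len(table)  # grow the dictionary on the fly
            w = c
    if w:
        out.append(table[w])
    return out

codes = lzw_compress("TOBEORNOTTOBEORTOBEORNOT")
print(len(codes))  # 16
```

    The 24-symbol input compresses to 16 codes because repeated phrases ("TO", "BE", "OR", "TOB", ...) are replaced by single dictionary indices.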

  6. Radiopacity of Esthetic Post Materials: Evaluation with Digital Analysis Technique.

    Science.gov (United States)

    Kaval, Mehmet Emin; Akin, Hakan; Guneri, Pelin

    2017-07-01

    To evaluate the radiopacity of five post materials using a digital image analysis method. Twelve specimens from each post type (two zirconia and three fiber-based) of 2 mm in thickness were obtained using a diamond blade mounted on a cutting machine, and digital radiographs were taken along with aluminum step-wedge and dentin discs under standard exposure conditions. The mean gray-values of specimens were measured using a computer graphics program. Data were analyzed using one-way ANOVA followed by Holm-Sidak multicomparison test (p = 0.05). The highest radiopacity was observed in custom zirconia (5.842 millimeters of equivalent Al [mmAl]), and the lowest value was detected with FRC-Postec (Ivoclar Vivadent) (1.716 mmAl). Significant differences were revealed between the radiopacity values among all groups (p < 0.05), except between two of the post materials (p = 0.56). All tested post materials had higher radiopacity than dentin. Further studies will be required to clarify optimum radiopacity properties of the post materials to provide a precise clinical observation. © 2015 by the American College of Prosthodontists.
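    The mmAl values reported here come from calibrating specimen gray values against the aluminum step wedge imaged alongside them. A hedged sketch of that interpolation step, with hypothetical calibration numbers:

```python
import numpy as np

def mm_al_equivalent(gray, step_mm, step_gray):
    """Convert a specimen's mean gray value to millimetres of
    equivalent aluminium by interpolating along the step-wedge
    calibration curve (step thickness vs. mean gray value)."""
    step_gray = np.asarray(step_gray, dtype=float)
    step_mm = np.asarray(step_mm, dtype=float)
    order = np.argsort(step_gray)  # np.interp requires increasing x
    return float(np.interp(gray, step_gray[order], step_mm[order]))

# Hypothetical calibration: 1-5 mm Al steps and their mean gray values.
steps_mm = [1, 2, 3, 4, 5]
steps_gray = [60, 95, 125, 150, 170]
print(mm_al_equivalent(110.0, steps_mm, steps_gray))  # 2.5
```

    A specimen whose mean gray value falls midway between the 2 mm and 3 mm steps is thus reported as 2.5 mmAl.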

  7. Financial planning and analysis techniques of mining firms: a note on Canadian practice

    Energy Technology Data Exchange (ETDEWEB)

    Blanco, H.; Zanibbi, L.R. (Laurentian University, Sudbury, ON (Canada). School of Commerce and Administration)

    1992-06-01

    This paper reports on the results of a survey of the financial planning and analysis techniques in use in the mining industry in Canada. The study was undertaken to determine the current status of these practices within mining firms in Canada and to investigate the extent to which the techniques are grouped together within individual firms. In addition, tests were performed on the relationship between these groups of techniques and both organizational size and price volatility of end product. The results show that a few techniques are widely utilized in this industry but that the techniques used most frequently are not as sophisticated as reported in previous, more broadly based surveys. The results also show that firms tend to use 'bundles' of techniques and that the relative use of some of these groups of techniques is weakly associated with both organizational size and type of end product. 19 refs., 7 tabs.

  8. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    Science.gov (United States)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  9. Multifractal detrended fluctuation analysis of human EEG: preliminary investigation and comparison with the wavelet transform modulus maxima technique.

    Directory of Open Access Journals (Sweden)

    Todd Zorick

    Full Text Available Recently, many lines of investigation in neuroscience and statistical physics have converged to raise the hypothesis that the underlying pattern of neuronal activation which results in electroencephalography (EEG) signals is nonlinear, with self-affine dynamics, while scalp-recorded EEG signals themselves are nonstationary. Therefore, traditional methods of EEG analysis may miss many properties inherent in such signals. Similarly, fractal analysis of EEG signals has shown scaling behaviors that may not be consistent with pure monofractal processes. In this study, we hypothesized that scalp-recorded human EEG signals may be better modeled as an underlying multifractal process. We utilized the publicly available PhysioNet online database of human EEG signals as a standardized reference for this study. Herein, we report the use of multifractal detrended fluctuation analysis on human EEG signals derived from waking and different sleep stages, and show evidence that supports the use of multifractal methods. Next, we compare multifractal detrended fluctuation analysis to a previously published multifractal technique, wavelet transform modulus maxima, using EEG signals from waking and sleep, and demonstrate that multifractal detrended fluctuation analysis has lower indices of variability. Finally, we report a preliminary investigation into the use of multifractal detrended fluctuation analysis as a pattern classification technique on human EEG signals from waking and different sleep stages, and demonstrate its potential utility for automatic classification of different states of consciousness. Therefore, multifractal detrended fluctuation analysis may be a useful pattern classification technique to distinguish among different states of brain function.
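    The detrended fluctuation function at the core of the method can be sketched compactly. Below is a minimal monofractal version; the multifractal extension (MFDFA) raises the per-window variances to different powers q before averaging:

```python
import numpy as np

def dfa_fluctuation(x, scale):
    """Detrended fluctuation F(n) at one window size `scale`:
    integrate the signal, detrend each non-overlapping window with a
    local linear fit, and average the residual variances."""
    profile = np.cumsum(x - np.mean(x))        # integrated signal
    n_windows = len(profile) // scale
    t = np.arange(scale)
    var = []
    for k in range(n_windows):
        seg = profile[k * scale:(k + 1) * scale]
        coef = np.polyfit(t, seg, 1)           # local linear trend
        var.append(np.mean((seg - np.polyval(coef, t)) ** 2))
    return np.sqrt(np.mean(var))

rng = np.random.default_rng(1)
noise = rng.normal(size=4096)
# White noise integrates to a random walk, so F(n) grows roughly as n^0.5.
print(dfa_fluctuation(noise, 16) < dfa_fluctuation(noise, 256))  # True
```

    The scaling exponent is the slope of log F(n) against log n; in MFDFA one exponent is estimated per value of q, and their spread indicates multifractality.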

  10. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1978-July 31, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Finston, H. L.; Williams, E. T.

    1979-07-01

    Studies of homogeneous liquid-liquid extraction have been extended to include (1) a detailed determination of the phase diagram of the system propylene carbonate-water, (2) the extraction of a large variety of both monodentate and bidentate iron complexes, (3) the solvent extraction characteristics of analogues of propylene carbonate, (4) the behavior under pressure of the propylene carbonate-water system, and (5) the extraction behavior of alkaline earth-TTA chelates. One consequence of these studies was the observation that the addition of ethanol to propylene carbonate-water or to isobutylene carbonate-water yields a single homogeneous phase. Subsequent evaporation of the ethanol restores the two immiscible phases. Fast neutron activation analysis has been attempted for the heavy elements Pb, Bi, Tl at the Brookhaven HFBR (in- or near-core position) and at the Brookhaven CLIF facility. The latter appears more promising and we have initiated a collaborative program to use the CLIF facility. A milking system which can provide ca. 16 μCi of carrier-free ²¹²Pb was developed for use in an isotope dilution technique for lead. Collaboration with laboratories already determining trace lead by flameless atomic absorption, or by concentration by electrodeposition into a hanging drop followed by anodic stripping, will be proposed. The proton X-ray emission system has undergone marked improvement with the acquisition of a new high-resolution Si(Li) detector and a new multi-channel analyzer system. Various techniques have been explored to dissolve and prepare samples for PIXE analysis and also for verification by atomic absorption analysis.

  11. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few techniques which are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  12. An Analysis on the Discourse Cohesion Techniques in Children's English Books

    Institute of Scientific and Technical Information of China (English)

    罗春燕

    2014-01-01

    Discourse cohesion technique analysis attracts much attention both at home and abroad, and many scholars have conducted research in this field; however, few of them focus on children's English books, which have their own characteristics and cohesion techniques and deserve research.

  13. STUDY ON MODULAR FAULT TREE ANALYSIS TECHNIQUE WITH CUT SETS MATRIX METHOD

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    A new fault tree analysis (FTA) computation method is put forward, using a modularization technique in FTA with a cut sets matrix, which can effectively reduce the NP (nondeterministic polynomial) difficulty. The software runs on an IBM-PC under DOS 3.0 and up. The method provides a theoretical basis and a computation tool for the application of the FTA technique in common engineering systems.
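    The cut-sets computation that FTA tools like this automate reduces, in its simplest first-order form, to summing cut-set probabilities. A minimal sketch of that rare-event approximation, not the paper's modularized matrix algorithm:

```python
def top_event_probability(cut_sets, p):
    """Rare-event (first-order) approximation of the top-event
    probability from minimal cut sets: sum over cut sets of the
    product of component failure probabilities."""
    total = 0.0
    for cs in cut_sets:
        prob = 1.0
        for component in cs:
            prob *= p[component]   # components assumed independent
        total += prob
    return total

# TOP = (A AND B) OR C, with component failure probabilities:
p = {"A": 0.01, "B": 0.02, "C": 0.001}
print(top_event_probability([{"A", "B"}, {"C"}], p))  # ≈ 0.0012
```

    The approximation overestimates slightly because overlapping cut sets are double-counted; exact evaluation uses inclusion-exclusion, which is where the NP difficulty the abstract mentions comes from.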

  14. Effective self-regulation change techniques to promote mental wellbeing among adolescents: a meta-analysis

    NARCIS (Netherlands)

    Genugten, L. van; Dusseldorp, E.; Massey, E.K.; Empelen, P. van

    2017-01-01

    Mental wellbeing is influenced by self-regulation processes. However, little is known about the efficacy of change techniques based on self-regulation to promote mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary

  15. Analysis of computational modeling techniques for complete rotorcraft configurations

    Science.gov (United States)

    O'Brien, David M., Jr.

    Computational fluid dynamics (CFD) provides the helicopter designer with a powerful tool for identifying problematic aerodynamics. Through the use of CFD, design concepts can be analyzed in a virtual wind tunnel long before a physical model is ever created. Traditional CFD analysis tends to be a time-consuming process, where much of the effort is spent generating a high-quality computational grid. Recent increases in computing power and memory have created renewed interest in alternative grid schemes such as unstructured grids, which facilitate rapid grid generation by relaxing restrictions on grid structure. Three rotor models have been incorporated into a popular fixed-wing unstructured CFD solver to increase its capability and facilitate availability to the rotorcraft community. The benefit of unstructured grid methods is demonstrated through rapid generation of high fidelity configuration models. The simplest rotor model is the steady state actuator disk approximation. By transforming the unsteady rotor problem into a steady state one, the actuator disk can provide rapid predictions of performance parameters such as lift and drag. The actuator blade and overset blade models provide a depiction of the unsteady rotor wake, but incur a larger computational cost than the actuator disk. The actuator blade model is convenient when the unsteady aerodynamic behavior needs to be investigated, but the computational cost of the overset approach is too large. The overset or chimera method allows the blade loads to be computed from first principles and therefore provides the most accurate prediction of the rotor wake for the models investigated. The physics of the flow fields generated by these models for rotor/fuselage interactions are explored, along with efficiencies and limitations of each method.

  16. Analysis and interpretation of dynamic FDG PET oncological studies using data reduction techniques

    Directory of Open Access Journals (Sweden)

    Santos Andres

    2007-10-01

    Full Text Available Background: Dynamic positron emission tomography studies produce a large amount of image data, from which clinically useful parametric information can be extracted using tracer kinetic methods. Data reduction methods can facilitate the initial interpretation and visual analysis of these large image sequences and at the same time can preserve important information and allow for basic feature characterization. Methods: We have applied principal component analysis to provide high-contrast parametric image sets of lower dimensions than the original data set, separating structures based on their kinetic characteristics. Our method has the potential to constitute an alternative quantification method, independent of any kinetic model, and is particularly useful when the retrieval of the arterial input function is complicated. In independent component analysis images, structures that have different kinetic characteristics are assigned opposite values, and are readily discriminated. Furthermore, novel similarity mapping techniques are proposed, which can summarize in a single image the temporal properties of the entire image sequence according to a reference region. Results: Using our new cubed sum coefficient similarity measure, we have shown that structures with similar time activity curves can be identified, thus facilitating the detection of lesions that are not easily discriminated using the conventional method employing standardized uptake values.
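    The principal-component reduction described in Methods amounts to a singular value decomposition of the mean-centered frame matrix. A minimal sketch on synthetic data; the function name and dimensions are illustrative:

```python
import numpy as np

def pca_images(frames, n_components=2):
    """Project a dynamic image sequence onto its principal components.
    `frames` has shape (n_frames, n_voxels); each returned row is one
    parametric 'image' separating structures by kinetic behaviour."""
    centered = frames - frames.mean(axis=0)    # remove the mean kinetics
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:n_components]                   # spatial eigen-images

# Toy sequence: 20 frames x 100 voxels, one kinetic pattern plus noise.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 20)
frames = np.outer(t, rng.random(100)) + 0.01 * rng.normal(size=(20, 100))
print(pca_images(frames).shape)  # (2, 100)
```

    Because no arterial input function enters this computation, the decomposition works even when that curve is hard to recover, which is the advantage the abstract emphasizes.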

  17. Refolding techniques for recovering biologically active recombinant proteins from inclusion bodies.

    Science.gov (United States)

    Yamaguchi, Hiroshi; Miyazaki, Masaya

    2014-02-20

    Biologically active proteins are useful for studying the biological functions of genes and for the development of therapeutic drugs and biomaterials in the biotechnology industry. Overexpression of recombinant proteins in bacteria, such as Escherichia coli, often results in the formation of inclusion bodies, which are protein aggregates with non-native conformations. As inclusion bodies contain relatively pure and intact proteins, protein refolding is an important process for obtaining active recombinant proteins from inclusion bodies. However, conventional refolding methods, such as dialysis and dilution, are time-consuming, recovered yields of active proteins are often low, and a trial-and-error process is required to achieve success. Recently, several approaches have been reported for refolding these aggregated proteins into an active form. The strategies largely aim at reducing protein aggregation during the refolding procedure. This review focuses on protein refolding techniques using chemical additives and laminar flow in microfluidic chips for the efficient recovery of active proteins from inclusion bodies.

  18. Experimental Study of Active Techniques for Blade/Vortex Interaction Noise Reduction

    Science.gov (United States)

    Kobiki, Noboru; Murashige, Atsushi; Tsuchihashi, Akihiko; Yamakawa, Eiichi

    This paper presents the experimental results of the effect of Higher Harmonic Control (HHC) and Active Flap on Blade/Vortex Interaction (BVI) noise. Wind tunnel tests were performed with a 1-bladed rotor system to evaluate the simplified BVI phenomenon while avoiding the complicated aerodynamic interference that is characteristically and inevitably caused by a multi-bladed rotor. Another merit of the 1-bladed rotor system is that several active techniques can be evaluated under the same conditions when installed in the same rotor system. The effects of the active techniques on BVI noise reduction were evaluated comprehensively from the sound pressure, the blade/vortex miss distance obtained by Laser Light Sheet (LLS), the blade surface pressure distribution, and the tip vortex structure obtained by Particle Image Velocimetry (PIV). Good correlation is obtained among these quantities in describing the effect of the active techniques on the BVI conditions. The experiments show that the blade/vortex miss distance is more dominant for BVI noise than the other two BVI-governing factors, blade lift and vortex strength at the moment of BVI.

  19. Diffusion of point defects in crystalline silicon using the kinetic activation-relaxation technique method

    Science.gov (United States)

    Trochet, Mickaël; Béland, Laurent Karim; Joly, Jean-François; Brommer, Peter; Mousseau, Normand

    2015-06-01

    We study point-defect diffusion in crystalline silicon using the kinetic activation-relaxation technique (k-ART), an off-lattice kinetic Monte Carlo method with on-the-fly catalog building capabilities based on the activation-relaxation technique (ART nouveau), coupled to the standard Stillinger-Weber potential. We focus more particularly on the evolution of crystalline cells with one to four vacancies and one to four interstitials in order to provide a detailed picture of both the atomistic diffusion mechanisms and overall kinetics. We show formation energies, activation barriers for the ground state of all eight systems, and migration barriers for those systems that diffuse. Additionally, we characterize diffusion paths and special configurations such as dumbbell complex, di-interstitial (IV-pair+2I) superdiffuser, tetrahedral vacancy complex, and more. This study points to an unsuspected dynamical richness even for this apparently simple system that can only be uncovered by exhaustive and systematic approaches such as the kinetic activation-relaxation technique.
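    The kinetic Monte Carlo machinery underlying k-ART can be illustrated by a single residence-time (BKL) step: harmonic transition-state rates from activation barriers, one event drawn in proportion to its rate, and the clock advanced by an exponentially distributed waiting time. A hedged sketch with made-up barriers, not the paper's on-the-fly catalog:

```python
import math
import random

def kmc_step(barriers, kT, nu0=1e13, rng=random.Random(4)):
    """One residence-time KMC step: rates from activation barriers
    (harmonic TST, attempt frequency nu0), event chosen in proportion
    to its rate, time advanced stochastically."""
    rates = [nu0 * math.exp(-b / kT) for b in barriers]
    total = sum(rates)
    pick, acc = rng.random() * total, 0.0
    event = len(rates) - 1            # fallback against rounding
    for i, r in enumerate(rates):
        acc += r
        if acc >= pick:
            event = i
            break
    dt = -math.log(rng.random()) / total   # exponential waiting time
    return event, dt

# Hypothetical barriers (eV) from a defect-event catalog at kT = 0.05 eV.
event, dt = kmc_step([0.5, 0.6, 1.0], kT=0.05)
print(event, dt)
```

    The off-lattice ingredient of k-ART is that the barrier catalog itself is built on the fly with ART nouveau searches rather than fixed in advance.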

  20. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    Energy Technology Data Exchange (ETDEWEB)

    Stankunas, Gediminas, E-mail: gediminas.stankunas@lei.lt [Lithuanian Energy Institute, Laboratory of Nuclear Installation Safety, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Batistoni, Paola [ENEA, Via E. Fermi, 45, 00044 Frascati, Rome (Italy); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Sjöstrand, Henrik; Conroy, Sean [Department of Physics and Astronomy, Uppsala University, PO Box 516, SE-75120 Uppsala (Sweden); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-07-11

    The neutron activation technique is routinely used in fusion experiments to measure the neutron yields. This paper investigates the uncertainty on these measurements due to the uncertainties on dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, both for DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well, using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of the fusion neutron yields.
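    The groupwise reaction-rate computation the paper describes can be sketched as follows. This is a simplification that assumes uncorrelated group uncertainties, whereas the paper propagates the full IRDFF covariance data; all numbers are illustrative:

```python
import numpy as np

def reaction_rate_with_uncertainty(flux, xs, xs_rel_unc):
    """Groupwise activation reaction rate R = sum_g phi_g * sigma_g and
    its standard deviation from uncorrelated relative cross-section
    uncertainties (the full treatment uses the covariance matrix)."""
    flux, xs, rel = map(np.asarray, (flux, xs, xs_rel_unc))
    rate = np.sum(flux * xs)
    # Each group's absolute uncertainty contribution: phi_g * sigma_g * rel_g
    sigma = np.sqrt(np.sum((flux * xs * rel) ** 2))
    return rate, sigma

# Toy three-group spectrum (n/cm^2/s), cross-sections, and 5-10%
# relative uncertainties; units are illustrative only.
rate, sigma = reaction_rate_with_uncertainty(
    [1e10, 5e9, 1e9], [0.1, 0.5, 2.0], [0.05, 0.05, 0.10])
print(rate, sigma)
```

    With correlated groups, the off-diagonal covariance terms can either inflate or partially cancel this uncertainty, which is why the dosimetry files ship covariance matrices rather than plain error bars.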