WorldWideScience

Sample records for activation analysis techniques

  1. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.
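To illustrate the kind of linear program the abstract refers to, here is a minimal scheduling sketch. The coefficients, constraints, and scenario (allocating relay-antenna contact hours between two user satellites) are invented for illustration and are not the paper's actual TDRSS model:

```python
from scipy.optimize import linprog

# Toy model: allocate limited antenna contact hours (x1, x2) to two user
# satellites to maximize total data returned. linprog minimizes, so the
# data-rate coefficients are negated.
c = [-3.0, -2.0]           # maximize 3*x1 + 2*x2
A_ub = [[1.0, 1.0]]        # x1 + x2 <= 10 total antenna hours
b_ub = [10.0]
bounds = [(0, 8), (0, 6)]  # per-satellite demand caps

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, -res.fun)     # optimal allocation and total data returned
```

With these numbers the solver saturates the higher-value satellite first (8 hours) and gives the remainder (2 hours) to the second.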

  2. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    Science.gov (United States)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. Samples are irradiated in an external neutron beam from a nuclear reactor while the gamma rays produced in the sample by neutron capture are counted simultaneously. Neutron capture leads to excited nuclei that decay immediately to the ground state with the emission of energetic gamma rays. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) it is nondestructive; (2) it can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) it is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.
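The count rate underlying PGAA sensitivity can be sketched with a back-of-the-envelope calculation. The numerical inputs below (flux, efficiency, gamma yield) are illustrative assumptions, not values from the abstract; only the standard activation relation itself is taken as given:

```python
# Prompt-gamma count rate for one characteristic line:
#   R = (m / M) * N_A * sigma * phi * eps * I_gamma
# m: element mass in sample (g), M: molar mass (g/mol), sigma: thermal
# neutron capture cross section, phi: neutron flux at the sample,
# eps: detector efficiency, I_gamma: gamma yield per capture.
N_A = 6.022e23  # Avogadro's number, atoms/mol

def pgaa_count_rate(m_g, M_g_mol, sigma_b, phi, eps, i_gamma):
    sigma_cm2 = sigma_b * 1e-24          # barns -> cm^2
    atoms = (m_g / M_g_mol) * N_A        # number of target atoms
    return atoms * sigma_cm2 * phi * eps * i_gamma

# Example: 1 mg of natural boron (sigma ~767 b, dominated by 10B),
# flux 1e8 n/cm^2/s, 1% detection efficiency, gamma yield 0.93.
rate = pgaa_count_rate(1e-3, 10.8, 767.0, 1e8, 0.01, 0.93)
print(f"{rate:.0f} counts/s")
```

The strong boron signal in this example is why elements with large capture cross sections (B, Cd, Gd, Sm) are PGAA's best trace-element targets.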

  3. Application of thermal analysis techniques in activated carbon production

    Science.gov (United States)

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. Thermogravimetric (TG) techniques developed to characterize the carbon adsorbents included measurement of the kinetics of SO2 adsorption, rapid proximate analyses, and determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. Temperature-programmed desorption (TPD) was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  4. Instrumental Neutron Activation Analysis Technique using Subsecond Radionuclides

    DEFF Research Database (Denmark)

    Nielsen, H.K.; Schmidt, J.O.

    1987-01-01

The fast irradiation facility Mach-1, installed at the Danish DR 3 reactor, has been used for boron determinations by means of Instrumental Neutron Activation Analysis using 12B, which has a 20 ms half-life. The performance characteristics of the system are presented and boron determinations of NBS standard...

  5. Chemical weapons detection by fast neutron activation analysis techniques

    Science.gov (United States)

    Bach, P.; Ma, J. L.; Froment, D.; Jaureguy, J. C.

    1993-06-01

    A neutron diagnostic experimental apparatus has been tested for nondestructive verification of sealed munitions. Designed to potentially satisfy a significant number of van-mobile requirements, this equipment is based on an easy to use industrial sealed tube neutron generator that interrogates the munitions of interest with 14 MeV neutrons. Gamma ray spectra are detected with a high purity germanium detector, especially shielded from neutrons and gamma ray background. A mobile shell holder has been used. Possible configurations allow the detection, in continuous or in pulsed modes, of gamma rays from neutron inelastic scattering, from thermal neutron capture, and from fast or thermal neutron activation. Tests on full scale sealed munitions with chemical simulants show that those with chlorine (old generation materials) are detectable in a few minutes, and those including phosphorus (new generation materials) in nearly the same time.

  6. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

Highlights: • Quantitative image analysis shows potential to monitor activated sludge systems. • Staining techniques increase the potential for detection of operational problems. • Chemometrics combined with quantitative image analysis is valuable for process monitoring. Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and the proliferation of specific microorganisms. In fact, identification of bacterial communities and protozoa by microscopic inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used over the years for the assessment of the properties of aggregates and filamentous bacteria. These procedures provide an ever-growing amount of data for wastewater treatment processes, for which chemometric techniques can be a valuable tool. However, the determination of microbial community properties remains a challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed, highlighting the determination of aggregate structure and filamentous bacteria by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  7. FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques

    Science.gov (United States)

    Madavarapu, Sateesh Babu

The study of aluminosilicate materials to replace traditional construction materials such as ordinary Portland cement (OPC), and so reduce the environmental effects of their production, has been an important research area for the past decades. Many properties, such as strength, have already been studied, and the primary focus is now on the reaction mechanism and the effect of process parameters on the formed products. The aim of this research was to explore the structural changes and reaction products of geopolymers (slag and fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at a molecular level, but not all methods are economical and simple. To understand the mechanisms of alkali-activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR was used to analyze the effect of process parameters on the reaction products. For complex systems like geopolymers, deconvolution techniques help to obtain the properties of a particular peak attributed to a certain molecular vibration. Time- and temperature-dependent analyses were performed on slag pastes to understand the polymerization of reactive silica in the system as time and temperature vary. For the time-dependent analysis, slag was activated with sodium and potassium silicates using two different 'n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature-dependent analysis was done by curing the samples at 60°C and 80°C. Similarly, fly ash was studied by activation with alkali hydroxides and alkali silicates. Under the same curing conditions, the fly ash samples were evaluated to analyze the effects of added silicates on alkali activation. Peak shifts in the FTIR spectra reflect changes in the structural nature of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentrations of silicate monomer in the
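The deconvolution step described in this abstract can be sketched as fitting a band envelope with a sum of Gaussian components. The spectrum below is synthetic (two invented components near typical Si-O-T stretching positions), not data from the thesis:

```python
import numpy as np
from scipy.optimize import curve_fit

# Model an overlapped FTIR band envelope as the sum of two Gaussians so
# that each component's center (and hence its molecular assignment) can
# be recovered.
def two_gaussians(x, a1, c1, w1, a2, c2, w2):
    g = lambda a, c, w: a * np.exp(-((x - c) ** 2) / (2 * w ** 2))
    return g(a1, c1, w1) + g(a2, c2, w2)

wavenumber = np.linspace(850, 1150, 300)          # cm^-1
truth = (1.0, 950.0, 25.0, 0.6, 1020.0, 30.0)      # synthetic components
envelope = two_gaussians(wavenumber, *truth)

# Fit from a rough initial guess; on this noiseless envelope the
# component centers are recovered almost exactly.
popt, _ = curve_fit(two_gaussians, wavenumber, envelope,
                    p0=(0.8, 940, 20, 0.5, 1030, 25))
print(popt)
```

In practice a measured spectrum would carry noise and a baseline, and the number of components is chosen from second-derivative analysis or prior band assignments.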

  8. Figure analysis: A teaching technique to promote visual literacy and active Learning.

    Science.gov (United States)

    Wiles, Amy M

    2016-07-01

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is not a technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016.

  9. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

This report describes and illustrates several modelling techniques proposed by Communication Analysis, namely the Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  10. Using ACTH Challenges to Validate Techniques for Adrenocortical Activity Analysis in Various African Wildlife Species

    Directory of Open Access Journals (Sweden)

    Diana M. Armstrong

    2012-04-01

Monitoring adrenocortical activity using fecal hormone analysis can provide information on how environmental changes affect the health and success of non-domestic species in the field; however, this noninvasive method needs proper validation to ensure that the analysis reflects true physiological events. Our objectives were to use adrenocorticotropic hormone (ACTH) challenges as a physiological validation method to test the suitability of a new corticosterone enzyme immunoassay (EIA) for accurately assessing adrenocortical activity from fecal samples in four African wildlife species: the black rhinoceros (rhino; Diceros bicornis), African elephant (Loxodonta africana), chimpanzee (Pan troglodytes) and African lion (Panthera leo krugeri). In the rhino and elephant, fecal glucocorticoid (GC) metabolites surged 75 and 51 h post-ACTH injection, respectively. In the chimpanzee, fecal GC metabolites peaked at 29 h post-injection, and in the lion at 24 h post-ACTH. This study determined that adrenocortical activity was reflected in concentrations of fecal GC metabolites, suggesting that this corticosterone EIA is an effective technique for monitoring stress in these four African species.

  11. Activity analysis: measurement of the effectiveness of surgical training and operative technique.

    Science.gov (United States)

    Shepherd, J P; Brickley, M

    1992-11-01

    All surgical procedures are characterised by a sequence of steps and instrument changes. Although surgical efficiency and training in operative technique closely relate to this process, few studies have attempted to analyse it quantitatively. Because efficiency is particularly important in day surgery and lower third molar removal is a high-volume procedure, the need for which is responsible for particularly long waiting-lists in almost all UK health regions, this operation was selected for evaluation. A series of 80 consecutive procedures, carried out for 43 day-stay patients under general anaesthesia by seven junior staff (senior house officers and registrars: 39 procedures) and four senior staff (senior registrars and consultants: 41 procedures) were analysed. Median operating time for procedures which required retraction of periosteum was 9.5 min (range 2.7-23.3 min). Where these steps were necessary, median time for incision was 25 s (range 10-90 s); for retraction of periosteum, 79 s (range 5-340 s); for bone removal, 118 s (range 10-380 s); for tooth excision, 131 s (range 10-900 s); for debridement, 74 s (range 5-270 s); and for suture, 144 s (range 25-320 s). Junior surgeons could be differentiated from senior surgeons on the basis of omission, repetition and duration of these steps. Juniors omitted retraction of periosteum in 10% of procedures (seniors 23%) and suture in 13% (seniors 32%). Juniors repeated steps in 47% of operations; seniors, 14%. Junior surgeons took significantly more time than senior surgeons for incision, bone removal and tooth excision. No significant differences between junior and senior surgeons were found in relation to the incidence of altered lingual and labial sensation at 7 days. It was concluded that activity analysis may be a useful measure of the effectiveness of surgical training and the efficiency of operative technique.
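The summary statistics this study reports (median and range per operative step) are straightforward to reproduce from logged step durations. The durations below are hypothetical examples, not the paper's data:

```python
from statistics import median

# Per-step durations in seconds, logged across several hypothetical
# procedures; real data would also record omissions and repetitions.
step_times = {
    "incision":     [25, 10, 40, 90, 20],
    "bone removal": [118, 60, 200, 380, 10],
    "suture":       [144, 25, 320, 150, 100],
}

for step, times in step_times.items():
    print(f"{step}: median {median(times)} s, "
          f"range {min(times)}-{max(times)} s")
```

Comparing these per-step distributions between junior and senior surgeons (e.g. with a rank-based test, given the skewed ranges) is the essence of the activity analysis described above.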

  12. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of studying the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper carried out a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, performance analysis, value chain analysis and functional analysis. Implications: Basically such

  13. Fluorous-assisted metal chelate affinity extraction technique for analysis of protein kinase activity.

    Science.gov (United States)

    Hayama, Tadashi; Kiyokawa, Ena; Yoshida, Hideyuki; Imakyure, Osamu; Yamaguchi, Masatoshi; Nohta, Hitoshi

    2016-08-15

We have developed a fluorous affinity-based extraction method for measurement of protein kinase activity. In this method, a fluorescent peptide substrate was phosphorylated by a protein kinase, and the obtained phosphopeptide was selectively captured with an Fe(III)-immobilized perfluoroalkyliminodiacetic acid reagent via a metal chelate affinity technique. Next, the captured phosphopeptide was selectively extracted into a fluorous solvent mixture, tetradecafluorohexane and 1H,1H,2H,2H-tridecafluoro-1-n-octanol (3:1, v/v), using the specificity of fluorous affinity (fluorophilicity). In contrast, the remaining substrate peptide in the aqueous (non-fluorous) phase was easily measured fluorimetrically. Finally, the enzyme activity could be assayed by measuring the decrease in fluorescence. The feasibility of this method was demonstrated by applying it to measurement of the activity of cAMP-dependent protein kinase (PKA) using its substrate peptide (kemptide) pre-labeled with carboxytetramethylrhodamine (TAMRA).
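The final readout of this assay reduces to a simple calculation: the fraction of substrate converted equals the fractional drop in aqueous-phase fluorescence. The numbers below are hypothetical, for illustration only:

```python
# Kinase activity from residual fluorescence of the aqueous phase after
# the phosphorylated product has been extracted into the fluorous phase.
def kinase_activity_percent(f_blank, f_sample):
    """Percent of substrate converted.

    f_blank:  fluorescence of a no-enzyme control (full signal retained)
    f_sample: fluorescence after kinase reaction and fluorous extraction
    """
    return 100.0 * (f_blank - f_sample) / f_blank

# A kinase-treated sample losing 35% of the blank's signal implies 35%
# of the substrate was phosphorylated and extracted.
print(kinase_activity_percent(1000.0, 650.0))  # -> 35.0
```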

  14. Normal coordinate analysis and fungicidal activity study on anilazine and its related compound using spectroscopic techniques

    Science.gov (United States)

    Sheeja Mol, Gilbert Pushpam; Arul Dhas, Deva Dhas; Hubert Joe, Isaac; Balachandran, Sreedharan

    2016-06-01

The FTIR and FT-Raman spectra of anilazine have been recorded in the ranges 400-4000 cm-1 and 50-3500 cm-1, respectively. The optimized geometrical parameters of the compound were calculated using the B3LYP method with the 6-311G(d,p) basis set. The assignment of the vibrational bands was carried out with the help of normal coordinate analysis (NCA). The 1H and 13C NMR spectra have been recorded, and the chemical shifts of the molecule were calculated using the gauge-independent atomic orbital (GIAO) method. The UV-Visible spectrum of the compound was recorded in the region 190-900 nm, and the electronic properties were determined by the time-dependent DFT (TD-DFT) approach. Anilazine was screened for its antifungal activity, and molecular docking studies were conducted to predict its fungicidal activity.

  15. Cost Analysis of MRI Services in Iran: An Application of Activity Based Costing Technique

    Directory of Open Access Journals (Sweden)

    Bayati

    2015-09-01

Background: Considerable development of MRI technology in diagnostic imaging, the high cost of MRI technology, and controversial issues concerning official charges (tariffs) have been the main motivations for this study. Objectives: The present study aimed to calculate the unit cost of MRI services using activity-based costing (ABC), a modern cost accounting system, and to compare the calculated unit costs with official charges (tariffs). Materials and Methods: We included both direct and indirect costs of MRI services delivered in fiscal year 2011 in Shiraz Shahid Faghihi hospital. The direct allocation method was used for distribution of overhead costs. We used a micro-costing approach to calculate the unit cost of each MRI service. Clinical cost data were retrieved from the hospital registration system. The straight-line method was used for depreciation cost estimation. To cope with uncertainty and increase the robustness of the results, unit costs of 33 MRI services were calculated under two scenarios. Results: The total annual cost of the MRI activity center (AC) was calculated at USD 400,746 and USD 532,104 under the first and second scenarios, respectively. Ten percent of the total cost was allocated from supportive departments. The annual variable costs of the MRI center were calculated at USD 295,904. Capital costs of USD 104,842 and USD 236,200 resulted from the first and second scenarios, respectively. Existing tariffs for more than half of the MRI services were above the calculated costs. Conclusion: As a public hospital, Shahid Faghihi hospital has considerable limitations in both its financial and administrative databases. Labor cost has the greatest share of the hospital's total annual cost. The gap between unit costs and tariffs implies that the claim for extra budget from health providers may not be relevant for all services delivered by the studied MRI center. With some adjustments, ABC could be
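The mechanics of activity-based costing can be sketched in a few lines: allocate overhead to the activity center, then drive the center's total cost down to each service by its consumption of a cost driver (scanner time here). All figures below are invented for illustration and are not the hospital's data:

```python
# Step 1: build the activity center's total annual cost.
direct_cost = 360_000.0        # USD/yr: labor, materials, depreciation
overhead_allocated = 40_000.0  # USD/yr allocated from support departments
total_cost = direct_cost + overhead_allocated

# Step 2: drive cost to services via scanner-minutes consumed.
services = {
    # service: (scanner minutes per exam, exams per year) -- hypothetical
    "brain MRI": (30, 2000),
    "knee MRI":  (20, 1500),
}
total_minutes = sum(m * n for m, n in services.values())
rate = total_cost / total_minutes                  # USD per scanner-minute
unit_cost = {s: m * rate for s, (m, _) in services.items()}
print(unit_cost)
```

Comparing each entry of `unit_cost` with the official tariff for that service reproduces the tariff-gap analysis reported in the Results.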

  16. Cold neutron prompt gamma activation analysis, a non-destructive technique for hydrogen level assessment in zirconium alloys

    Science.gov (United States)

    Couet, Adrien; Motta, Arthur T.; Comstock, Robert J.; Paul, Rick L.

    2012-06-01

We propose a novel use of a non-destructive technique to quantitatively assess hydrogen concentration in zirconium alloys. The technique, called Cold Neutron Prompt Gamma Activation Analysis (CNPGAA), is based on measuring the prompt gamma rays emitted following the absorption of cold neutrons, and comparing the detection rate of characteristic hydrogen gamma rays to that of gamma rays from matrix atoms. Because the emission is prompt, the measurement has to be performed in close proximity to a neutron source, such as the one at the National Institute of Standards and Technology (NIST) Center for Neutron Research. The determination is shown here to be simple and accurate, matching the results of the usual destructive techniques such as Vacuum Hot Extraction (VHE) with a precision of ±2 mg kg-1 (wt ppm). Very low levels of hydrogen (as low as 5 mg kg-1 (wt ppm)) can be detected. It is also demonstrated that CNPGAA can be applied sequentially to an individual corrosion coupon during autoclave testing, to measure a gradually increasing hydrogen concentration. Thus, this technique can replace destructive techniques performed on "sister" samples, thereby reducing experimental uncertainties.
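The ratio method at the heart of CNPGAA can be sketched in a few lines. The calibration constant and count rates below are hypothetical stand-ins, not values from the paper; the point is that taking the H-to-matrix ratio cancels neutron flux and sample-mass effects:

```python
# Hydrogen content from the ratio of the hydrogen prompt-gamma count
# rate to that of the zirconium matrix line.
def hydrogen_ppm(rate_H, rate_Zr, k_cal):
    """k_cal: ppm per unit (H/Zr) count-rate ratio, measured on standards
    of known hydrogen content."""
    return k_cal * rate_H / rate_Zr

# Hypothetical calibration: a 100 ppm standard gave an H/Zr ratio of
# 0.050, so k_cal = 100 / 0.050 = 2000 ppm per unit ratio.
print(hydrogen_ppm(0.012, 1.0, 2000.0))  # -> 24.0 ppm
```

Because both rates are measured simultaneously on the same coupon, the coupon can stay in service between measurements, which is what enables the sequential autoclave measurements described above.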

  17. Techniques for active passivation

    Energy Technology Data Exchange (ETDEWEB)

    Roscioli, Joseph R.; Herndon, Scott C.; Nelson, Jr., David D.

    2016-12-20

    In one embodiment, active (continuous or intermittent) passivation may be employed to prevent interaction of sticky molecules with interfaces inside of an instrument (e.g., an infrared absorption spectrometer) and thereby improve response time. A passivation species may be continuously or intermittently applied to an inlet of the instrument while a sample gas stream is being applied. The passivation species may have a highly polar functional group that strongly binds to either water or polar groups of the interfaces, and once bound presents a non-polar group to the gas phase in order to prevent further binding of polar molecules. The instrument may be actively used to detect the sticky molecules while the passivation species is being applied.

  18. Activity analysis: measurement of the effectiveness of surgical training and operative technique.

    OpenAIRE

    Shepherd, J P; Brickley, M.

    1992-01-01

    All surgical procedures are characterised by a sequence of steps and instrument changes. Although surgical efficiency and training in operative technique closely relate to this process, few studies have attempted to analyse it quantitatively. Because efficiency is particularly important in day surgery and lower third molar removal is a high-volume procedure, the need for which is responsible for particularly long waiting-lists in almost all UK health regions, this operation was selected for e...

  19. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    Science.gov (United States)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-01

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  20. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I. [Sandia National Laboratories, Albuquerque, New Mexico 87185-1086 (United States)

    2016-01-14

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  1. Thermal signature analysis of human face during jogging activity using infrared thermography technique

    Science.gov (United States)

    Budiarti, Putria W.; Kusumawardhani, Apriani; Setijono, Heru

    2016-11-01

Thermal imaging has been widely used for many applications. A thermal camera measures an object's temperature from the infrared radiation it emits, and a thermal image maps that temperature to false color. The human body is one such emitting object, and its infrared radiation varies with the activity being performed. Since jogging is among the most common physical activities, this experiment investigated the thermal signature profile of jogging in the human body, especially the face. The results show a significant temperature increase of 7.5% in the periorbital area, near the eyes and forehead. Graphical temperature distributions show that for all regions (eyes, nose, cheeks, and chin) the pixel area in the 28.5-30.2°C band remains roughly constant, since this is the surrounding temperature. The pixel area in the 30.2-34.7°C band tends to increase, while that in the 34.7-37.1°C band tends to decrease, because pixels at 34.7-37.1°C shift into the 30.2-34.7°C band after the jogging activity, increasing that band's pixel area. The trendline over a 10-minute jogging period also shows increasing temperature. Results vary from person to person owing to individual physiology, such as sweat production during physical activity.
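The band-counting analysis described here amounts to histogramming a thermal frame into the three temperature bands and comparing pixel counts before and after exercise. The frame below is a random synthetic stand-in (no real thermography data), used only to show the mechanics:

```python
import numpy as np

# Synthetic "face" frames: per-pixel temperatures in degC. The post-
# exercise frame is shifted warmer purely for illustration.
rng = np.random.default_rng(0)
before = rng.normal(33.0, 1.5, size=(120, 160))
after = before + 0.8

# The three bands used in the study: 28.5-30.2, 30.2-34.7, 34.7-37.1 degC.
bands = [28.5, 30.2, 34.7, 37.1]
counts_before, _ = np.histogram(before, bins=bands)
counts_after, _ = np.histogram(after, bins=bands)
print(counts_before, counts_after)  # pixel area per band, pre vs post
```

Tracking these per-band pixel counts frame by frame over the 10-minute period yields the trendlines the abstract describes.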

  2. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

One of the most significant steps in building maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were used in the inspection: physical condition of the building system (PC), effect on asset (EA), effect on occupants (EO) and maintenance cost (MC). The building was divided into nine systems, regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, while a structured pro forma was used to quantify the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and then multiplied by the weight of the criteria to derive the final hierarchy. The final ranking indicates that the electrical system was the most critical, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning; the results of this study indicate that it could also be used in prioritizing building systems for maintenance planning.
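The severity-times-weight aggregation described above can be sketched as a weighted sum. The weights and severity scores below are invented for illustration (the paper derives its weights from pairwise comparisons in Expert Choice); only two of the nine systems are shown:

```python
# Criterion weights (would come from pairwise comparison in practice).
weights = {"PC": 0.40, "EA": 0.25, "EO": 0.20, "MC": 0.15}

# Defect severity score (0-1) per criterion for each building system,
# as quantified on the inspection pro forma. Hypothetical values.
severity = {
    "electrical": {"PC": 0.8, "EA": 0.7, "EO": 0.9, "MC": 0.6},
    "ceiling":    {"PC": 0.3, "EA": 0.2, "EO": 0.2, "MC": 0.4},
}

# Risk value = sum over criteria of (weight * severity); highest first.
risk = {system: sum(weights[c] * scores[c] for c in weights)
        for system, scores in severity.items()}
ranked = sorted(risk, key=risk.get, reverse=True)
print(risk, ranked)
```

With these invented scores the electrical system outranks the ceiling system, mirroring the ordering (though not the numbers) reported in the abstract.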

  3. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others who contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment in which analysis is used to guide design decisions. Computer-aided design (CAD) models built with PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage for parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model, carrying all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process: the CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort, and the turnaround time for analysis results must be decreased for analysis to have an impact on overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques, which come down to the way features are created in the part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  4. Application of synchrotron-radiation-based x-ray microprobe techniques for the analysis of recombination activity of metals precipitated at Si/SiGe misfit dislocations

    Energy Technology Data Exchange (ETDEWEB)

    Vyvenko, O F [University of California, LBNL, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Buonassisi, T [University of California, LBNL, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Istratov, A A [University of California, LBNL, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Weber, E R [University of California, LBNL, 1 Cyclotron Rd, Berkeley, CA 94720 (United States); Kittler, M [IHP, Im Technologiepark 25, D-15236 Frankfurt (Oder) (Germany); Seifert, W [IHP, Im Technologiepark 25, D-15236 Frankfurt (Oder) (Germany)

    2002-12-09

    In this study we report the application of synchrotron-radiation-based x-ray microprobe techniques (the x-ray-beam-induced current (XBIC) and x-ray fluorescence (µ-XRF) methods) to the analysis of the recombination activity and spatial distribution of copper and iron in the vicinity of dislocations in silicon/silicon-germanium structures. A combination of these two techniques enables one to study the chemical nature of the defects and impurities and their recombination activity in situ and to map metal clusters with micron-scale resolution. XRF analysis revealed that copper formed clearly distinguishable precipitates along the misfit dislocations. A proportional dependence between the XBIC contrast and the number of copper atoms in the precipitates was established. In hydrogen-passivated iron-contaminated samples we observed clusters of iron precipitates with no recombination activity detectable by the XBIC technique, as well as iron clusters that were not completely passivated.

  5. Improved techniques in data analysis and interpretation of potential fields: examples of application in volcanic and seismically active areas

    Directory of Open Access Journals (Sweden)

    G. Florio

    2002-06-01

    Full Text Available Geopotential data may be interpreted by many different techniques, depending on the nature of the mathematical equations correlating specific unknown ground parameters to the measured data set. Investigation based on the study of gravity and magnetic anomaly fields represents one of the most important geophysical approaches in the earth sciences. It continues to evolve, aimed both at improving known methods and at testing new, reliable techniques. This paper outlines a general framework for several applications of recent techniques in the study of potential-field methods for the earth sciences. Most of them are described here, and significant case histories are shown to illustrate their reliability in seismically active and volcanic areas.

  6. Two non-destructive neutron inspection techniques: prompt gamma-ray activation analysis and cold neutron tomography

    OpenAIRE

    Baechler, Sébastien; Dousse, Jean-Claude; Jolie, Jan

    2005-01-01

    Two non-destructive inspection techniques using cold neutron beams have been developed at the SINQ neutron source of the Paul Scherrer Institute: (1) prompt gamma-ray activation analysis (PGAA) and (2) neutron tomography. PGA (Prompt Gamma-ray Activation) analysis is a nuclear method for determining the concentration of the elements present in a sample. The technique consists of detecting the prompt gamma rays emitted by the sample...

  7. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look, and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  8. Kinetic activation-relaxation technique

    Science.gov (United States)

    Béland, Laurent Karim; Brommer, Peter; El-Mellouhi, Fedwa; Joly, Jean-François; Mousseau, Normand

    2011-10-01

    We present a detailed description of the kinetic activation-relaxation technique (k-ART), an off-lattice, self-learning kinetic Monte Carlo (KMC) algorithm with on-the-fly event search. Combining a topological classification for local environments and event generation with ART nouveau, an efficient unbiased sampling method for finding transition states, k-ART can be applied to complex materials with atoms in off-lattice positions or with elastic deformations that cannot be handled with standard KMC approaches. In addition to presenting the various elements of the algorithm, we demonstrate the general character of k-ART by applying the algorithm to three challenging systems: self-defect annihilation in c-Si (crystalline silicon), self-interstitial diffusion in Fe, and structural relaxation in a-Si (amorphous silicon).
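
    As an illustration of the residence-time bookkeeping underlying any KMC scheme (k-ART included), the sketch below performs one generic KMC step. The rate list is hypothetical, and none of this reproduces k-ART's topological classification or its ART nouveau transition-state searches:

```python
import math
import random

def kmc_step(rates, rng):
    """One residence-time KMC step: choose an event with probability
    proportional to its rate, then advance the clock by an exponentially
    distributed waiting time with mean 1/(total rate)."""
    total = sum(rates)
    r = rng.random() * total
    cum = 0.0
    for event, rate in enumerate(rates):
        cum += rate
        if r < cum:
            break
    dt = -math.log(1.0 - rng.random()) / total  # exponential waiting time
    return event, dt

rng = random.Random(0)
event, dt = kmc_step([2.0, 1.0, 0.5], rng)  # three hypothetical barrier rates
```

    In k-ART the `rates` list is rebuilt on the fly from the catalog of events found for each local topology, which is what lets it handle off-lattice positions and elastic deformations.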

  10. Study of some Ayurvedic Indian medicinal plants for the essential trace elemental contents by instrumental neutron activation analysis and atomic absorption spectroscopy techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lokhande, R.S.; Singare, P.U.; Andhele, M.L. [Dept. of Chemistry, Univ. of Mumbai, Santacruz, Mumbai (India); Acharya, R.; Nair, A.G.C.; Reddy, A.V.R. [Radiochemistry Div., Bhabha Atomic Research Centre, Trombay, Mumbai (India)

    2009-07-01

    Elemental analysis of some medicinal plants used in the Indian Ayurvedic system was performed employing instrumental neutron activation analysis (INAA) and atomic absorption spectroscopy (AAS). The samples were irradiated with thermal neutrons in a nuclear reactor, and the induced activity was counted by gamma-ray spectrometry using an efficiency-calibrated high-resolution high-purity germanium (HPGe) detector. Most of the medicinal plants were found to be rich in one or more of the elements under study. The variation in elemental concentration among samples of the same medicinal plants collected in the summer, winter, and rainy seasons was studied, and the biological effects of these elements on human beings are discussed. (orig.)
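
    In relative INAA, an element's concentration is commonly obtained by comparing the sample's specific peak activity with that of a co-irradiated standard. A minimal sketch of this comparator arithmetic, assuming identical irradiation, decay, and counting conditions (all numbers hypothetical):

```python
def concentration_relative(peak_sample, mass_sample, peak_std, mass_std, conc_std):
    """Comparator (relative) method: c_sample = c_std * (A_s/m_s) / (A_std/m_std)."""
    return conc_std * (peak_sample / mass_sample) / (peak_std / mass_std)

# Hypothetical gamma peak areas (counts), sample masses (g),
# and standard concentration (ug/g):
c = concentration_relative(peak_sample=12000, mass_sample=0.25,
                           peak_std=8000, mass_std=0.20, conc_std=10.0)
```

    In practice the peak areas would also be corrected for decay between counting times, which is omitted here.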

  11. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray. A robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  12. Triangulation of Data Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Lauri, M

    2011-10-01

    Full Text Available In psychology, as in other disciplines, the concepts of validity and reliability are considered essential for an accurate interpretation of results. While in quantitative research the idea is well established, in qualitative research validity and reliability take on a different dimension. Researchers like Miles and Huberman (1994) and Silverman (2000, 2001) have shown how these issues are addressed in qualitative research. In this paper I propose that the same corpus of data, in this case the transcripts of focus group discussions, can be analysed using more than one data analysis technique. I refer to this idea as ‘triangulation of data analysis techniques’ and argue that such triangulation increases the reliability of the results. If the results obtained through a particular data analysis technique, for example thematic analysis, are congruent with the results obtained by analysing the same transcripts using a different technique, for example correspondence analysis, it is reasonable to argue that the analysis and interpretation of the data are valid.

  13. Atmospheric deposition of rare earth elements in Albania studied by the moss biomonitoring technique, neutron activation analysis and GIS technology.

    Science.gov (United States)

    Allajbeu, Sh; Yushin, N S; Qarri, F; Duliu, O G; Lazo, P; Frontasyeva, M V

    2016-07-01

    Rare earth elements (REEs) are typically conservative elements that are scarcely derived from anthropogenic sources. The mobilization of REEs in the environment requires monitoring of these elements in environmental matrices, in which they are present at trace levels. The concentrations of 11 REEs in carpet-forming moss (Hypnum cupressiforme) collected from 44 sampling sites over the whole territory of the country were determined by epithermal neutron activation analysis (ENAA) at the IBR-2 fast pulsed reactor in Dubna. This paper focuses on the REEs (lanthanides) and Sc; Fe, as a typical crustal element, and Th, which showed good correlations with the lanthanides, are also included. Th, Sc, and the REEs had never previously been determined in atmospheric deposition in Albania. Descriptive statistics were used for data treatment with the MINITAB 17 software package. The median values of the elements under investigation were compared with those of neighboring countries such as Bulgaria, Macedonia, Romania, and Serbia, as well as Norway, which was selected as a clean area. Geographical distribution maps of the elements over the sampled territory were constructed using geographic information system (GIS) technology. The geochemical behavior of the REEs in the moss samples was studied using a Sc-La-Th ternary diagram, spider diagrams, and multivariate analysis. It was revealed that the accumulation of REEs in the mosses is associated with wind-blown metal-enriched soil, identified as the main emission source of the elements under investigation.

  14. Proof-of-principle results for identifying the composition of dust particles and volcanic ash samples through the technique of photon activation analysis at the IAC

    Science.gov (United States)

    Mamtimin, Mayir; Cole, Philip L.; Segebade, Christian

    2013-04-01

    Instrumental analytical methods are preferable in studying sub-milligram quantities of airborne particulates collected in dust filters. The multi-step analytical procedure used in treating samples through chemical separation can be quite complicated. Further, due to the minute masses of the airborne particulates collected on filters, such chemical treatment can easily lead to significant levels of contamination. Radio-analytical techniques, and in particular, activation analysis methods offer a far cleaner alternative. Activation methods require minimal sample preparation and provide sufficient sensitivity for detecting the vast majority of the elements throughout the periodic table. In this paper, we will give a general overview of the technique of photon activation analysis. We will show that by activating dust particles with 10- to 30-MeV bremsstrahlung photons, we can ascertain their elemental composition. The samples are embedded in dust-collection filters and are irradiated "as is" by these photons. The radioactivity of the photonuclear reaction products is measured with appropriate spectrometers and the respective analytes are quantified using multi-component calibration materials. We shall provide specific examples of identifying the elemental components of airborne dust particles and volcanic ash by making use of bremsstrahlung photons from an electron linear accelerator at the Idaho Accelerator Center in Pocatello, Idaho.
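
    Activation analyses of this kind rest on the standard activation equation, in which the induced activity grows toward saturation during irradiation and then decays before counting. A hedged one-group sketch (the bremsstrahlung spectrum of a real PAA irradiation would require an energy-integrated cross section; all numerical inputs here are hypothetical):

```python
import math

def induced_activity(n_atoms, cross_section_cm2, flux, half_life_s,
                     t_irr_s, t_decay_s=0.0):
    """A = N * sigma * phi * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_decay),
    the one-group activation equation (activity in decays/s)."""
    lam = math.log(2.0) / half_life_s
    return (n_atoms * cross_section_cm2 * flux
            * (1.0 - math.exp(-lam * t_irr_s))
            * math.exp(-lam * t_decay_s))

# Hypothetical target: after ten half-lives of irradiation the
# saturation factor (1 - 2**-10) is within 0.1% of its limit N*sigma*phi.
a_sat = induced_activity(n_atoms=1e20, cross_section_cm2=1e-24, flux=1e13,
                         half_life_s=3600.0, t_irr_s=10 * 3600.0)
```

    The same bookkeeping, run in reverse from the measured count rate, is how the analyte masses are quantified against the multi-component calibration materials mentioned above.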

  15. Use of the Taguchi method for biomechanical comparison of flexor-tendon-repair techniques to allow immediate active flexion. A new method of analysis and optimization of technique to improve the quality of the repair.

    Science.gov (United States)

    Singer, G; Ebramzadeh, E; Jones, N F; Meals, R

    1998-10-01

    The current trend toward early active flexion after repair of the flexor tendons necessitates a stronger repair than that provided by a modified Kessler technique with use of 4-0 nylon suture. The purpose of the current study was to determine, with use of the Taguchi method of analysis, the strongest and most consistent repair of the flexor tendons. Flexor tendons were obtained from fresh-frozen hands of human cadavera. Eight flexor tendons initially were repaired with the modified Kessler technique with use of 4-0 nylon core suture and 6-0 nylon epitenon suture. A test matrix was used to analyze a total of twenty variables in sixty-four tests. These variables included eight techniques for core-suture repair, four types of core suture, two sizes of core suture, four techniques for suture of the epitenon, and two distances from the repair site for placement of the core suture. After each repair, the specimens were mounted in a servohydraulic mechanical testing machine for tension-testing to failure. The optimum combination of variables was determined, with the Taguchi method, to be an augmented Becker technique with use of 3-0 Mersilene core suture, placed 0.75 centimeter from the cut edge with volar epitenon suture. The four-strand, double modified Kessler technique provided the second strongest repair. Five tendons that had been repaired with use of the optimum combination then were tested and compared with tendons that had been repaired with the standard modified Kessler technique. With the optimum combination of variables, the strength of the repair improved from a mean (and standard deviation) of 17.2 +/- 2.9 to 128 +/- 5.6 newtons, and the stiffness improved from a mean of 4.6 to 16.2 newtons per millimeter.

  16. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
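
    One of the well-known mathematical techniques such a formulation reduces to is ordinary least squares: fitting a runtime model to observed (configuration, runtime) pairs. The sketch below fits a toy two-parameter model on synthetic data; the model form and coefficients are invented for illustration and are not the authors' formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = rng.uniform(1e5, 1e7, size=50)              # problem sizes
p = rng.integers(1, 33, size=50).astype(float)  # core counts
runtime = 0.5 + 2e-6 * n / p                    # synthetic noiseless runtimes

# Design matrix for the model: runtime ~ a + b * (n / p)
A = np.column_stack([np.ones_like(n), n / p])
coef, *_ = np.linalg.lstsq(A, runtime, rcond=None)
# On noiseless data the fit recovers a = 0.5 and b = 2e-6.
```

    Real codes would add terms for the other variables listed above (compiler, MPI configuration, file system load, and so on) and validate the fitted model on held-out runs.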

  17. Influence of elemental concentration in soil on vegetables applying analytical nuclear techniques: k₀-instrumental neutron activation analysis and radiometry

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Maria Angela de B.C. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil). Servico de Reator e Irradiacao]. E-mail: menezes@cdtn.br; Mingote, Raquel Maia [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil). Servico de Quimica e Radioquimica; Silva, Lucilene Guerra e; Pedrosa, Lorena Gomes [Minas Gerais Univ., Belo Horizonte, MG (Brazil). Faculdade de Farmacia

    2005-07-01

    Samples from two vegetable gardens were analysed with the aim of determining elemental concentrations. The vegetables selected for study are grown by the local people for their own use and are present in the daily meal. One of the vegetable gardens is close to a mining operation in a region within the Iron Quadrangle (Quadrilatero Ferrifero), located in the Brazilian state of Minas Gerais; this region is considered one of the richest mineral-bearing regions in the world. The other vegetable garden is far from this region, has no mining activity, and was studied as a comparison site. The assessment was carried out to evaluate the elemental concentrations in soil and vegetables, matrices connected with the food chain, applying k₀-Instrumental Neutron Activation Analysis (k₀-INAA) at the Laboratory for Neutron Activation Analysis. This work, however, reports only the results for thorium, uranium, and the rare earths obtained in samples collected during the dry season, focusing on the influence of these elements on vegetable elemental composition. Results of natural radioactivity, determined by gross alpha and gross beta measurements, are also reported. This study is related to the BRA 11920 project, entitled 'Iron Quadrangle, Brazil: assessment of health impact caused by mining pollutants through chain food applying nuclear and related techniques', one of the research projects co-ordinated by the IAEA (Vienna, Austria). (author)

  18. Chromatographic finger print analysis of anti-inflammatory active extract fractions of aerial parts of Tribulus terrestris by HPTLC technique

    Institute of Scientific and Technical Information of China (English)

    Mona Salih Mohammed; Mohamed Fahad Alajmi; Perwez Alam; Hassan Subki Khalid; Abelkhalig Muddathir Mahmoud; Wadah Jamal Ahmed

    2014-01-01

    Objective: To develop an HPTLC fingerprint profile of anti-inflammatory active extract fractions of Tribulus terrestris (family Zygophyllaceae). Methods: The anti-inflammatory activity of the methanol extract and its fractions (chloroform, ethyl acetate, n-butanol, and aqueous) and of the chloroform extract of Tribulus terrestris (aerial parts) was tested by injecting different groups of rats (6 each) with carrageenan in the hind paw and measuring the edema volume before and 1, 2, and 3 h after carrageenan injection. The control group received saline i.p. The extract treatments were injected i.p. at doses of 200 mg/kg 1 h before carrageenan administration. Indomethacin (30 mg/kg) was used as the standard. HPTLC studies of the active fractions were carried out using a CAMAG HPTLC system equipped with a Linomat IV applicator, TLC scanner 3, Reprostar 3, CAMAG ADC 2, and WIN CATS-4 software. Results: The methanol extract showed a good antiedematous effect, with a percentage inhibition of more than 72%, indicating its ability to inhibit the inflammatory mediators. The methanol extract was re-dissolved in 100 mL of distilled water and fractionated with chloroform, ethyl acetate, and n-butanol. The four fractions (chloroform, ethyl acetate, n-butanol, and aqueous) were subjected to anti-inflammatory testing. The chloroform fraction showed good anti-inflammatory activity at a dose of 200 mg/kg. It was then subjected to normal-phase silica gel column chromatography and eluted with petroleum ether-chloroform and chloroform-ethyl acetate mixtures of increasing polarity, which produced 15 fractions (F1-F15). Only fractions F1, F2, F4, F5, F7, F9, F11, and F14 were found to be active; hence these were analyzed with HPTLC to develop their fingerprint profile. These fractions showed different spots with different Rf values. Conclusions: The chloroform fractions F1, F2, F4, F5, F7, F9, F11, and F14 revealed 4, 7, 7, 8, 9, 7, 7, and 6 major spots, respectively. The

  19. BIOMECHANICS AND HISTOLOGICAL ANALYSIS IN RABBIT FLEXOR TENDONS REPAIRED USING THREE SUTURE TECHNIQUES (FOUR AND SIX STRANDS) WITH EARLY ACTIVE MOBILIZATION

    Science.gov (United States)

    Severo, Antônio Lourenço; Arenhart, Rodrigo; Silveira, Daniela; Ávila, Aluísio Otávio Vargas; Berral, Francisco José; Lemos, Marcelo Barreto; Piluski, Paulo César Faiad; Lech, Osvandré Luís Canfield; Fukushima, Walter Yoshinori

    2015-01-01

    Objective: To analyze suture time, biomechanics (deformity between the stumps), and histology in three groups of tendinous surgical repair: Brazil-2 (4 strands), in which the end (core) knot is located outside the tendon, and Indiana (4 strands) and Tsai (6 strands), suture techniques in which the end (core) knot lies inside the tendon, all associated with early active mobilization. Methods: The right calcaneal tendons (plantar flexors of the hind paw) of 36 rabbits of the New Zealand breed (Oryctolagus cuniculus) were used in the analysis. These tendons are similar in size to the human flexor tendon, which measures approximately 4.5 mm (varying from 2 mm). The selected animals had the same mass (2.5 to 3 kg) and were male or female adults (from 8 ½ months). Sterile techniques were used on the flexor tendons of the hind paws, in accordance with the Committee on Animal Research and Ethics (CETEA) of the University of the State of Santa Catarina (UDESC), municipality of Lages, Brazil (protocol # 1.33.09). Results: In the biomechanical analysis of deformity between tendinous stumps, there was no statistically significant difference (p>0.01). There was no statistical difference in surgical time among the three suture techniques, with means of 6.0 minutes for Tsai (6 strands), 5.7 minutes for Indiana (4 strands), and 5.6 minutes for Brazil (4 strands) (p>0.01). With early active mobility, there was qualitative and quantitative evidence of thickening of collagen in 38.9% on the 15th day and in 66.7% on the 30th day, making the biological tissue stronger and more resistant (p=0.095). Conclusion: This study demonstrated no histological difference between the results achieved with an inside or outside end knot with respect to the repaired tendon, and the number of strands did not affect healing, vascularization, or sliding of the tendon in the osteofibrous tunnel, which are associated with early active mobility, with the repair techniques

  20. Active learning techniques for librarians practical examples

    CERN Document Server

    Walsh, Andrew

    2010-01-01

    A practical work outlining the theory and practice of using active learning techniques in library settings. It explains the theory of active learning, argues for its importance in our teaching, and illustrates it with a large number of examples of techniques that can be easily transferred and used in teaching library and information skills to a range of learners within all library sectors. These practical examples recognise that, for most of us involved in teaching library and information skills, the one-off session is the norm, so we need techniques that allow us to quickly grab and hold our

  1. Air Pollution Studies in Central Russia (Tver and Yaroslavl Regions) Using the Moss Biomonitoring Technique and Neutron Activation Analysis

    CERN Document Server

    Ermakova, E V; Pavlov, S S; Povtoreiko, E A; Steinnes, E; Cheremisina, Ye N

    2003-01-01

    Data for 34 elements, including heavy metals, halogens, rare-earth elements, U, and Th, in 140 moss samples collected in central Russia (the Tver and Yaroslavl regions and the northern part of the Moscow Region) in 2000-2002 are presented. Factor analysis with VARIMAX rotation was applied to identify possible sources of the elements determined in the mosses. The seven resulting factors represent crustal, vegetation, and anthropogenic components in the moss. Some of the factors were interpreted as being associated with ferrous smelters (Fe, Zn, Sb, Ta); a combination of non-ferrous smelters and other industries (Mn, Co, Mo, Cr, Ni, W); and an oil-refining plant and oil combustion at a thermal power plant (V, Ni). The geographical distribution patterns of the factor scores are also presented, and equations for elemental content in mosses versus distance from the source are derived.
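
    The VARIMAX rotation used in that factor analysis can be sketched with the classic iterative SVD algorithm. The loadings matrix below is synthetic, and this is a bare-bones illustration rather than the authors' actual workflow:

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonally rotate a factor-loading matrix toward simple structure."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        target = (rotated ** 3
                  - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0)))
        u, s, vt = np.linalg.svd(loadings.T @ target)
        rotation = u @ vt
        new_var = s.sum()
        if new_var < var * (1.0 + tol):  # converged
            break
        var = new_var
    return loadings @ rotation

# Synthetic loadings: four "elements" on two factors
L = np.array([[0.7, 0.3], [0.8, 0.2], [0.2, 0.9], [0.3, 0.8]])
L_rot = varimax(L)  # rotation is orthogonal, so the Frobenius norm is preserved
```

    After rotation, each variable loads strongly on fewer factors, which is what makes source interpretations like "ferrous smelters (Fe, Zn, Sb, Ta)" readable off the loadings.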

  2. Determination of concentrations of Fe, Mg, and Zn in some ferrite samples using neutron activation analysis and X-ray fluorescence techniques.

    Science.gov (United States)

    Ali, I A; Mohamed, Gehan Y; Azzam, A; Sattar, A A

    2017-01-14

    Mg-Zn ferrite is considered one of the important materials with potential uses in many applications. In this work, samples of the ferrite Mg(1-x)ZnxFe2O4 (where x = 0.0, 0.2, 0.4, 0.6, 0.8, and 1.0) were synthesized by the sol-gel method for use in hyperthermia applications. The properties of the prepared samples depend strongly on their composition and purity. Therefore, the elemental concentrations of the samples were measured by the X-ray fluorescence technique and by thermal neutron activation analysis to check the quality of the prepared samples. The results of the two methods were compared with each other and with the molecular ratios of the as-prepared samples. In addition, no elemental impurity of considerable concentration was detected.
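
    The expected elemental content of the as-prepared Mg(1-x)ZnxFe2O4 samples follows directly from stoichiometry, which is what the XRF and NAA results are compared against. A small sketch using standard atomic masses:

```python
# Standard atomic masses (g/mol)
ATOMIC_MASS = {"Mg": 24.305, "Zn": 65.38, "Fe": 55.845, "O": 15.999}

def mass_fractions(x):
    """Theoretical mass fractions for the spinel Mg(1-x)Zn(x)Fe2O4."""
    moles = {"Mg": 1.0 - x, "Zn": x, "Fe": 2.0, "O": 4.0}
    masses = {el: n * ATOMIC_MASS[el] for el, n in moles.items()}
    total = sum(masses.values())
    return {el: m / total for el, m in masses.items()}

fractions = mass_fractions(0.4)  # e.g. Mg0.6Zn0.4Fe2O4
```

    Measured Fe, Mg, and Zn concentrations that deviate from these fractions would indicate off-stoichiometry or contamination in the sol-gel product.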

  3. Prefractionation techniques in proteome analysis.

    Science.gov (United States)

    Righetti, Pier Giorgio; Castagna, Annalisa; Herbert, Ben; Reymond, Frederic; Rossier, Joël S

    2003-08-01

    The present review deals with a number of prefractionation protocols in preparation for two-dimensional map analysis, in the fields of both chromatography and electrophoresis. In the first case, Fountoulakis' group has reported on just about every chromatographic procedure useful as a prefractionation step, including affinity, ion-exchange, and reversed-phase resins. As a result of the various enrichment steps, several hundred new species, previously undetected in unfractionated samples, could be revealed for the first time. Electrophoretic prefractionation protocols include all those electrokinetic methodologies performed in free solution, essentially all relying on isoelectric focusing steps. The devices reviewed here include multichamber apparatus, such as the multicompartment electrolyzer with Immobiline membranes, Off-Gel electrophoresis in a multicup device, and the Rotofor, an instrument also based on a multichamber system but exploiting the conventional technique of carrier-ampholyte focusing. Other instruments of interest are the Octopus, a continuous-flow device for isoelectric focusing in an upward-flowing liquid curtain, and the Gradiflow, where different pI cuts are obtained by a multistep passage through two compartments buffered at different pH values. It is felt that this panoply of methods could offer a strong step forward in "mining below the tip of the iceberg" for detecting the "unseen proteome".

  4. Comparative Analysis of Hand Gesture Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Arpana K. Patel

    2015-03-01

    Full Text Available During the past few years, human hand gesture interaction with computing devices has continued to be an active area of research. In this paper a survey of hand gesture recognition is provided. Hand gesture recognition comprises three stages: pre-processing, feature extraction or matching, and classification or recognition. Each stage involves different methods and techniques. This paper gives a brief description of the different methods used for hand gesture recognition in existing systems, together with a comparative analysis of all methods, including their benefits and drawbacks.
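
    The three-stage pipeline above can be illustrated end-to-end with the simplest possible recognizer, a nearest-centroid classifier over pre-extracted feature vectors. The features and gesture labels here are entirely synthetic stand-ins for the output of a real pre-processing and feature-extraction stage:

```python
import numpy as np

def fit_centroids(features, labels):
    """Classification stage, training: one centroid per gesture class."""
    classes = np.unique(labels)
    centroids = np.array([features[labels == c].mean(axis=0) for c in classes])
    return classes, centroids

def predict(features, classes, centroids):
    """Assign each feature vector to the nearest class centroid."""
    dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]

# Synthetic 2-D feature vectors for two hypothetical gestures
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array(["open", "open", "fist", "fist"])
classes, centroids = fit_centroids(X, y)
pred = predict(np.array([[0.0, 0.0], [1.0, 1.0]]), classes, centroids)
```

    The surveyed systems differ mainly in what replaces each of these stages: skin segmentation or depth thresholding for pre-processing, contour or Hu-moment features for extraction, and classifiers ranging from HMMs to neural networks.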

  5. Characterization of ancient glass excavated in Enez (Ancient Ainos) Turkey by combined Instrumental Neutron Activation Analysis and Fourier Transform Infrared spectrometry techniques

    Energy Technology Data Exchange (ETDEWEB)

    Akyuz, Sevim, E-mail: s.akyuz@iku.edu.tr [Physics Department, Science and Letters Faculty, Istanbul Kultur University, Atakoy Campus, Bakirkoy 34156, Istanbul (Turkey); Akyuz, Tanil [Physics Department, Science and Letters Faculty, Istanbul Kultur University, Atakoy Campus, Bakirkoy 34156, Istanbul (Turkey); Mukhamedshina, Nuranya M.; Mirsagatova, A. Adiba [Institute of Nuclear Physics, Uzbek Academy of Sciences, 702132, Ulugbek, Tashkent (Uzbekistan); Basaran, Sait; Cakan, Banu [Department of Restoration and Conservation of Artefacts, Letters Faculty, Istanbul University, Vezneciler, Istanbul (Turkey)

    2012-05-15

    Ancient glass fragments excavated in the archaeological district of Enez (ancient Ainos), Turkey, were investigated by combined Instrumental Neutron Activation Analysis (INAA) and Fourier Transform Infrared (FTIR) spectrometry techniques. The multi-elemental contents of 15 glass fragments belonging to the Hellenistic, Roman, Byzantine, and Ottoman periods were determined by INAA. The concentrations of twenty-six elements (Na, K, Ca, Sc, Cr, Mn, Fe, Co, Cu, Zn, As, Rb, Sr, Sb, Cs, Ba, Ce, Sm, Eu, Tb, Yb, Lu, Hf, Ta, Au and Th), which might be present in the samples as fluxes, stabilizers, colorants or opacifiers, and impurities, were examined. Chemometric treatment of the INAA data was performed, and principal component analysis revealed the presence of three distinct groups. The thermal history of the glass samples was determined by FTIR spectrometry. - Highlights: ► INAA was performed to determine elemental compositions of ancient glass fragments. ► Basic, coloring/discoloring elements and impurities have been determined. ► PCA discriminated the glasses depending on their chronological order. ► The thermal history of the glass samples was determined by FTIR spectrometry.
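The chemometric step mentioned above, principal component analysis of multi-element concentration data, can be sketched as follows; the element columns and concentration values are invented stand-ins, not the paper's data:

```python
# Hedged sketch: PCA on a matrix of element concentrations (rows = glass
# fragments, columns = elements, e.g. Na / Fe / Co). Values are invented.
import numpy as np

X = np.array([
    [12.1, 0.8, 0.002],
    [12.3, 0.9, 0.002],
    [ 8.4, 2.1, 0.010],
    [ 8.1, 2.0, 0.011],
    [15.0, 0.3, 0.001],
])

# Standardize each element, then diagonalize the covariance matrix
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
cov = np.cov(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]            # re-sort descending
scores = Xs @ eigvecs[:, order]              # PC scores for each fragment

explained = eigvals[order] / eigvals.sum()   # variance fraction per component
print(explained)
```

Fragments with similar compositions cluster together in the space of the leading PC scores, which is how distinct compositional groups become visible.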

  6. Active load control techniques for wind turbines.

    Energy Technology Data Exchange (ETDEWEB)

    van Dam, C.P. (University of California, Davis, CA); Berg, Dale E.; Johnson, Scott J. (University of California, Davis, CA)

    2008-07-01

    This report provides an overview on the current state of wind turbine control and introduces a number of active techniques that could be potentially used for control of wind turbine blades. The focus is on research regarding active flow control (AFC) as it applies to wind turbine performance and loads. The techniques and concepts described here are often described as 'smart structures' or 'smart rotor control'. This field is rapidly growing and there are numerous concepts currently being investigated around the world; some concepts already are focused on the wind energy industry and others are intended for use in other fields, but have the potential for wind turbine control. An AFC system can be broken into three categories: controls and sensors, actuators and devices, and the flow phenomena. This report focuses on the research involved with the actuators and devices and the generated flow phenomena caused by each device.

  7. Development and Status of Prompt Gamma Neutron Activation Analysis (PGNAA) Methodology

    Institute of Scientific and Technical Information of China (English)

    王兴华; 孙洪超; 姚永刚; 肖才锦; 张贵英; 金象春; 华龙; 周四春

    2014-01-01

    Prompt gamma neutron activation analysis (PGNAA) is a nondestructive, on-line nuclear analytical method. More than 30 research reactors worldwide have established PGNAA laboratories to date. This paper introduces three quantitative prompt gamma activation analysis methods, the relative comparison method, the calibration-curve method, and the k0-factor method, describing their basic principles and fields of application. It also describes the beam-chopper technique for high-precision measurement of short-lived nuclides, and the internal-standard method for dealing with the neutron self-absorption and gamma-ray self-shielding effects introduced by large-sample measurements. Progress on the thermal-neutron prompt gamma activation analysis facility based on the CARR reactor is briefly reviewed, and domestic PGNAA issues are discussed. The paper provides a methodological reference for establishing prompt gamma activation analysis on the basis of the CARR reactor.
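Of the three quantification methods listed, the relative comparison method is the simplest to illustrate: the element concentration in the sample follows from the ratio of specific peak count rates against a standard of known concentration, measured under the same beam conditions. All numeric values below are invented:

```python
# Hedged sketch of the relative (comparator) method of activation analysis.
# Assumes the same element, beam, and counting geometry for sample and
# standard; all numbers are illustrative.

def concentration_relative(peak_area_sample, mass_sample,
                           peak_area_std, mass_std, conc_std):
    """Concentration from the ratio of specific (per-gram) peak areas."""
    specific_sample = peak_area_sample / mass_sample   # net counts per gram
    specific_std = peak_area_std / mass_std
    return conc_std * specific_sample / specific_std

# Standard contains 2.0 wt% of the element of interest
c = concentration_relative(peak_area_sample=5200.0, mass_sample=1.3,
                           peak_area_std=4000.0, mass_std=1.0, conc_std=2.0)
print(c)
```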

  8. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  9. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  10. Analysis of neutron flux distribution using the Monte Carlo method for the feasibility study of the Prompt Gamma Activation Analysis technique at the IPR-R1 TRIGA reactor

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, Bruno T.; Pereira, Claubia, E-mail: brunoteixeiraguerra@yahoo.com.br, E-mail: claubia@nuclear.ufmg.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departmento de Energia Nuclear; Soares, Alexandre L.; Menezes, Maria Angela B.C., E-mail: menezes@cdtn.br, E-mail: asleal@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-07-01

    The IPR-R1 is a TRIGA Mark-I reactor, manufactured by the General Atomic Company and installed at the Nuclear Technology Development Centre (CDTN), Brazilian Commission for Nuclear Energy (CNEN), in Belo Horizonte, Brazil. It is a light-water moderated and cooled, graphite-reflected, open-pool type research reactor operating at 100 kW. Its low power and low pressure make it suitable for research, training and radioisotope production. The fuel is an alloy of zirconium hydride and uranium enriched to 20% in {sup 235}U. The implementation of PGNAA (Prompt Gamma Neutron Activation Analysis) at this research reactor will significantly increase the number of chemical elements analysed and the kinds of matrices. A project is underway to implement this technique at CDTN, and the objective of this study was to contribute to the feasibility analysis of implementing it. For this purpose, MCNP is being used. Variance reduction tools were introduced into the previously developed methodology for calculating the neutron flux in the inclined neutron extractor, with the aim of reducing the code error and thereby increasing the reliability of the results. With the implementation of the variance reduction tools, the results for the thermal and epithermal neutron fluxes showed a significant improvement in both calculations. (author)

  11. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the recent decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry, coulometry, etc., have made significant contributions to the analysis of minerals such as clays, sulfides, oxides, and oxysalts. The discussion is organized by both the type of EC technique used and the kind of mineral analyzed. Furthermore, minerals used as electrode-modification materials for EC analysis are also summarized. Accordingly, research gaps and future development trends in these areas are discussed.

  12. Kinematic Analysis of Healthy Hips during Weight-Bearing Activities by 3D-to-2D Model-to-Image Registration Technique

    Directory of Open Access Journals (Sweden)

    Daisuke Hara

    2014-01-01

    Full Text Available Dynamic hip kinematics during weight-bearing activities were analyzed for six healthy subjects. Continuous X-ray images of gait, chair-rising, squatting, and twisting were taken using a flat-panel X-ray detector. Digitally reconstructed radiographic images were used for the 3D-to-2D model-to-image registration technique. The root-mean-square errors associated with tracking the pelvis and femur were less than 0.3 mm for translations and 0.3° for rotations. For gait, chair-rising, and squatting, the maximum hip flexion angles averaged 29.6°, 81.3°, and 102.4°, respectively. The pelvis was tilted anteriorly by around 4.4° on average over the full gait cycle. For chair-rising and squatting, the maximum absolute values of anterior/posterior pelvic tilt averaged 12.4°/11.7° and 10.7°/10.8°, respectively. Hip flexion peaked partway through the movement due to further anterior pelvic tilt during both chair-rising and squatting. For twisting, the maximum absolute value of hip internal/external rotation averaged 29.2°/30.7°. This study revealed the activity-dependent kinematics of healthy hip joints with coordinated pelvic and femoral dynamic movements. Kinematics data during activities of daily living may provide important insight for evaluating the kinematics of pathological and reconstructed hips.

  13. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on, and a brief account of, other techniques developed around the world for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  14. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the influence of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  15. Comparative study between the PIXE technique and neutron activation analysis for Zinc determination; Estudo comparativo entre a tecnica de inducao de raios X por particulas e analise por ativacao com neutrons na determinacao do metal pesado zinco

    Energy Technology Data Exchange (ETDEWEB)

    Cruvinel, Paulo Estevao; Crestana, Silvio [Empresa Brasileira de Pesquisa Agropecuaria, Sao Carlos, SP (Brazil). CNPDIA. E-mail: cruvinel@cnpdia.embrapa.br; Armelin, Maria Jose Aguirre [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil); Artaxo Netto, Paulo Eduardo [Sao Paulo Univ., SP (Brazil). Inst. de Fisica

    1997-07-01

    This work presents a comparative study between the PIXE (particle-induced X-ray emission, using proton beams) and neutron activation analysis (NAA) techniques for the determination of total zinc concentration. In particular, soil samples from the Pindorama experimental station of the Instituto Agronomico de Campinas, Sao Paulo State, Brazil, were analysed, and their zinc contents measured in {mu}g/g. The results showed good correlation between the two techniques. The PIXE and NAA analyses were carried out using 2.4 MeV proton beams from the Pelletron accelerator and the IPEN/CNEN IEA-R1 reactor, both installed at the University of Sao Paulo, Brazil.

  16. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface are of great importance for convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing the characteristics of road texture to be analysed and pavement behaviour to be monitored. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.

  17. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface are of great importance for convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing the characteristics of road texture to be analysed and pavement behaviour to be monitored. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.

  18. Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers

    Science.gov (United States)

    1975-03-26

    TECHNICAL REPORT RF-75-2: Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers, by Augustine E. Magistro, Nuclear ... through 1975. Augustine E. Magistro has participated in root cause analysis task teams, including as a team member and Blue Ribbon panel reviewer.

  19. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for assessing task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess operators' physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  20. PRELIMINARY RESULTS OF ATMOSPHERIC DEPOSITION OF MAJOR AND TRACE ELEMENTS IN THE GREATER AND LESSER CAUCASUS MOUNTAINS STUDIED BY THE MOSS TECHNIQUE AND NEUTRON ACTIVATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    S. Shetekauri

    2015-05-01

    Full Text Available The method of moss biomonitoring of atmospheric deposition of trace elements was applied for the first time in the western Caucasus Mountains to assess the environmental situation in the region. Sixteen moss samples were collected during the 2014 summer growth period along altitudinal gradients in the range 600 m to 2665 m. Concentrations of Na, Mg, Al, Cl, K, Ca, Ti, V, Mn, Fe, Zn, As, Br, Rb, Mo, Cd, I, Sb, Ba, La, Sm, W, Au, and U, determined in the moss samples by neutron activation analysis, are reported. A comparison with data for moss collected in Norway (a pristine area) was carried out. Multivariate statistical analysis of the results was used to assess pollution sources in the studied part of the Caucasus. An increase in the concentrations of most elements with rising altitude, due to gradually disappearing vegetation cover and wind erosion of soil, was observed. A comparison with the available data for moss collected in the Alps at the same altitude (~2500 m) was also performed.

  1. UPLC: a preeminent technique in pharmaceutical analysis.

    Science.gov (United States)

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

    The pharmaceutical companies of today are driven to create novel and more efficient tools to discover, develop, deliver and monitor drugs. In this context, the development of rapid chromatographic methods is crucial for analytical laboratories. In the past decade, substantial technological advances have been made in enhancing particle chemistry performance, improving detector design, and optimizing systems, data processors and the various controls of chromatographic techniques. Blended together, these resulted in the outstanding performance of ultra-high performance liquid chromatography (UPLC), which builds on the principles of the HPLC technique. UPLC shows a dramatic enhancement in the speed, resolution and sensitivity of analysis by using particle sizes of less than 2 {mu}m and operating the system at higher pressure, while the mobile phase runs at greater linear velocities than in HPLC. This technique is considered a new focal point in the field of liquid chromatographic studies. This review focuses on the basic principle and instrumentation of UPLC and its advantages over HPLC; furthermore, this article emphasizes various pharmaceutical applications of this technique.

  2. A Comparative Analysis of Biomarker Selection Techniques

    Directory of Open Access Journals (Sweden)

    Nicoletta Dessì

    2013-01-01

    Full Text Available Feature selection has become the essential step in biomarker discovery from high-dimensional genomics data. It is recognized that different feature selection techniques may result in different sets of biomarkers, that is, different groups of genes highly correlated with a given pathological condition, but few direct comparisons exist which quantify these differences in a systematic way. In this paper, we propose a general methodology for comparing the outcomes of different selection techniques in the context of biomarker discovery. The comparison is carried out along two dimensions: (i) measuring the similarity/dissimilarity of selected gene sets; (ii) evaluating the implications of these differences in terms of both the predictive performance and the stability of the selected gene sets. As a case study, we considered three benchmarks deriving from DNA microarray experiments and conducted a comparative analysis among eight selection methods, representatives of different classes of feature selection techniques. Our results show that the proposed approach can provide useful insight into the pattern of agreement of biomarker discovery techniques.
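The first comparison dimension, similarity of selected gene sets, is commonly quantified with a set-overlap measure such as the Jaccard index; a minimal sketch with invented gene names (the abstract does not name this exact metric, it is one plausible choice):

```python
# Hedged sketch: quantifying the overlap between biomarker (gene) sets
# chosen by two feature-selection techniques. Gene names are invented.

def jaccard(a, b):
    """Jaccard index |A ∩ B| / |A ∪ B| between two selected feature sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

genes_method1 = {"TP53", "BRCA1", "EGFR", "MYC"}
genes_method2 = {"TP53", "EGFR", "KRAS", "MYC"}
sim = jaccard(genes_method1, genes_method2)
print(sim)
```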

  3. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
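The normalized contrast evolution described above can be sketched on synthetic cooling curves; the 1/√t reference decay and the Gaussian-in-log-time anomaly term below are illustrative assumptions, not the paper's calibration model:

```python
# Hedged sketch: extract a normalized contrast evolution
#   contrast(t) = (T_anomaly(t) - T_reference(t)) / T_reference(t)
# and measure peak contrast and peak time, two typical thermal features.
# The cooling curves are synthetic stand-ins for flash-thermography data.
import numpy as np

t = np.linspace(0.05, 5.0, 200)          # seconds after the flash
T_ref = 1.0 / np.sqrt(t)                 # idealized 1-D cooling, sound area
T_anom = T_ref * (1.0 + 0.3 * np.exp(-np.log(t) ** 2))  # slowed cooling

contrast = (T_anom - T_ref) / T_ref
peak_contrast = float(contrast.max())
peak_time = float(t[contrast.argmax()])
print(peak_contrast, peak_time)
```

Features like these (peak contrast, peak time) are what get matched against simulated evolutions to estimate anomaly depth and width.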

  4. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Full Text Available Based on the movement process of throwing, and in order to further improve throwing technique in our country, this paper first illustrates the main factors which affect the shot distance, via a combination of the equations of motion and geometrical analysis. It then gives the equation for the force that throwing athletes must bear during the throwing movement, and derives the speed relationship between the joints during the throw and at release based on a kinetic analysis of the athletes' arms. The paper obtains the momentum relationship of the athletes' joints by means of rotational-inertia analysis, and then establishes a constrained particle dynamics equation from the Lagrange equation. The obtained result shows that the momentum of the throw depends on the momentum of the athlete's wrist joint at release.
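The projectile part of the analysis, how release speed, angle, and height set the shot distance, can be sketched with the standard range formula for a release at height h (air resistance neglected; the numbers are illustrative):

```python
# Hedged sketch of the geometrical/projectile analysis the abstract refers
# to: throw distance as a function of release speed, angle, and height.
import math

def throw_distance(v, theta_deg, h, g=9.81):
    """Horizontal range of a projectile released at speed v (m/s),
    at theta_deg above horizontal, from height h (m)."""
    th = math.radians(theta_deg)
    vx, vy = v * math.cos(th), v * math.sin(th)
    # Flight time from solving h + vy*t - g*t^2/2 = 0 for the positive root
    t_flight = (vy + math.sqrt(vy ** 2 + 2.0 * g * h)) / g
    return vx * t_flight

# Illustrative shot-put numbers: 13.5 m/s release, 40 deg, 2.1 m height
d = throw_distance(13.5, 40.0, 2.1)
print(round(d, 2))
```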

  5. Development of a technique using MCNPX code for determination of nitrogen content of explosive materials using prompt gamma neutron activation analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Nasrabadi, M.N., E-mail: mnnasrabadi@ast.ui.ac.ir [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of); Bakhshi, F.; Jalali, M.; Mohammadi, A. [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of)

    2011-12-11

    Nuclear-based explosive detection methods can detect explosives by identifying their elemental components, especially nitrogen. Thermal neutron capture reactions have been used to detect the prompt 10.8 MeV gamma ray following radiative neutron capture by {sup 14}N nuclei. We aimed to study the feasibility of using field-portable prompt gamma neutron activation analysis (PGNAA) along with improved nuclear equipment to detect and identify explosives, illicit substances or landmines. A {sup 252}Cf radioisotope source was embedded in a cylinder made of high-density polyethylene (HDPE), and the cylinder was then placed in another cylindrical container filled with water. Measurements were performed on high-nitrogen-content compounds such as melamine (C{sub 3}H{sub 6}N{sub 6}). Melamine powder in an HDPE bottle was placed underneath the vessel containing the water and the neutron source. Gamma rays were detected using two NaI(Tl) crystals. The results were simulated with MCNP4c code calculations. The theoretical calculations and experimental measurements were in good agreement, indicating that this method can be used for the detection of explosives and illicit drugs.

  6. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

    Full Text Available We describe how to use multivariate analysis of complex TOF-SIMS spectra, introducing the method of random projections. The technique allows us to perform full clustering and classification of the measured mass spectra. In this paper we use the tool for classification purposes. The presentation describes calibration experiments on 19 minerals on Ag and Au substrates using positive-mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen's κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
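The random-projection idea can be sketched end to end on synthetic spectra: project high-dimensional mass spectra onto a low-dimensional Gaussian random basis, then classify in the reduced space. The spectra, peak positions, and nearest-centroid classifier below are illustrative assumptions, not COSIMA's actual pipeline:

```python
# Hedged sketch: random projection of synthetic "mass spectra" followed by
# a simple nearest-centroid classification in the projected space.
import numpy as np

rng = np.random.default_rng(0)
n_channels, k = 2000, 32          # raw spectrum length, projected dimension

def spectrum(peaks):
    """Synthetic spectrum: low-level noise plus unit peaks at given channels."""
    s = rng.normal(0.0, 0.01, n_channels)
    s[peaks] += 1.0
    return s

# Two synthetic "mineral" classes with different characteristic peaks
X = np.array([spectrum([100, 500, 900]) for _ in range(10)]
             + [spectrum([200, 700, 1500]) for _ in range(10)])

# Gaussian random projection (Johnson-Lindenstrauss style): pairwise
# distances are approximately preserved while dimension drops 2000 -> 32
R = rng.normal(0.0, 1.0 / np.sqrt(k), (n_channels, k))
Z = X @ R

# Nearest-centroid classification in the projected space
centroids = np.array([Z[:10].mean(axis=0), Z[10:].mean(axis=0)])
pred = np.array([np.argmin([np.linalg.norm(z - c) for c in centroids])
                 for z in Z])
truth = np.array([0] * 10 + [1] * 10)
accuracy = float((pred == truth).mean())
print(accuracy)
```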

  7. Learning by Doing: An Empirical Study of Active Teaching Techniques

    Science.gov (United States)

    Hackathorn, Jana; Solomon, Erin D.; Blankmeyer, Kate L.; Tennial, Rachel E.; Garczynski, Amy M.

    2011-01-01

    The current study sought to examine the effectiveness of four teaching techniques (lecture, demonstrations, discussions, and in-class activities) in the classroom. As each technique offers different benefits to the instructor and students, each technique was expected to aid in a different depth of learning. The current findings indicated that each…

  8. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transients and bursts, (ii) periodic or continuous-wave and (iii) stochastic. Each type of source requires a different data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; optimally weighted cross-correlations for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on the cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
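Optimal filtering for a known transient waveform reduces to correlating the data stream against a unit-norm template and locating the correlation peak; a minimal sketch with a synthetic chirp and white Gaussian noise (real searches first whiten by the detector's noise spectrum, which this sketch omits):

```python
# Hedged sketch of matched (optimal) filtering: recover the location of a
# known chirp-like waveform buried in white noise. All signals synthetic.
import numpy as np

rng = np.random.default_rng(1)
n, m = 4096, 200
k = np.arange(m)

# Chirp-like template: frequency sweeping 0.1 -> 0.3 cycles/sample
template = np.sin(2 * np.pi * (0.1 + 5e-4 * k) * k)

# Detector data: unit-variance noise with the template injected at an offset
data = rng.normal(0.0, 1.0, n)
true_offset = 1000
data[true_offset:true_offset + m] += 5.0 * template

# Correlate the data against the unit-norm template and find the peak
tpl = template / np.linalg.norm(template)
snr = np.correlate(data, tpl, mode="valid")
found = int(np.argmax(snr))
print(found, float(snr[found]))
```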

  9. Application of Electromigration Techniques in Environmental Analysis

    Science.gov (United States)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentration of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made, widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, including the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the domain of capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is made on pre-capillary and on-capillary chromatography and electrophoresis-based concentration of analytes and detection improvement.

  10. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analyses methodologies, but none as comprehensive as the current work.
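Two of the sensitivity-estimation styles such comparisons cover, one-at-a-time perturbation and sampling-based correlation, can be contrasted on a toy model; the model form and parameter ranges below are invented for illustration and are not the atmospheric tritium model:

```python
# Hedged sketch: two ways to rank parameter sensitivity on a toy model
# y = 3*x1 + 0.5*x2^2, with invented baseline values and ranges.
import numpy as np

rng = np.random.default_rng(2)

def model(x1, x2):
    return 3.0 * x1 + 0.5 * x2 ** 2

base = {"x1": 1.0, "x2": 1.0}

# (a) One-at-a-time: relative output change per 10% change in each input
oat = {}
y0 = model(**base)
for name in base:
    p = dict(base)
    p[name] *= 1.1
    oat[name] = abs(model(**p) - y0) / y0

# (b) Sampling-based: |Pearson correlation| between each input and output
x1 = rng.uniform(0.5, 1.5, 5000)
x2 = rng.uniform(0.5, 1.5, 5000)
y = model(x1, x2)
corr = {"x1": abs(np.corrcoef(x1, y)[0, 1]),
        "x2": abs(np.corrcoef(x2, y)[0, 1])}

# Both techniques should rank x1 as the more influential parameter here
print(max(oat, key=oat.get), max(corr, key=corr.get))
```

Agreement between rankings from different techniques, as here, is exactly the kind of cross-method comparison the report performs at larger scale.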

  11. Gas Chromatographic-Mass Spectrometric Analysis of Volatiles Obtained by Four Different Techniques from Salvia rosifolia Sm. and Evaluation for Biological Activity

    Science.gov (United States)

    Volatile constituents from the aerial parts of Salvia rosifolia Sm. (Lamiaceae), endemic to Turkey, were obtained by four different isolation techniques and then analyzed by gas chromatography (GC/FID) and gas chromatography-mass spectrometry (GC/MS). Also within the scope of the present work, the...

  12. Conference on Instrumental Activation Analysis: IAA 89

    Science.gov (United States)

    Vobecky, M.; Obrusnik, I.

    1989-05-01

    The proceedings contain 40 abstracts of papers, all of which have been incorporated in INIS. The papers centred on applications of radioanalytical methods, especially neutron activation analysis, X-ray fluorescence analysis, PIXE analysis and tracer techniques in biology, medicine and metallurgy, on measuring instruments including microcomputers, and on data processing methods.

  13. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote the exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  14. Performance Comparison of Active Queue Management Techniques

    Directory of Open Access Journals (Sweden)

    T. B. Reddy

    2008-01-01

    Congestion is an important issue which researchers focus on in the Transmission Control Protocol (TCP) network environment. To keep the whole network stable, congestion control algorithms have been extensively studied. Queue management employed by routers is one of the important issues in congestion control research. Active Queue Management (AQM) has been proposed as a router-based mechanism for early detection of congestion inside the network. In this study, we compare two popular queue management methods, Random Early Detection (RED) and DropTail, in different respects such as throughput and fairness index. The comparison results indicate that RED performed slightly better, with higher throughput and a higher fairness index than DropTail. Simulation is done using Network Simulator 2 (NS-2) and the graphs are drawn using Xgraph.
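A minimal sketch of the two drop policies compared above, assuming textbook RED parameters; the paper's NS-2 simulations are not reproduced here.

```python
import random

class REDQueue:
    """RED drop decision: probabilistic drops ramp up between two
    thresholds of the *averaged* queue length."""
    def __init__(self, min_th=5, max_th=15, max_p=0.1, wq=0.002):
        self.min_th, self.max_th, self.max_p, self.wq = min_th, max_th, max_p, wq
        self.avg = 0.0   # exponentially weighted moving average of queue length

    def should_drop(self, qlen):
        # update the EWMA with the instantaneous queue length
        self.avg = (1 - self.wq) * self.avg + self.wq * qlen
        if self.avg < self.min_th:
            return False      # accept: no sign of congestion
        if self.avg >= self.max_th:
            return True       # force drop: persistent congestion
        # linear ramp of drop probability between the two thresholds
        p = self.max_p * (self.avg - self.min_th) / (self.max_th - self.min_th)
        return random.random() < p

def droptail_should_drop(qlen, capacity=20):
    """DropTail: drop only when the buffer is already full."""
    return qlen >= capacity
```

The contrast is the point of the comparison: DropTail signals congestion only at buffer overflow, while RED starts dropping early and probabilistically, which tends to desynchronize TCP flows.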

  15. Analytical techniques in pharmaceutical analysis: A review

    Directory of Open Access Journals (Sweden)

    Masoom Raza Siddiqui

    2017-02-01

    The development of pharmaceuticals brought a revolution in human health. These pharmaceuticals serve their intent only if they are free from impurities and are administered in an appropriate amount. To make drugs serve their purpose, various chemical and instrumental methods involved in the estimation of drugs have been developed at regular intervals. Pharmaceuticals may develop impurities at various stages of their development, transportation and storage, which makes them risky to administer, so the impurities must be detected and quantified. For this, analytical instrumentation and methods play an important role. This review highlights the role of analytical instrumentation and analytical methods in assessing the quality of drugs. It covers a variety of analytical techniques, such as titrimetric, chromatographic, spectroscopic, electrophoretic and electrochemical techniques, and their corresponding methods that have been applied in the analysis of pharmaceuticals.

  16. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of the different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  17. Input techniques that dynamically change their cursor activation area

    DEFF Research Database (Denmark)

    Hertzum, Morten; Hornbæk, Kasper

    2007-01-01

    Efficient pointing is crucial to graphical user interfaces, and input techniques that dynamically change their activation area may yield improvements over point cursors by making objects selectable at a distance. Such techniques include the bubble cursor, whose activation area always contains the closest object, and two variants of cell cursors, whose activation areas contain a set of objects in the vicinity of the cursor. We report two experiments that compare these techniques to a point cursor; in one experiment participants use a touchpad for operating the input techniques, in the other a mouse. In both experiments, the bubble cursor is fastest and participants make fewer errors with it. Participants also unanimously prefer this technique. For small targets, the cell cursors are generally more accurate than the point cursor; in the second...

  18. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  19. Application of Active Flow Control Technique for Gust Load Alleviation

    Institute of Scientific and Technical Information of China (English)

    XU Xiaoping; ZHU Xiaoping; ZHOU Zhou; FAN Ruijun

    2011-01-01

    A new gust load alleviation technique based on active flow control is presented in this paper. Numerical studies are conducted to investigate the beneficial effects on the aerodynamic characteristics of the quasi "Global Hawk" airfoil of using arrays of jets during the gust process. Based on the unsteady Navier-Stokes equations, the grid-velocity method is introduced to simulate the gust influence, and the dynamic response of the airfoil in a vertical gust flow perturbation is investigated. An unsteady surface transpiration boundary condition is enforced over a user-specified portion of the airfoil's surface to emulate the time-dependent velocity boundary conditions. First, this method is applied to simulate the gust response of a typical NACA0006 airfoil to a step change in the angle of attack; the indicial responses of the airfoil show good agreement with the exact theoretical values and the calculated values in the references. Furthermore, the gust response characteristics of the quasi "Global Hawk" airfoil are analyzed. Five kinds of flow control techniques are introduced: steady blowing, steady suction, unsteady blowing, unsteady suction and synthetic jets. A physical analysis of their influence on gust load alleviation is proposed to provide some guidelines for practice. Numerical results indicate that active flow control, as a new technology for gust load alleviation, can affect and suppress the fluid disturbances caused by gusts so as to achieve gust load alleviation.

  20. Attitude Exploration Using Factor Analysis Technique

    Directory of Open Access Journals (Sweden)

    Monika Raghuvanshi

    2016-12-01

    Attitude is a psychological variable that contains positive or negative evaluations of people or the environment. The growing generation possesses learning skills, so if a positive attitude is inculcated at the right age, it may become habitual. Students in the age group 14-20 years from the city of Bikaner, India, are the target population for this study. An inventory of 30 Likert-type scale statements was prepared in order to measure attitudes towards the environment and matters related to conservation. The primary data were collected through a structured questionnaire, using a cluster sampling technique, and analyzed using the IBM SPSS 23 statistical tool. Factor analysis was used to reduce the 30 variables to a smaller number of more identifiable groups of variables. Results show that students “need more regulation and voluntary participation to protect the environment”, “need conservation of water and electricity”, “are concerned for undue wastage of water”, “need visible actions to protect the environment”, “need strengthening of the public transport system”, “are a little bit ignorant about the consequences of global warming”, “want prevention of water pollution by industries”, “need changing of personal habits to protect the environment”, and “don’t have firsthand experience of global warming”. Analysis revealed that the nine factors obtained could explain about 58.5% of the variance in the attitude of secondary school students towards the environment in the city of Bikaner, India; the remaining 41.5% of the variance is attributed to other elements not explained by this analysis. A global campaign for improvement in attitudes about environmental issues and their utility in daily life may boost positive youth attitudes, with potential worldwide impact. A cross-disciplinary approach may be developed by teaching environmental topics alongside related disciplines such as science, economics, and social studies.
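The variance-explained step at the heart of factor analysis can be sketched with a power iteration on a correlation matrix. The 3-item matrix below is invented for illustration; the study itself used IBM SPSS 23 on 30 Likert items.

```python
def power_iteration(matrix, steps=200):
    """Dominant eigenvalue and eigenvector by power iteration
    (no external libraries needed for this small sketch)."""
    n = len(matrix)
    v = [1.0] * n
    for _ in range(steps):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
    # Rayleigh-style estimate using the first component (nonzero here)
    lam = sum(matrix[0][j] * v[j] for j in range(n)) / v[0]
    return lam, v

# Invented 3-item correlation matrix standing in for the 30-item inventory.
R = [[1.0, 0.6, 0.5],
     [0.6, 1.0, 0.4],
     [0.5, 0.4, 1.0]]

lam, loadings = power_iteration(R)
explained = lam / 3.0   # share of total variance (trace of R is 3)
```

In a full analysis the procedure is repeated (after deflation) for further factors, and the cumulative share of eigenvalues over the trace gives the total variance explained figure reported in such studies.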

  1. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution, and we illustrate their application in clinical and administrative decision-making for the management of hospital activities.

  2. Particle fluence measurements by activation technique for radiation damage studies

    CERN Document Server

    León-Florián, E; Furetta, C; Leroy, Claude

    1995-01-01

    High-level radiation environments can produce radiation damage in detectors and their associated electronic components. The establishment of a correlation between damage, irradiation level and absorbed dose requires a precise measurement of the fluence of particles causing the damage. The activation technique is frequently used for performing particle fluence measurements. A review of this technique is presented.
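The basic activation relation behind such fluence measurements can be sketched as follows. The saturation equation A = φ·σ·N·(1 − e^(−λ·t_irr)) and its inversion for the flux are standard; the numerical values used below are illustrative only.

```python
import math

def flux_from_activity(activity_bq, sigma_cm2, n_atoms, half_life_s, t_irr_s):
    """Infer the particle flux (cm^-2 s^-1) from the induced activity of a
    monitor foil: A = phi * sigma * N * (1 - exp(-lambda * t_irr))."""
    decay_const = math.log(2) / half_life_s
    saturation = 1.0 - math.exp(-decay_const * t_irr_s)   # build-up factor
    return activity_bq / (sigma_cm2 * n_atoms * saturation)
```

For an irradiation long compared with the product half-life the saturation factor approaches 1, so the flux reduces to A/(σN); for a short irradiation the same activity implies a proportionally higher flux.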

  3. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to identify the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and their estimation in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and the possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO-2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found, and these can be improved through a 3D model. A 3D model was therefore found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open-channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D floodplain model be developed, considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the understanding of the causes and effects of flooding.

  4. Function Analysis and Decomposition using Function Analysis Systems Technique

    Energy Technology Data Exchange (ETDEWEB)

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high-demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Bytheway extended Miles' function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high-level, or objective, function into secondary and lower-level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and has proved to be an effective methodology for functional decomposition, allocation, and alternative development.
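The How/Why logic of a FAST model can be sketched as a small function tree: moving toward lower-level functions answers "how?" and moving toward the objective function answers "why?". The functions below are invented for illustration and are not taken from the report.

```python
# Toy FAST model: each higher-level function maps to the lower-level
# functions that accomplish it (the "how" direction).
fast_model = {
    "provide light": ["generate current", "convert energy"],
    "generate current": ["store charge"],
    "convert energy": ["excite filament"],
}

def how(function):
    """Lower-level functions answering HOW this function is achieved."""
    return fast_model.get(function, [])

def why(function):
    """Higher-level function answering WHY this function exists."""
    for parent, children in fast_model.items():
        if function in children:
            return parent
    return None   # the objective function has no parent
```

Walking the tree in both directions is the intuitive-logic test FAST uses: every function should have a sensible answer to "how?" going right and "why?" going left, with the objective function at the top.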

  6. Development of Prompt Gamma Neutron Activation Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    卢毅; 宋朝晖

    2013-01-01

    A brief summary of the current development of Prompt Gamma Neutron Activation Analysis (PGNAA) is given. The basic theory, methods and facilities of PGNAA are introduced, together with research work at home and abroad on the application of PGNAA. Finally, some problems in the technical development of PGNAA are discussed.

  7. Determination of silver, gold, zinc and copper in mineral samples by various techniques of instrumental neutron activation analysis; Determinacion de plata, oro, zinc y cobre en muestras minerales mediante diversas tecnicas de analisis por activacion de neutrones instrumental

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez R, N. I.; Rios M, C.; Pinedo V, J. L. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico); Yoho, M.; Landsberger, S., E-mail: neisla126@hotmail.com [University of Texas at Austin, Nuclear Engineering Teaching Laboratory, Austin 78712, Texas (United States)

    2015-09-15

    Using the method of instrumental neutron activation analysis, mineral exploration samples were analyzed in order to determine the concentrations of silver, gold, zinc and copper, these minerals being the main products of the Tizapa and Cozamin mines. Samples were subjected to various techniques, where the type of radiation and counting methods were chosen based on the specific isotopic characteristics of each element. For calibration and determination of concentrations the comparator method was used: certified standards were subjected to the same conditions of irradiation and measurement as the prospecting samples. The irradiations were performed at the TRIGA Mark II research reactor of the University of Texas at Austin. The silver concentrations were determined by cyclical epithermal neutron activation analysis. This method, in combination with the pneumatic transfer system, allowed good analytical precision and accuracy in prospecting for silver, from measurement of the 657.7 keV photopeak of the short-lived radionuclide ¹¹⁰Ag. For the determination of gold and zinc, epithermal neutron activation analysis was used; the photopeaks analyzed corresponded to the energies 411.8 keV of the radionuclide ¹⁹⁸Au and 438.6 keV of the metastable radionuclide ⁶⁹ᵐZn. Copper quantification was based on analysis of the 1039.2 keV photopeak produced by the short-lived radionuclide ⁶⁶Cu, by thermal neutron activation analysis. The photopeak measurements for gold, zinc and copper were performed using a Compton suppression system, which improved the signal-to-noise ratio, so that better detection limits and lower uncertainties in the results were obtained. Comparing elemental concentrations, the highest values of silver, zinc and copper were for samples from the Tizapa mine; gold values were in the same range for both mines. To evaluate the precision and accuracy of the methods used...
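The comparator method described above reduces, in its simplest form, to scaling a certified standard's concentration by the ratio of photopeak count rates and masses, with a decay correction when the sample and standard are counted at different times. The sketch below is a generic illustration; all numbers are invented, not the paper's data.

```python
import math

def concentration_by_comparator(c_std, counts_sample, counts_std,
                                m_sample, m_std,
                                half_life_s=0.0, dt_sample_s=0.0, dt_std_s=0.0):
    """Comparator-method concentration: sample and standard are irradiated
    and counted under the same conditions, so concentration scales with
    photopeak counts and inversely with mass.  Optional decay correction
    refers both counts back to the end of irradiation."""
    ratio = (counts_sample / counts_std) * (m_std / m_sample)
    if half_life_s > 0:
        lam = math.log(2) / half_life_s
        ratio *= math.exp(lam * dt_sample_s) / math.exp(lam * dt_std_s)
    return c_std * ratio
```

For short-lived products such as ¹¹⁰Ag or ⁶⁶Cu, the decay correction term dominates the error budget unless the counting delays are tightly controlled, which is one motivation for the pneumatic transfer system mentioned in the abstract.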

  8. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  9. Quality assurance and quantitative error analysis by tracer techniques

    Energy Technology Data Exchange (ETDEWEB)

    Schuetze, N.; Hermann, U.

    1983-12-01

    The locations, types and sources of casting defects have been tested by tracer techniques. Certain sites of moulds were labelled using ¹⁹⁹Au, ²⁴Na sodium carbonate solution, and technetium solution produced in the technetium generator on a ⁹⁹Mo/⁹⁹Tc elution column. Evaluations were made by means of activity measurements and autoradiography. The locations and causes of casting defects can be determined by error analysis. Surface defects of castings resulting from the moulding materials and from the blacking can be detected by technetium, while subsurface defects are located by gold.

  10. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi...

  11. Escherichia coli activity characterization using a laser dynamic speckle technique

    CERN Document Server

    Ramírez-Miquet, Evelio E; Contreras-Alarcón, Orestes R

    2012-01-01

    The results of applying a laser dynamic speckle technique to characterize bacterial activity are presented. The speckle activity was detected in two-compartment Petri dishes; one compartment was inoculated and the other was left as a control blank. The speckle images were processed by the recently reported temporal difference method. Three inocula of 0.3, 0.5, and 0.7 McFarland units of cell concentration were tested; each inoculum was tested twice, for a total of six experiments. The dependence on time of the mean activity, the standard deviation of activity and other descriptors of the speckle pattern evolution was calculated for both the inoculated compartment and the blank. In conclusion, the proposed dynamic speckle technique allows the activity of Escherichia coli bacteria in a solid medium to be characterized.
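A minimal sketch of a temporal-difference activity measure of the kind referred to above: the mean absolute intensity change between consecutive frames, averaged over pixels. The exact descriptor set of the paper is not reproduced, and the frames here are toy pixel lists rather than camera data.

```python
def temporal_difference_activity(frames):
    """Mean |I_(k+1) - I_k| over all pixels and consecutive frame pairs;
    a static (blank) region gives 0, a 'boiling' speckle region gives
    a positive value that grows with activity."""
    total, count = 0.0, 0
    for prev, curr in zip(frames, frames[1:]):
        for a, b in zip(prev, curr):
            total += abs(b - a)
            count += 1
    return total / count if count else 0.0

static_sample = [[10, 10, 10]] * 4                        # control blank
active_sample = [[10, 12, 9], [11, 9, 13], [8, 12, 10]]   # inoculated side
```

Comparing the two compartments of the dish with the same descriptor is what lets the blank serve as a baseline against ambient vibration and illumination drift.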

  12. COMPARISON OF ACTIVE RELEASE TECHNIQUE AND MYOFASCIAL RELEASE TECHNIQUE ON PAIN, GRIP STRENGTH & FUNCTIONAL PERFORMANCE IN PATIENTS WITH CHRONIC LATERAL EPICONDYLITIS

    Directory of Open Access Journals (Sweden)

    Parth Trivedi

    2014-06-01

    Background & Purpose: Lateral epicondylitis is the most common lesion of the elbow. Tennis elbow, or lateral epicondylitis, is defined as a syndrome of pain in the wrist extensor muscles at or near their lateral epicondyle origin, or pain directly over the lateral epicondyle. The aim of this study was to compare the effectiveness of Active Release Technique (ART) and Myofascial Release Technique (MFR) in the treatment of Chronic Lateral Epicondylitis (CLE). Methodology: The study included thirty-six patients with chronic lateral epicondylitis in the age group of 30 to 45 years. Patients were randomly divided into three groups: Control Group (A), Active Release Technique Group (B) and Myofascial Release Technique Group (C). The patients were treated for 4 weeks, and three outcome measures (0-10 NPRS, hand dynamometer and PRTEE) were assessed and analyzed at baseline and after the 4th week. Result: Both Active Release Technique and Myofascial Release Technique were effective on all three outcome measures when compared to the Control Group. Myofascial Release Technique was more effective in improving grip strength and reducing pain and disability when compared to Active Release Technique (p<0.05). Conclusion: Active Release Technique and Myofascial Release Technique are effective in patients with Chronic Lateral Epicondylitis. Myofascial Release Technique demonstrated better outcomes than Active Release Technique in the management of Chronic Lateral Epicondylitis.

  13. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on a K-means clustering technique and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains different intensity regions. Gradient values are calculated and the watershed technique is applied. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge of the possible region segmentation for the next step (MRF), which gives an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are applied and the results compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
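One plausible reading of the DIS map, as a per-pixel sum of absolute intensity differences to the 4-neighbours, can be sketched as follows. The precise DIS definition used in the paper may differ, and the image here is a toy 2-D list; strong edges give high values, flat regions give zero.

```python
def dis_map(image):
    """Difference-In-Strength sketch: for each interior pixel, sum the
    absolute intensity differences to its 4-neighbours."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            c = image[i][j]
            out[i][j] = (abs(c - image[i - 1][j]) + abs(c - image[i + 1][j])
                         + abs(c - image[i][j - 1]) + abs(c - image[i][j + 1]))
    return out

# two flat regions separated by a vertical edge between columns 1 and 2
img = [[0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 9, 9]]
dis = dis_map(img)
```

Thresholding such a map separates weak from strong edges, which is how it can act as prior knowledge for a subsequent region-based step like the MRF model in the abstract.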

  14. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
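The uncertain-evidence idea can be illustrated with Jeffrey's rule on a two-node network H → E: the user supplies a distribution over the evidence states, and the hypothesis posterior is the mixture of the hard-evidence posteriors. The network, its probabilities, and the use of Jeffrey's rule here are assumptions for illustration, not the paper's two techniques.

```python
# Invented two-node network: hypothesis H with prior, evidence E with CPT.
prior_h = {"true": 0.3, "false": 0.7}
p_e_given_h = {("on", "true"): 0.9, ("off", "true"): 0.1,
               ("on", "false"): 0.2, ("off", "false"): 0.8}

def posterior_h_given_e(e_state):
    """Ordinary (hard-evidence) Bayes update P(H | E = e_state)."""
    joint = {h: prior_h[h] * p_e_given_h[(e_state, h)] for h in prior_h}
    z = sum(joint.values())
    return {h: p / z for h, p in joint.items()}

def posterior_h_soft(evidence_dist):
    """Jeffrey's rule: mix the hard-evidence posteriors by the
    user-specified probability distribution over evidence states."""
    result = {h: 0.0 for h in prior_h}
    for e_state, weight in evidence_dist.items():
        post = posterior_h_given_e(e_state)
        for h in result:
            result[h] += weight * post[h]
    return result
```

With `{"on": 1.0}` the soft update reduces to the ordinary hard-evidence posterior; spreading the evidence mass over both states pulls the posterior back toward the prior, which is the qualitative behaviour one expects from uncertain evidence.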

  15. Stalked protozoa identification by image analysis and multivariable statistical techniques.

    Science.gov (United States)

    Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2008-06-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups and several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determining the geometrical, morphological and signature data and subsequent processing by discriminant analysis and neural network techniques. Geometrical descriptors were found to be responsible for the best identification ability and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence to establish their presence in wastewater treatment plants.

  16. Classification Techniques for Multivariate Data Analysis.

    Science.gov (United States)

    1980-03-28

    ...analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern... the determinantal equation: |B − λW| = 0. The solutions λᵢ are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non... The Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor...
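The determinantal equation |B − λW| = 0 from the excerpt can be solved directly for small matrices by finding the eigenvalues of W⁻¹B. The 2×2 between-group (B) and within-group (W) matrices below are invented toy data.

```python
def eig2(m):
    """Eigenvalues of a 2x2 matrix via the characteristic polynomial
    lambda^2 - tr(m)*lambda + det(m) = 0, largest first."""
    tr = m[0][0] + m[1][1]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    disc = (tr * tr - 4 * det) ** 0.5
    return sorted([(tr - disc) / 2, (tr + disc) / 2], reverse=True)

def winv_b(w, b):
    """Form W^-1 B for 2x2 matrices (W assumed invertible)."""
    det_w = w[0][0] * w[1][1] - w[0][1] * w[1][0]
    w_inv = [[w[1][1] / det_w, -w[0][1] / det_w],
             [-w[1][0] / det_w, w[0][0] / det_w]]
    return [[sum(w_inv[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

W = [[2.0, 0.0], [0.0, 1.0]]   # within-group scatter (toy)
B = [[4.0, 0.0], [0.0, 3.0]]   # between-group scatter (toy)
lambdas = eig2(winv_b(W, B))   # roots of |B - lambda*W| = 0
```

Each eigenvalue measures the between-group to within-group variance ratio along one discriminant direction, which is why the largest roots define the most discriminating axes.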

  17. Active structural control with stable fuzzy PID techniques

    CERN Document Server

    Yu, Wen

    2016-01-01

    This book presents a detailed discussion of intelligent techniques to measure the displacement of buildings when they are subjected to vibration. It shows how these techniques are used to control active devices that can reduce vibration 60–80% more effectively than widely used passive anti-seismic systems. After introducing various structural control devices and building-modeling and active structural control methods, the authors propose offset cancellation and high-pass filtering techniques to solve some common problems of building-displacement measurement using accelerometers. The most popular control algorithms in industrial settings, PD/PID controllers, are then analyzed and then combined with fuzzy compensation. The stability of this combination is proven with standard weight-training algorithms. These conditions provide explicit methods for selecting PD/PID controllers. Finally, fuzzy-logic and sliding-mode control are applied to the control of wind-induced vibration. The methods described are support...
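A minimal discrete PID controller of the kind analyzed in the book can be sketched as follows; the fuzzy-compensation and sliding-mode parts are omitted, and all gains are illustrative rather than tuned for any real structure.

```python
class PID:
    """Discrete PID: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt            # accumulated error
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Driving a simple integrator plant (displacement rate equal to the control input) with positive gains settles the output at the setpoint; the book's contribution is the stability analysis of such PD/PID loops when fuzzy compensation is added.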

  18. Bonding techniques for hybrid active pixel sensors (HAPS)

    Science.gov (United States)

    Bigas, M.; Cabruja, E.; Lozano, M.

    2007-05-01

    A hybrid active pixel sensor (HAPS) consists of an array of sensing elements which is connected to an electronic read-out unit. The most used way to connect these two different devices is bump bonding. This interconnection technique is very suitable for these systems because it allows a very fine pitch and a high number of I/Os. However, there are other interconnection techniques available such as direct bonding. This paper, as a continuation of a review [M. Lozano, E. Cabruja, A. Collado, J. Santander, M. Ullan, Nucl. Instr. and Meth. A 473 (1-2) (2001) 95-101] published in 2001, presents an update of the different advanced bonding techniques available for manufacturing a hybrid active pixel detector.

  19. A Technique for Shunt Active Filter meld micro grid System

    Directory of Open Access Journals (Sweden)

    A. Lumani

    2015-08-01

    Full Text Available The proposed system presents a control technique for a micro-grid-connected hybrid generation system, with a case study, interfaced with a three-phase shunt active filter to suppress the current harmonics and reactive power present in the load using PQ theory with an ANN controller. This hybrid micro grid is developed using freely available renewable energy resources, namely solar photovoltaic (SPV) and wind energy (WE). To extract the maximum available power from the PV panels and wind turbines, a maximum power point tracker (MPPT) has been included. This MPPT uses the standard "Perturb and Observe" technique. Using PQ theory with the ANN controller, the reference currents are generated that are to be injected by the shunt active power filter (SAPF) to compensate the current harmonics of the nonlinear load. Simulation studies show that the proposed control technique performs nonlinear load current harmonic compensation, maintaining the load current in phase with the source voltage.
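The PQ-theory computation underlying such a reference-current generator starts from the Clarke (αβ) transform of voltages and currents. A minimal numeric sketch, assuming a balanced three-phase source and a lagging linear load (the ANN and filter stages are omitted; note that the sign convention for q varies between references):

```python
import numpy as np

def clarke(a, b, c):
    """Power-invariant Clarke transform: abc -> alpha-beta components."""
    alpha = np.sqrt(2 / 3) * (a - 0.5 * b - 0.5 * c)
    beta = np.sqrt(2 / 3) * (np.sqrt(3) / 2) * (b - c)
    return alpha, beta

t = np.linspace(0, 0.04, 1000)            # two 50 Hz cycles
w = 2 * np.pi * 50
# Balanced 230 V rms source and a 10 A rms current lagging by 30 degrees
va, vb, vc = (np.sqrt(2) * 230 * np.cos(w * t - k * 2 * np.pi / 3) for k in (0, 1, 2))
ia, ib, ic = (np.sqrt(2) * 10 * np.cos(w * t - k * 2 * np.pi / 3 - np.pi / 6)
              for k in (0, 1, 2))

v_al, v_be = clarke(va, vb, vc)
i_al, i_be = clarke(ia, ib, ic)
p = v_al * i_al + v_be * i_be             # instantaneous real power
q = v_be * i_al - v_al * i_be             # instantaneous imaginary power
print(f"p = {p.mean():.0f} W, q = {q.mean():.0f} var")
```

For this balanced sinusoidal case p and q are constant; harmonics in a nonlinear load would appear as oscillating components of p and q, which the SAPF is commanded to cancel.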

  20. Innovative Perceptual Motor Activities: Programing Techniques That Work.

    Science.gov (United States)

    Sorrell, Howard M.

    1978-01-01

    A circuit approach and station techniques are used to depict perceptual motor games for handicapped and nonhandicapped children. Twenty activities are described in terms of objectives, materials, and procedures, and their focus on visual tracking, visual discrimination and copying of forms, spatial body perception, fine motor coordination, tactile…

  1. Trends and Techniques in Visual Gaze Analysis

    CERN Document Server

    Stellmach, Sophie; Dachselt, Raimund; Lindley, Craig A

    2010-01-01

    Visualizing gaze data is an effective way to quickly interpret eye tracking results. This paper presents a study investigating the benefits and limitations of visual gaze analysis among eye tracking professionals and researchers. The results were used to create a tool for visual gaze analysis within a Master's project.

  2. Analysis of Gopher Tortoise Population Estimation Techniques

    Science.gov (United States)

    2005-10-01

    terrestrial reptile that was once found throughout the southeastern United States from North Carolina into Texas. However, due to numerous factors...et al. 2000, Waddle 2000). Solar energy is used for thermoregulation and egg incubation. Also, tortoises are grazers (Garner and Landers 1981...Evaluation and review of field techniques used to study and manage gopher tortoises.” Pages 205-215 in Management of amphibians, reptiles, and small mammals

  3. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of knowledge available at an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often involved in making informed and correct business decisions. Traditional analysis methods based on statistics fail to help processing unstructured texts and the society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  4. Assessing voluntary muscle activation with the twitch interpolation technique.

    Science.gov (United States)

    Shield, Anthony; Zhou, Shi

    2004-01-01

    The twitch interpolation technique is commonly employed to assess the completeness of skeletal muscle activation during voluntary contractions. Early applications of twitch interpolation suggested that healthy human subjects could fully activate most of the skeletal muscles to which the technique had been applied. More recently, however, highly sensitive twitch interpolation has revealed that even healthy adults routinely fail to fully activate a number of skeletal muscles despite apparently maximal effort. Unfortunately, some disagreement exists as to how the results of twitch interpolation should be employed to quantify voluntary activation. The negative linear relationship between evoked twitch force and voluntary force that has been observed by some researchers implies that voluntary activation can be quantified by scaling a single interpolated twitch to a control twitch evoked in relaxed muscle. Observations of non-linear evoked-voluntary force relationships have led to the suggestion that the single interpolated twitch ratio cannot accurately estimate voluntary activation. Instead, it has been proposed that muscle activation is better determined by extrapolating the relationship between evoked and voluntary force to provide an estimate of true maximum force. However, criticism of the single interpolated twitch ratio typically fails to take into account the reasons for the non-linearity of the evoked-voluntary force relationship. When these reasons are examined, it appears that most are even more challenging to the validity of extrapolation than they are to the linear equation. Furthermore, several factors that contribute to the observed non-linearity can be minimised or even eliminated with appropriate experimental technique. The detection of small activation deficits requires high resolution measurement of force and careful consideration of numerous experimental details such as the site of stimulation, stimulation intensity and the number of interpolated…
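The single interpolated twitch ratio discussed above is conventionally computed as VA(%) = (1 − superimposed twitch / control twitch) × 100. A one-function sketch with illustrative numbers:

```python
def voluntary_activation(superimposed_twitch, control_twitch):
    """Single interpolated twitch ratio, as percent voluntary activation.

    superimposed_twitch: force increment evoked by stimulation during MVC
    control_twitch: twitch force evoked in the relaxed (potentiated) muscle
    """
    return (1.0 - superimposed_twitch / control_twitch) * 100.0

# A 2 N increment during MVC against a 40 N resting twitch
print(voluntary_activation(2.0, 40.0))  # 95.0
```

The extrapolation alternative criticized in the abstract would instead fit the evoked-voluntary force relationship and extend it to zero evoked twitch.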

  5. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the original message content from its binary format in order to obtain the information it contains. Guided by the TCP/IP protocol stack specifications, the captured packets are then decoded layer by layer, restoring the format and content of each protocol layer up to the actual data transferred at the application tier.
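The layer-by-layer decoding a sniffer performs can be illustrated by parsing the fixed 20-byte IPv4 header with Python's struct module. The sample packet below is hand-built for the example, not a Snort capture:

```python
import struct

def parse_ipv4_header(raw: bytes):
    """Decode the fixed 20-byte IPv4 header from a captured packet."""
    (version_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, cksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw[:20])
    return {
        "version": version_ihl >> 4,
        "ihl": version_ihl & 0x0F,          # header length in 32-bit words
        "total_length": total_len,
        "ttl": ttl,
        "protocol": proto,                  # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# Hand-built sample: IPv4, TTL 64, TCP, 192.168.0.1 -> 10.0.0.2
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                     bytes([192, 168, 0, 1]), bytes([10, 0, 0, 2]))
print(parse_ipv4_header(sample))
```

A full analyzer would next hand the payload to a TCP or UDP decoder selected by the `protocol` field, repeating the same pattern per layer.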

  6. Uncertainty Analysis Technique for OMEGA Dante Measurements

    Energy Technology Data Exchange (ETDEWEB)

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetics physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
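The Monte-Carlo parameter variation idea can be sketched generically: perturb each channel reading by its one-sigma Gaussian error, push every trial set through the unfold, and take the spread of the results as the error bar. The linear "unfold" and the calibration numbers below are toy stand-ins for the actual Dante algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for the unfold: flux as a weighted sum of channel voltages.
calib = np.array([1.2, 0.8, 1.5])      # hypothetical calibration weights
volts = np.array([10.0, 20.0, 15.0])   # measured channel voltages
sigma = 0.05 * volts                   # assume 5% one-sigma channel errors

def unfold(v):
    return float(calib @ v)

# 1000 test voltage sets drawn from the per-channel Gaussian error functions
trials = rng.normal(volts, sigma, size=(1000, volts.size))
fluxes = np.array([unfold(v) for v in trials])

print(f"flux = {unfold(volts):.1f} +/- {fluxes.std(ddof=1):.1f}")
```

With a linear unfold the Monte-Carlo spread should match analytic error propagation; the technique earns its keep when the unfold is nonlinear or iterative, as in the real diagnostic.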

  7. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  8. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis can play in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  9. Comparison of Hydrogen Sulfide Analysis Techniques

    Science.gov (United States)

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  10. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a global time-domain operational modal identification scheme and a frequency-domain scheme for output-only data, coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
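A minimal numerical illustration of the principle: the cross-correlation of two noisy output channels sharing one decaying mode is itself a decaying sinusoid, so the modal frequency appears as the peak of its spectrum. The single-mode synthetic data below are illustrative, not the airplane model:

```python
import numpy as np

fs = 256.0                        # sampling rate, Hz
t = np.arange(0, 8.0, 1.0 / fs)
fn, zeta = 5.0, 0.02              # assumed modal frequency and damping ratio
rng = np.random.default_rng(1)

# Two measured responses: the same decaying mode, independent sensor noise
mode = np.exp(-zeta * 2 * np.pi * fn * t) * np.sin(2 * np.pi * fn * t)
y1 = mode + 0.05 * rng.standard_normal(t.size)
y2 = mode + 0.05 * rng.standard_normal(t.size)

# Cross-correlation of the outputs has the same decaying-sinusoid form as an
# impulse response, so the modal frequency shows up as its spectral peak.
r = np.correlate(y1, y2, mode="full")[t.size - 1:]   # non-negative lags
spec = np.abs(np.fft.rfft(r * np.hanning(r.size)))
freqs = np.fft.rfftfreq(r.size, 1.0 / fs)
print(f"identified modal frequency: {freqs[spec.argmax()]:.2f} Hz")
```

In a real operational modal analysis the correlation functions would be fed to a time-domain fitter (e.g. a complex-exponential method) to recover damping as well as frequency.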

  11. A comparison of wavelet analysis techniques in digital holograms

    Science.gov (United States)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques; mean filtering, median filtering, Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation and resolution preservation.
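The wavelet-denoising idea compared in the study can be sketched with a hand-rolled one-level Haar transform and soft thresholding. The threshold and noise level here are illustrative; production code would use a full multi-level DWT:

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar DWT: soft-threshold the details, then reconstruct."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)                   # approximation
    d = (x[0::2] - x[1::2]) / np.sqrt(2)                   # detail
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)   # soft threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(7)
clean = np.concatenate([np.zeros(128), np.ones(128)])      # a step "edge"
noisy = clean + 0.2 * rng.standard_normal(256)

denoised = haar_denoise(noisy, thresh=0.4)
err_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
err_denoised = np.sqrt(np.mean((denoised - clean) ** 2))
print(f"RMSE noisy={err_noisy:.3f}  denoised={err_denoised:.3f}")
```

Thresholding suppresses noise in the detail band while the step edge, carried by the approximation coefficients, survives; this edge preservation is the advantage over mean filtering that the paper quantifies for holograms.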

  12. An analysis technique for microstrip antennas

    Science.gov (United States)

    Agrawal, P. K.; Bailey, M. C.

    1977-01-01

    The paper presents a combined numerical and empirical approach to the analysis of microstrip antennas over a wide range of frequencies. The method involves representing the antenna by a fine wire grid immersed in a dielectric medium and then using Richmond's reaction formulation (1974) to evaluate the piecewise sinusoidal currents on the grid segments. The calculated results are then modified to account for the finite dielectric discontinuity. The method is applied to round and square microstrip antennas.

  13. Techniques for Surveying Urban Active Faults by Seismic Methods

    Institute of Scientific and Technical Information of China (English)

    Xu Mingcai; Gao Jinghua; Liu Jianxun; Rong Lixin

    2005-01-01

    Using the seismic method to detect active faults directly below cities is an irreplaceable prospecting technique. The seismic method can precisely determine the fault position, but by itself it can hardly determine the geological age of a fault. However, by considering the borehole data and the standard geological cross-section of the surveyed area, the geological age of a reflected wave group can be qualitatively (or semi-quantitatively) determined from the seismic depth profile. To determine the upper terminal point of active faults directly below a city, it is necessary to use the high-resolution seismic reflection technique. To effectively determine the geometric features of deep faults, especially the relation between deep and shallow fracture structures, the seismic reflection method is better than the seismic refraction method.

  14. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  15. Application of neutron activation tracer sediment technique on environmental science

    Institute of Scientific and Technical Information of China (English)

    Yin Yi; Zhong Wei-Ni; et al.

    1997-01-01

    Field and laboratory investigations were carried out to study the transport and dispersion behaviour of polluted sediments near a wastewater outlet using the neutron activation tracer technique. The direction of transport and dispersion of polluted sediments, the dispersion amount in different directions, the sedimentary region of polluted sediment and an evaluation of pollution risk are given. This provides a new test method for environmental science studies and adds new predictive content to environmental impact assessment.

  16. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  17. Radial velocity data analysis with compressed sensing techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2017-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  18. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
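The "all planets at once" greedy search of the three records above can be caricatured with an orthogonal-matching-pursuit loop over a frequency dictionary. This toy reimplementation on synthetic two-signal data is not the authors' algorithm (which adds compatibility with Gaussian-process noise models); the grid, amplitudes and noise level are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 100, 80))           # irregular observation times
f_true = (0.13, 0.31)                          # two "planet" frequencies, 1/day
rv = (np.sin(2 * np.pi * f_true[0] * t)
      + 0.6 * np.sin(2 * np.pi * f_true[1] * t)
      + 0.05 * rng.standard_normal(t.size))

freqs = np.arange(0.01, 0.5, 0.001)
found, residual = [], rv.copy()
for _ in range(2):                             # greedy: one frequency per pass
    # Score each candidate frequency by least-squares fit of sin + cos
    power = np.empty(freqs.size)
    for i, f in enumerate(freqs):
        A = np.column_stack([np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(A, residual, rcond=None)
        power[i] = np.sum((A @ coef) ** 2)
    found.append(freqs[power.argmax()])
    # Orthogonal MP step: refit all found frequencies jointly, subtract
    A = np.column_stack(sum([[np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)]
                             for f in found], []))
    coef, *_ = np.linalg.lstsq(A, rv, rcond=None)
    residual = rv - A @ coef
print(sorted(round(float(f), 3) for f in found))
```

Like the tool described, the residual spectrum after the joint refit has far fewer alias peaks than a plain periodogram of the raw data.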

  19. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  20. New techniques for emulsion analysis in a hybrid experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodama, K. (Aichi University of Education, Kariya 448 (Japan)); Ushida, N. (Aichi University of Education, Kariya 448 (Japan)); Mokhtarani, A. (University of California (Davis), Davis, CA 95616 (United States)); Paolone, V.S. (University of California (Davis), Davis, CA 95616 (United States)); Volk, J.T. (University of California (Davis), Davis, CA 95616 (United States)); Wilcox, J.O. (University of California (Davis), Davis, CA 95616 (United States)); Yager, P.M. (University of California (Davis), Davis, CA 95616 (United States)); Edelstein, R.M. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Freyberger, A.P. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Gibaut, D.B. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Lipton, R.J. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Nichols, W.R. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Potter, D.M. (Carnegie-Mellon Univers

    1994-08-01

    A new method, called graphic scanning, was developed by the Nagoya University Group for emulsion analysis in a hybrid experiment. This method enhances both speed and reliability of emulsion analysis. Details of the application of this technique to the analysis of Fermilab experiment E653 are described. ((orig.))

  1. Electrogastrography: A Noninvasive Technique to Evaluate Gastric Electrical Activity

    OpenAIRE

    Claudia P. Sanmiguel; Mintchev, Martin P.; Bowes, Kenneth L.

    1998-01-01

    Electrogastrography (EGG) is the recording of gastric electrical activity (GEA) from the body surface. The cutaneous signal is low in amplitude and consequently must be amplified considerably. The resultant signal is heavily contaminated with noise, and visual analysis alone of an EGG signal is inadequate. Consequently, EGG recordings require special methodology for acquisition, processing and analysis. Essential components of this methodology involve an adequate system of digital filtering, ...

  2. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Directory of Open Access Journals (Sweden)

    Mahmoud I. Al-Kadi

    2013-05-01

    Full Text Available Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.

  3. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    Science.gov (United States)

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.
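One of the basic noise-removal steps reviewed, band-pass filtering, can be sketched with a crude FFT mask that removes slow baseline drift and 50 Hz power-line interference while keeping alpha-band activity. The signal is synthetic and bin-aligned for clarity; real EEG pipelines use proper IIR/FIR filters:

```python
import numpy as np

def bandpass_fft(x, fs, lo, hi):
    """Crude FFT band-pass: zero all spectral bins outside [lo, hi] Hz."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, 1.0 / fs)
    X[(f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(X, n=x.size)

fs = 250.0                                   # typical EEG sampling rate
t = np.arange(0, 4, 1 / fs)
alpha = np.sin(2 * np.pi * 10 * t)           # 10 Hz "alpha" activity
mains = 0.8 * np.sin(2 * np.pi * 50 * t)     # 50 Hz power-line interference
drift = 0.5 * np.sin(2 * np.pi * 0.25 * t)   # slow baseline drift
eeg = alpha + mains + drift

clean = bandpass_fft(eeg, fs, lo=1.0, hi=40.0)
print(f"max deviation from alpha component: {np.max(np.abs(clean - alpha)):.2e}")
```

Because all three tones fall exactly on FFT bins here, the mask recovers the alpha component almost perfectly; with real data, windowing and filter roll-off make the separation less clean.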

  4. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    Directory of Open Access Journals (Sweden)

    T. Subramani

    2014-06-01

    Full Text Available This paper presents an overview of the present state of base isolation techniques, with special emphasis on, and a brief review of, other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. A simple case study on natural base isolation using naturally available soils is presented. Also, the future areas of research are indicated. Earthquakes are one of nature's greatest hazards; throughout historic time they have caused significant loss of life and severe damage to property, especially to man-made structures. On the other hand, earthquakes provide architects and engineers with a number of important design criteria foreign to the normal design process. From well established procedures reviewed by many researchers, seismic isolation may be used to provide an effective solution for a wide range of seismic design problems. The application of the base isolation techniques to protect structures against damage from earthquake attacks has been considered as one of the most effective approaches and has gained increasing acceptance during the last two decades. This is because base isolation limits the effects of the earthquake attack, a flexible base largely decoupling the structure from the ground motion, and the structural response accelerations are usually less than the ground acceleration. In general, the increase of additional viscous damping in the structure may reduce displacement and acceleration responses of the structure. This study also seeks to evaluate the effects of additional damping on the seismic response when compared with structures without additional damping for different ground motions.

  5. A comparison of maximal bioenergetic enzyme activities obtained with commonly used homogenization techniques.

    Science.gov (United States)

    Grace, M; Fletcher, L; Powers, S K; Hughes, M; Coombes, J

    1996-12-01

    Homogenization of tissue for analysis of bioenergetic enzyme activities is a common practice in studies examining metabolic properties of skeletal muscle adaptation to disease, aging, inactivity or exercise. While numerous homogenization techniques are in use today, limited information exists concerning the efficacy of specific homogenization protocols. Therefore, the purpose of this study was to compare the efficacy of four commonly used approaches to homogenizing skeletal muscle for analysis of bioenergetic enzyme activity. The maximal enzyme activities (Vmax) of citrate synthase (CS) and lactate dehydrogenase (LDH) were measured from homogenous muscle samples (N = 48 per homogenization technique) and used as indicators to determine which protocol had the highest efficacy. The homogenization techniques were: (1) glass-on-glass pestle; (2) a combination of a mechanical blender and a teflon pestle (Potter-Elvehjem); (3) a combination of the mechanical blender and a biological detergent; and (4) the combined use of a mechanical blender and a sonicator. The glass-on-glass pestle homogenization protocol produced significantly higher (P < 0.05) enzyme activities compared to all other protocols for both enzymes. Of the four protocols examined, the data demonstrate that the glass-on-glass pestle homogenization protocol is the technique of choice for studying bioenergetic enzyme activity in skeletal muscle.

  6. Neutron Activation Analysis of Inhomogeneous Large Samples; An Explorative Study

    NARCIS (Netherlands)

    Baas, H.W.

    2004-01-01

    Neutron activation analysis is a powerful technique for the determination of trace-element concentrations. Since both neutrons that are used for activation and gamma rays that are detected have a high penetrating power, the technique can be applied for relatively large samples (up to 13 L), as demon

  7. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...
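The core quantitative step of a meta-analysis, pooling estimates of the same parameter by inverse-variance weighting, can be sketched as follows; the study estimates and standard errors are hypothetical:

```python
def fixed_effect_meta(estimates, ses):
    """Inverse-variance weighted pooling of estimates of one parameter."""
    w = [1.0 / s ** 2 for s in ses]                 # weight = 1 / SE^2
    pooled = sum(wi * b for wi, b in zip(w, estimates)) / sum(w)
    pooled_se = (1.0 / sum(w)) ** 0.5
    return pooled, pooled_se

# Three hypothetical studies reporting the same elasticity
est, se = fixed_effect_meta([0.50, 0.30, 0.42], [0.10, 0.05, 0.08])
print(f"pooled = {est:.3f} +/- {se:.3f}")
```

The funnel mentioned in the abstract is the scatter of the individual estimates against their precisions (1/SE); asymmetry in that plot is the standard visual check for publication bias before trusting the pooled value.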

  8. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  9. Kinematic analysis of the fouetté at 720° in classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté, and its execution depends on a high level of technique during the performer's rotation. Performing this element requires not only good physical condition but also correct technique from the dancer. On the basis of the corresponding kinematic theory, this study gives a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. The analysis was carried out using stereoscopic imaging and theoretical analysis.

  10. Determination of the archaeological origin of ceramic fragments characterized by neutron activation analysis, by means of the application of multivariable statistical analysis techniques;Determinacion del origen de fragmentos de ceramica arqueologica caracterizados con analisis por activacion neutronica, mediante la aplicacion de tecnicas de analisis estadistico multivariable

    Energy Technology Data Exchange (ETDEWEB)

    Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Rodriguez G, N. L. [Instituto Mexiquense de Cultura, Subdireccion de Restauracion y Conservacion, Hidalgo poniente No. 1013, 50080 Toluca, Estado de Mexico (Mexico)

    2009-07-01

    The elementary composition of archaeological ceramic fragments obtained during the explorations in San Miguel Ixtapan, Mexico State, was determined by the neutron activation analysis technique. The samples were irradiated in the research reactor TRIGA Mark III with a neutron flux of 1×10{sup 13} n·cm{sup -2}·s{sup -1}. The irradiation time was 2 hours. Prior to the acquisition of the gamma-ray spectrum the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, Rb. The statistical treatment of the data, consisting of cluster analysis and principal components analysis, allowed three different origins of the archaeological ceramics to be identified, designated as: local, foreign and regional. (Author)
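
    The irradiate-decay-count sequence described above follows the standard one-isotope activation relation A = φσN(1 − e^(−λt_irr))·e^(−λt_decay). A minimal sketch, with purely hypothetical nuclear data chosen for illustration:

    ```python
    import math

    def induced_activity(flux, sigma_cm2, n_atoms, half_life_s, t_irr_s, t_decay_s):
        """Activity (Bq) of a single activation product after irradiation and decay:
        A = phi * sigma * N * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_decay)."""
        lam = math.log(2) / half_life_s
        return (flux * sigma_cm2 * n_atoms
                * (1.0 - math.exp(-lam * t_irr_s))
                * math.exp(-lam * t_decay_s))

    # Hypothetical example: 2 h irradiation at 1e13 n/cm^2/s, counted after
    # 13 days of decay (cross section and atom count are illustrative only)
    a = induced_activity(flux=1e13, sigma_cm2=13.3e-24, n_atoms=1e18,
                         half_life_s=15.0 * 3600, t_irr_s=2 * 3600,
                         t_decay_s=13 * 86400)
    ```

    A 12-14 day decay suppresses short-lived activity so that the longer-lived nuclides listed in the abstract dominate the spectrum.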

  11. Active Ageing: An Analysis

    Directory of Open Access Journals (Sweden)

    Alina-Cristina Nuta

    2011-10-01

    Full Text Available The problem of ageing is highly topical for Romania and for the European Union. In this framework, creating and implementing strategies for active ageing is an important objective. International and regional forums have established (supported by official statistics) that the number of older people is growing rapidly. Romania needs programmes (with labour, social, economic and health care aspects) to deal with the demographic changes, programmes that will reform the existing working-life structures and legislation. Despite the current pension reform, which tries to close off the opportunity of early retirement (by penalizing total pension flows, increasing the retirement age, etc.), the labour system does not set important targets for this area.

  12. Design, data analysis and sampling techniques for clinical research

    OpenAIRE

    Karthik Suresh; Sanjeev V Thomas; Geetha Suresh

    2011-01-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inference from their data analysis. Improper application of study design and data analysis may render insufficient and improper results and conclusion. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design and then back-translating the statistical results into relevant medical knowledge is a real challenge. This article explains...

  13. Memory Forensics: Review of Acquisition and Analysis Techniques

    Science.gov (United States)

    2013-11-01

    Memory Forensics: Review of Acquisition and Analysis Techniques. Grant Osborne, Cyber and Electronic Warfare Division, Defence Science and Technology Organisation, DSTO–GD–0770. ABSTRACT: This document presents an overview of the most common memory forensics techniques used in the... To date, digital forensic investigations have focused on the... The types of digital evidence investigated include images, text, video and audio files [1].

  14. Analysis On Classification Techniques In Mammographic Mass Data Set

    OpenAIRE

    K.K.Kavitha; Dr.A.Kangaiammal

    2015-01-01

    Data mining, the extraction of hidden information from large databases, is to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-Mining classification techniques deals with determining to which group each data instances are associated with. It can deal with a wide variety of data so that large amount of data can be involved in processing. This paper deals with analysis on various data mining classification techniques such a...

  15. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE concerning the determination of high-molecular-mass compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also, the methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  16. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  17. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
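
    The age equation stated at the start of the abstract (age = accumulated dose / annual dose rate) can be sketched directly; the numbers below are hypothetical, and the uncertainty combination in quadrature is a common convention, not something stated in the abstract:

    ```python
    def luminescence_age(paleodose_gy, dose_rate_gy_per_ka):
        """Age (ka) = accumulated dose (Gy) / annual dose rate (Gy/ka)."""
        return paleodose_gy / dose_rate_gy_per_ka

    def age_uncertainty(age_ka, rel_err_dose, rel_err_rate):
        """Combine the relative errors of dose and dose rate in quadrature."""
        return age_ka * (rel_err_dose ** 2 + rel_err_rate ** 2) ** 0.5

    # Hypothetical numbers: 25 Gy accumulated dose, 2.5 Gy/ka annual dose rate
    age = luminescence_age(25.0, 2.5)
    err = age_uncertainty(age, rel_err_dose=0.05, rel_err_rate=0.04)
    ```

    The dose-rate term is where the trace-element analyses (U, Th, K) discussed in the abstract enter: each radioelement concentration converts to a contribution to the annual dose.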

  18. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting and the mold together, considering the thermal contraction of the casting and the thermal expansion of the mold. An analysis that accounts for contact between the casting and the mold allows precise prediction of the stress distribution and of defects such as hot tearing. However, it is difficult to generate an FEM mesh for the interface of the casting and the mold, and the mesh for the mold domain consumes a great deal of computational time and memory owing to the large number of elements. Consequently, we propose the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to model contact between the casting and the mold. In general, the volume of the mold is much larger than that of the casting, so the proposed technique greatly decreases the number of elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and it gave satisfactory results.

  19. Parallelization of events generation for data analysis techniques

    CERN Document Server

    Lazzaro, A

    2010-01-01

    With the startup of the LHC experiments at CERN, the involved community is now focusing on the analysis of the collected data. The complexity of the data analyses will be a key factor for finding eventual new phenomena. For this reason many data analysis tools have been developed over the last several years, implementing several data analysis techniques. The goal of these techniques is to discriminate events of interest and to measure parameters on a given input sample of events, which are themselves defined by several variables. Also particularly important is the possibility of repeating the determination of the parameters by applying the procedure on several simulated samples, which are generated using Monte Carlo techniques and the knowledge of the probability density functions of the input variables; this procedure achieves a better estimation of the results. Depending on the number of variables, complexity of their probability density functions, number of events, and number of sample to g...

  20. A new technique of ECG analysis and its application to evaluation of disorders during ventricular tachycardia

    Energy Technology Data Exchange (ETDEWEB)

    Moskalenko, A.V. [Institute of Theoretical and Experimental Biophysics RAS, Institutskaya Street, 3, Pushchino 142290 (Russian Federation)], E-mail: info@avmoskalenko.ru; Rusakov, A.V. [Institute of Theoretical and Experimental Biophysics RAS, Institutskaya Street, 3, Pushchino 142290 (Russian Federation); Elkin, Yu.E. [Institute of Mathematical Problems of Biology RAS, Institutskaya Street, 4, Pushchino 142290 (Russian Federation)

    2008-04-15

    We propose a new technique of ECG analysis to characterize the properties of polymorphic ventricular arrhythmias, potentially life-threatening disorders of cardiac activation. The technique is based on extracting two indices from the ECG fragment. The result is a new detailed quantitative description of polymorphic ECGs. Our observations suggest that the proposed ECG processing algorithm provides information that supplements the traditional visual ECG analysis. The estimates of ECG variation in this study reveal some unexpected details of ventricular activation dynamics, which are possibly useful for diagnosing cardiac rhythm disturbances.

  1. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    Science.gov (United States)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course was introduced at the University of Puget Sound entitled 'Mineral Resources and the Environment'. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects around the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think-pair-share and take-home point summaries. Summative assessment included discussion leadership, exams, homework, group projects, in-class exercises, field trips, and pre-discussion reading exercises.

  2. Multiscale statistical analysis of coronal solar activity

    CERN Document Server

    Gamborino, Diana; Martinell, Julio J

    2016-01-01

    Multi-filter images from the solar corona are used to obtain temperature maps which are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multiscale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.
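
    The POD step referred to above is commonly computed as a singular value decomposition of a mean-subtracted snapshot matrix. A minimal sketch on a synthetic field (the data and shapes are illustrative, not the paper's):

    ```python
    import numpy as np

    def pod_modes(snapshots, n_modes):
        """POD via thin SVD of the mean-subtracted snapshot matrix
        (rows: spatial points, columns: time snapshots).
        Returns spatial modes, singular values, temporal coefficients."""
        fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
        return U[:, :n_modes], s[:n_modes], Vt[:n_modes]

    # Synthetic "temperature maps": one dominant spatial structure plus noise
    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 50)
    t = np.linspace(0, 1, 20)
    field = (np.outer(np.sin(np.pi * x), np.cos(2 * np.pi * t))
             + 0.01 * rng.standard_normal((50, 20)))
    modes, sv, coeffs = pod_modes(field, 2)
    ```

    The decay of the singular values is what carries the multiscale statistics: a flare-active region and a quiet region distribute energy across modes differently.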

  3. Improvement of vibration and noise by applying analysis technology. Development of active control technique of engine noise in a car cabin. Kaiseki gijutsu wo oyoshita shindo-soon no kaizen. Shashitsunai engine soon akutibu seigyo gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Uchida, H.; Nakao, N.; Butsuen, T. (Matsuda Motor Corp., Hiroshima (Japan). Technology Research Inst.)

    1994-06-01

    It is difficult to reduce engine noise, the principal noise in a car cabin, without adversely affecting low-cost production. Active noise control (ANC) has been developed to reduce engine noise while remaining compatible with low-cost production. This paper discusses its control algorithm and the system configuration, and presents experimental results. The filtered-x least mean square method is a well-known ANC algorithm; however, it often requires an amount of computation exceeding the present capacity of a digital signal processor. An effective ANC algorithm is developed by exploiting the repetitiveness of the engine noise. This paper describes the basic theory of the control algorithm, the extension to a multiple-input and multiple-output system, the system configuration and experimental results. A noise control system with three microphones is designed with consideration of the spatial distribution of the noise and reduces noise in the whole cabin by up to 8 dB(A). Active noise control is applicable to many areas and can be used for the reduction of noise and vibration other than engine noise. 5 refs., 7 figs., 1 tab.
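
    The filtered-x LMS algorithm mentioned in the abstract can be sketched for a single channel. This is a generic textbook formulation, not the paper's reduced-computation variant; the secondary path, tone frequency and step size are all hypothetical:

    ```python
    import numpy as np

    def fxlms(reference, disturbance, sec_path, n_taps=16, mu=0.005):
        """Single-channel filtered-x LMS sketch. `sec_path` is the (assumed
        known) secondary-path impulse response; returns the residual error."""
        w = np.zeros(n_taps)                    # adaptive control filter
        x_buf = np.zeros(n_taps)                # reference history
        fx = np.convolve(reference, sec_path)[:len(reference)]
        fx_buf = np.zeros(n_taps)               # filtered-reference history
        s_buf = np.zeros(len(sec_path))         # control output history
        err = np.zeros(len(reference))
        for n in range(len(reference)):
            x_buf = np.roll(x_buf, 1); x_buf[0] = reference[n]
            fx_buf = np.roll(fx_buf, 1); fx_buf[0] = fx[n]
            y = w @ x_buf                       # anti-noise sample
            s_buf = np.roll(s_buf, 1); s_buf[0] = y
            err[n] = disturbance[n] + s_buf @ sec_path  # residual at microphone
            w -= mu * err[n] * fx_buf           # LMS weight update
        return err

    # Tonal "engine order" noise: the controller should attenuate it over time
    n = np.arange(4000)
    ref = np.sin(2 * np.pi * 0.05 * n)
    dist = 0.8 * np.sin(2 * np.pi * 0.05 * n + 0.3)
    res = fxlms(ref, dist, sec_path=np.array([0.0, 1.0, 0.5]))
    ```

    The per-sample cost of the generic form (two inner products plus the weight update per channel) is what the paper's engine-noise periodicity trick reduces.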

  4. Data analysis techniques for nuclear and particle physicists

    CERN Document Server

    Pruneau, Claude

    2017-01-01

    This is an advanced data analysis textbook for scientists specializing in the areas of particle physics, nuclear physics, and related subfields. As a practical guide for robust, comprehensive data analysis, it focuses on realistic techniques to explain instrumental effects. The topics are relevant for engineers, scientists, and astroscientists working in the fields of geophysics, chemistry, and the physical sciences. The book serves as a reference for more senior scientists while being eminently accessible to advanced undergraduate and graduate students.

  5. Optimization Techniques for Analysis of Biological and Social Networks

    Science.gov (United States)

    2012-03-28

    ...systematic fashion under a unifying theoretical and algorithmic framework. Keywords: Optimization, Complex Networks, Social Network Analysis, Computational... ...analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine... ...exact solutions are presented. In [3], we introduce the variable objective search framework for combinatorial optimization. The method utilizes...

  6. Characterization of archaeological ceramics from the north western lowland Maya Area, using the technique of neutron activation analysis; Caracterizacion de ceramicas arqueologicas de las tierras bajas noroccidentales del Area Maya, empleando la tecnica de activacion neutronica

    Energy Technology Data Exchange (ETDEWEB)

    Lopez R, M. C.; Tenorio, D.; Jimenez R, M. [ININ, Carretera Mexico-Toluca s/n, Ocoyoacac 52750, Estado de Mexico (Mexico); Terreros, E. [Museo del Templo Mayor, INAH, Seminario No. 8, Col. Centro, Mexico 06060, D. F. (Mexico); Ochoa, L. [UNAM, Instituto de Investigaciones Antropologicas, Circuito Exterior s/n, Ciudad Universitaria, Mexico 04510, D. F. (Mexico)

    2008-07-01

    This is a study of 50 ceramic samples from various archaeological sites of the north-western lowland Maya Area. The study was performed by neutron activation analysis of 19 chemical elements, followed by the relevant statistical treatment of the data. Significant differences were found among the pieces, leading to their classification into five major groups that differ in the site of manufacture and therefore in the raw materials used. (Author)

  7. What Child Analysis Can Teach Us about Psychoanalytic Technique.

    Science.gov (United States)

    Ablon, Steven Luria

    2014-01-01

    Child analysis has much to teach us about analytic technique. Children have an innate, developmentally driven sense of analytic process. Children in analysis underscore the importance of an understanding and belief in the therapeutic action of play, the provisional aspects of play, and that not all play will be understood. Each analysis requires learning a new play signature that is constantly reorganized. Child analysis emphasizes the emergence and integration of dissociated states, the negotiation of self-other relationships, the importance of co-creation, and the child's awareness of the analyst's sensibility. Child analysis highlights the robust nature of transference and how working through and repairing is related to the initiation of coordinated patterns of high predictability in the context of deep attachments. I will illustrate these and other ideas in the description of the analysis of a nine-year-old boy.

  8. A comparative study on change vector analysis based change detection techniques

    Indian Academy of Sciences (India)

    Sartajvir Singh; Rajneesh Talwar

    2014-12-01

    Detection of Earth surface changes is essential to monitor regional climate, snow avalanche hazards and energy balance studies affected by air temperature irregularities. Geographic Information Systems (GIS) enable such research activities to be carried out through change detection analysis. From this viewpoint, different change detection algorithms have been developed for land-use land-cover (LULC) regions. Among them, change vector analysis (CVA) has a level-headed capability of extracting maximum information, in terms of the overall magnitude of change and the direction of change between multispectral bands, from multi-temporal satellite data sets. Over the past two to three decades, many effective CVA-based change detection techniques, e.g., improved change vector analysis (ICVA), modified change vector analysis (MCVA) and change vector analysis in posterior-probability space (CVAPS), have been developed to overcome the difficulties that exist in traditional change vector analysis (CVA). Moreover, many integrated techniques have been proposed as effective LULC change detection tools, such as cross-correlogram spectral matching (CCSM) based CVA, CVA using enhanced principal component analysis (PCA) and the inverse triangular (IT) function, hyper-spherical direction cosine (HSDC), and median CVA (m-CVA). This paper comprises a comparative analysis of CVA-based change detection techniques such as CVA, MCVA, ICVA and CVAPS. It also summarizes the relevant integrated CVA techniques along with their characteristics, features and shortcomings. Based on experimental outcomes, it has been found that the CVAPS technique has greater potential than other CVA techniques to evaluate the overall transformed information over three different MODerate resolution Imaging Spectroradiometer (MODIS) satellite data sets of different regions. Results of this study are expected to be potentially useful for more accurate analysis of LULC changes which will, in turn...
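
    The two core CVA quantities, the magnitude and direction of the change vector between two dates, can be sketched for a single two-band pixel (the reflectance values are hypothetical):

    ```python
    import math

    def change_vector(pixel_t1, pixel_t2):
        """Change vector analysis for one pixel over two spectral bands:
        magnitude  = Euclidean norm of the difference vector,
        direction  = angle of the difference vector (radians)."""
        d = [b2 - b1 for b1, b2 in zip(pixel_t1, pixel_t2)]
        magnitude = math.hypot(d[0], d[1])
        direction = math.atan2(d[1], d[0])
        return magnitude, direction

    # Hypothetical reflectances at two dates in two bands
    mag, ang = change_vector((0.30, 0.40), (0.42, 0.35))
    ```

    Variants such as MCVA, ICVA and CVAPS keep this magnitude/direction idea but change how the magnitude is thresholded or which space (e.g. posterior-probability space) the vector is computed in.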

  9. Developing techniques for cause-responsibility analysis of occupational accidents.

    Science.gov (United States)

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine social responsibility and the role of the groups involved in work-related accidents. The study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study involves two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for determination of the responsible groups and their responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for the determination of a detailed list of tasks, responsibilities, and their rates, and are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  10. Design, data analysis and sampling techniques for clinical research.

    Science.gov (United States)

    Suresh, Karthik; Thomas, Sanjeev V; Suresh, Geetha

    2011-10-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inference from their data analysis. Improper application of study design and data analysis may render insufficient and improper results and conclusion. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design and then back-translating the statistical results into relevant medical knowledge is a real challenge. This article explains various sampling methods that can be appropriately used in medical research with different scenarios and challenges.

  11. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    Metabolic engineering has been defined as the purposeful modification of intermediary metabolism using recombinant DNA techniques. With this definition metabolic engineering includes: (1) inserting new pathways in microorganisms with the aim of producing novel metabolites, e.g., production of polyketides by Streptomyces; (2) production of heterologous peptides, e.g., production of human insulin, erythropoietin, and tPA; and (3) improvement of both new and existing processes, e.g., production of antibiotics and industrial enzymes. Metabolic engineering is a multidisciplinary approach, which involves... Improvement of a given process requires analysis of the underlying mechanisms, at best, at the molecular level. To reveal these mechanisms a number of different techniques may be applied: (1) detailed physiological studies, (2) metabolic flux analysis (MFA), (3) metabolic control analysis (MCA), (4) thermodynamic...

  12. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signatures is described, as are the vision instruments for food analysis and the datasets of the food items used in this thesis. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis, and linear versus non-linear approaches. One supervised feature selection algorithm... (SSPCA) and DCT-based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods, together with some other state-of-the-art statistical and mathematical analysis techniques, are applied on datasets of different food items: meat, dairies, fruits...

  13. Electrogastrography: A Noninvasive Technique to Evaluate Gastric Electrical Activity

    Directory of Open Access Journals (Sweden)

    Claudia P Sanmiguel

    1998-01-01

    Full Text Available Electrogastrography (EGG) is the recording of gastric electrical activity (GEA) from the body surface. The cutaneous signal is low in amplitude and consequently must be amplified considerably. The resultant signal is heavily contaminated with noise, and visual analysis alone of an EGG signal is inadequate. Consequently, EGG recordings require special methodology for acquisition, processing and analysis. Essential components of this methodology are an adequate system of digital filtering, amplification and analysis, along with minimization of the sources of external noise (random motions of the patient, electrode-skin interface impedance, electrode bending, obesity, etc.) and a quantitative interpretation of the recordings. There is a close relationship between GEA and gastric motility. Although it has been demonstrated that the EGG satisfactorily reflects the internal GEA frequency, there is no acceptable correlation with gastric contractions or gastric emptying. Many attempts have been made to relate EGG 'abnormalities' to clinical syndromes and diseases; however, the diagnostic and clinical value of the EGG is still very much in question.
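
    The quantitative interpretation mentioned above typically centres on the dominant frequency of the slow wave (normally near 3 cycles per minute). A minimal spectral sketch on a synthetic trace, with hypothetical sampling parameters:

    ```python
    import numpy as np

    def dominant_frequency_cpm(signal, fs_hz):
        """Dominant frequency of an EGG trace in cycles per minute,
        taken from the peak of the magnitude spectrum."""
        sig = signal - signal.mean()          # remove DC offset
        spec = np.abs(np.fft.rfft(sig))
        freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs_hz)
        return 60.0 * freqs[np.argmax(spec)]

    # Synthetic 3 cpm slow wave sampled at 1 Hz for 10 minutes, plus noise
    t = np.arange(600)
    egg = (np.sin(2 * np.pi * (3.0 / 60.0) * t)
           + 0.1 * np.random.default_rng(1).standard_normal(600))
    f_cpm = dominant_frequency_cpm(egg, fs_hz=1.0)
    ```

    In practice a band-pass filter (roughly 1-9 cpm) would precede this step to suppress respiratory and motion artifacts, which is the digital-filtering requirement the abstract stresses.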

  14. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T, which is shown to be approximated by a χ2 distribution. Application of this test to the results of determinations of manganese in human serum by a method of established precision led to the detection of airborne pollution of the serum during the sampling process. The subsequent improvement in sampling conditions was shown to give not only increased precision, but also improved accuracy of the results.
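
    A common form of such a duplicate test, T = Σ (x₁ − x₂)² / (s₁² + s₂²) compared against a χ² distribution with one degree of freedom per pair, can be sketched as follows; the pairing convention and the numbers are illustrative assumptions, not taken from the paper:

    ```python
    def t_statistic(duplicates):
        """T = sum over duplicate pairs of (x1 - x2)^2 / (s1^2 + s2^2),
        where each result carries its own stated uncertainty. Compared
        against chi-square with len(duplicates) degrees of freedom."""
        return sum((x1 - x2) ** 2 / (s1 ** 2 + s2 ** 2)
                   for (x1, s1), (x2, s2) in duplicates)

    # Hypothetical duplicate Mn determinations: (value, stated uncertainty)
    pairs = [((0.52, 0.03), (0.55, 0.03)),
             ((0.48, 0.02), (0.47, 0.02)),
             ((0.60, 0.04), (0.71, 0.04))]
    T = t_statistic(pairs)
    ```

    A T far above the χ² expectation signals an unaccounted-for variance source, which is exactly how the airborne contamination during sampling was detected.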

  15. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of generally followed data analysis techniques in the field of service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as mean, t-Test, ANOVA and correlation. The marked shift in orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of reviewers of journals. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  16. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.

  17. Evaluation of Damping Using Frequency Domain Operational Modal Analysis Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Georgakis, Christos T.; Brincker, Rune

    2015-01-01

    Operational Modal Analysis (OMA) techniques provide in most cases reasonably accurate estimates of structural frequencies and mode shapes. In contrast, they are known to often produce uncertain structural damping estimates, which is mainly due to inherent random and/or bias errors... Two frequency domain techniques are considered: the Frequency Domain Decomposition (FDD) and the Frequency Domain Polyreference (FDPR). The response of a two degree-of-freedom (2DOF) system is numerically established with specified modal parameters subjected to white noise loading. The system identification is evaluated with well...
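
    The FDD idea referred to in the abstract is a singular value decomposition of the cross-spectral density matrix at each frequency line; modes appear as peaks of the first singular value. A deliberately simplified sketch (single-snapshot CSD estimate, synthetic two-channel tone, all parameters hypothetical):

    ```python
    import numpy as np

    def fdd_first_mode(responses, fs):
        """FDD sketch: SVD of a crude cross-spectral density estimate at each
        frequency; returns the peak frequency of the first singular value
        and the corresponding singular vector (mode shape estimate)."""
        n_ch, n = responses.shape
        spectra = np.fft.rfft(responses * np.hanning(n), axis=1)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        s1 = np.empty(len(freqs))
        shapes = np.empty((len(freqs), n_ch), dtype=complex)
        for k in range(len(freqs)):
            G = np.outer(spectra[:, k], spectra[:, k].conj())  # CSD at line k
            U, s, _ = np.linalg.svd(G)
            s1[k] = s[0]
            shapes[k] = U[:, 0]
        kpk = np.argmax(s1)
        return freqs[kpk], shapes[kpk]

    # Two channels dominated by a 5 Hz mode, second channel at half amplitude
    t = np.arange(0, 8, 1 / 64)
    resp = np.vstack([np.sin(2 * np.pi * 5 * t),
                      0.5 * np.sin(2 * np.pi * 5 * t + 0.1)])
    f_peak, shape = fdd_first_mode(resp, fs=64)
    ```

    Damping estimation, the uncertain part the paper studies, requires going further than this sketch, e.g. by isolating the SDOF bell around the peak and transforming it back to a correlation function.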

  18. Analysis On Classification Techniques In Mammographic Mass Data Set

    Directory of Open Access Journals (Sweden)

    Mrs. K. K. Kavitha

    2015-07-01

    Full Text Available Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques deal with determining the group with which each data instance is associated. They can handle a wide variety of data, so that large amounts of data can be involved in processing. This paper deals with an analysis of various data mining classification techniques, such as Decision Tree Induction, Naïve Bayes, and k-Nearest Neighbour (KNN) classifiers, on a mammographic mass dataset.
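
    Of the classifiers listed, k-Nearest Neighbour is the simplest to sketch: classify a query point by majority vote among its k closest training points. The two-feature toy data below is hypothetical, not the mammographic mass dataset itself:

    ```python
    import math
    from collections import Counter

    def knn_predict(train, query, k=3):
        """k-Nearest-Neighbour sketch: majority vote among the k training
        points closest to the query by Euclidean distance."""
        nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

    # Hypothetical (feature-vector, label) training pairs
    train = [((1.0, 1.0), 'benign'), ((1.2, 0.9), 'benign'),
             ((4.0, 4.2), 'malignant'), ((3.8, 4.0), 'malignant'),
             ((4.1, 3.9), 'malignant')]
    label = knn_predict(train, (4.0, 4.0))
    ```

    Decision trees and Naïve Bayes trade this instance-based approach for an explicit model, which is the comparison the paper carries out on the real dataset.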

  19. Golden glazes analysis by PIGE and PIXE techniques

    Science.gov (United States)

    Fonseca, M.; Luís, H.; Franco, N.; Reis, M. A.; Chaves, P. C.; Taborda, A.; Cruz, J.; Galaviz, D.; Fernandes, N.; Vieira, P.; Ribeiro, J. P.; Jesus, A. P.

    2011-12-01

    We present the analysis performed on the chemical composition of two golden glazes available in the market using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis on thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  20. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
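The shift from deterministic to probabilistic analysis described above amounts to propagating input uncertainties through the model, for example by Monte Carlo sampling. The toy power-capability model below uses invented numbers, not ISS or SPACE data.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# toy solar power capability model (all numbers illustrative):
# P = area * irradiance * efficiency * (1 - distribution losses)
area = rng.normal(100.0, 2.0, N)     # m^2, uncertainty in effective array area
irr = rng.normal(1360.0, 10.0, N)    # W/m^2
eff = rng.normal(0.14, 0.005, N)     # cell efficiency
loss = rng.uniform(0.05, 0.10, N)    # harness/distribution losses

P = area * irr * eff * (1.0 - loss) / 1000.0   # kW

# deterministic single-point answer vs. probabilistic spread
P_det = 100.0 * 1360.0 * 0.14 * (1 - 0.075) / 1000.0
print(round(P_det, 1), round(float(P.mean()), 1), round(float(np.percentile(P, 5)), 1))
```

The 5th percentile is the kind of figure a deterministic run cannot provide: a power level that the system exceeds with 95% confidence given the assumed input uncertainties.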

  1. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  2. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
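Static (Guyan) condensation, the simplest of the condensation techniques compared above, can be sketched on a small spring chain. The partitioning into master and slave DOFs and the stiffness values below are arbitrary illustrative choices.

```python
import numpy as np

# 4-DOF spring chain; condense out the interior "slave" DOFs (1, 2)
# and keep the "master" DOFs (0, 3)
k = 1000.0
K = k * np.array([[ 2, -1,  0,  0],
                  [-1,  2, -1,  0],
                  [ 0, -1,  2, -1],
                  [ 0,  0, -1,  2]])
m, s = [0, 3], [1, 2]
Kmm, Kms = K[np.ix_(m, m)], K[np.ix_(m, s)]
Ksm, Kss = K[np.ix_(s, m)], K[np.ix_(s, s)]

# Guyan transformation: u_s = -Kss^{-1} Ksm u_m
T = np.zeros((4, 2))
T[m] = np.eye(2)
T[s] = -np.linalg.solve(Kss, Ksm)
K_red = T.T @ K @ T    # equals the Schur complement Kmm - Kms Kss^{-1} Ksm

# static condensation is exact for loads applied at the masters
f = np.array([10.0, 0.0, 0.0, -5.0])
u_full = np.linalg.solve(K, f)
u_red = np.linalg.solve(K_red, f[m])
print(np.allclose(u_red, u_full[m]))   # → True
```

Dynamic, SEREP and iterative condensation schemes replace the static transformation `T` with frequency-dependent or mode-based variants to reduce the error this static map introduces in dynamic analyses.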

  3. Large areas elemental mapping by ion beam analysis techniques

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-rays Emission (PIXE) measurements, a Gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results, using a large (60-cm range) XYZ computer controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, and fossils. Due to its particular characteristics, this is a unique device in the sense of multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with the traditional in-vacuum ion-beam analysis with the advantage of automatic rastering.

  4. Innovative techniques to analyze time series of geomagnetic activity indices

    Science.gov (United States)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos

    2016-04-01

    Magnetic storms are undoubtedly among the most important phenomena in space physics and also a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization. Other entropy measures such as Block Entropy, T-Complexity, Approximate Entropy, Sample Entropy and Fuzzy Entropy verify the above mentioned result. Importantly, the wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, which is characterized by a fractional Brownian anti-persistent behavior. Finally, we observe universality in the magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for the Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support the aforementioned proposal.
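The Tsallis entropy itself is a simple functional of a probability distribution, S_q = (1 - Σ p_i^q) / (q - 1). The sketch below applies it to a synthetic Dst-like signal; the storm profile and the choices of q and bin count are illustrative, not those of the study.

```python
import numpy as np

def tsallis_entropy(x, q=1.8, bins=16):
    # histogram the signal and apply S_q = (1 - sum p_i^q) / (q - 1)
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / len(x)
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

rng = np.random.default_rng(0)
t = np.arange(4096)
quiet = rng.normal(0.0, 5.0, 4096)                                # "normal" period stand-in
storm = quiet - 150.0 * np.exp(-0.5 * ((t - 2048) / 200.0) ** 2)  # Dst-like depression

# lower entropy = higher degree of organization (the storm signature)
print(tsallis_entropy(storm) < tsallis_entropy(quiet))   # → True
```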

  5. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques...... generally focus on two things: Obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group...... variables. The two groups of methods are compared and the pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to choice of method, and the choice of method should thus be driven by the nature...

  6. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  7. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    Science.gov (United States)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from - 125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to - 170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
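The conversion mentioned above, from flash diffusivity to thermal conductivity, is a simple product, λ(T) = α(T)·cp(T)·ρ(T). The PTFE property values below are typical room-temperature literature figures, not the paper's measured data.

```python
# thermal conductivity from laser-flash diffusivity, specific heat and density
# (illustrative room-temperature PTFE values)
alpha = 0.11e-6   # m^2/s, thermal diffusivity from the laser flash technique
cp = 1.0e3        # J/(kg K), specific heat from DSC
rho = 2.16e3      # kg/m^3, density from dilatometry

lam = alpha * cp * rho   # W/(m K)
print(round(lam, 3))     # → 0.238
```

Repeating the product at each measured temperature yields the conductivity curve over the full -125 °C to 150 °C range.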

  8. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  9. Chromatographic Fingerprint Analysis of Marrubiin in Marrubium vulgare L. via HPTLC Technique

    OpenAIRE

    Keyvan Yousefi; Sanaz Hamedeyazdan; Mohammadali Torbati; Fatemeh Fathiazad

    2016-01-01

    Purpose: In the present study we aimed to quantify marrubiin, as the major active compound, in the aerial parts of Marrubium vulgare from Iran using a HPTLC-densitometry technique. Methods: Quantitative determination of marrubiin in M. vulgare methanol extract was performed by HPTLC analysis via a fully automated TLC scanner. Later on, the in vitro antioxidant activity of the M. vulgare methanol extract was determined using 1,1-diphenyl-2-picryl-hydrazil (DPPH) free radic...

  10. Technique of Hadamard transform microscope fluorescence image analysis

    Institute of Scientific and Technical Information of China (English)

    梅二文; 顾文芳; 曾晓斌; 陈观铨; 曾云鹗

    1995-01-01

    Hadamard transform spatial multiplexed imaging technique is combined with a fluorescence microscope, and an instrument for Hadamard transform microscope fluorescence image analysis is developed. Images acquired by this instrument can provide a lot of useful information simultaneously, including the three-dimensional Hadamard transform microscope cell fluorescence image, the fluorescence intensity and fluorescence distribution of a cell, the background signal intensity and the signal/noise ratio, etc.
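The multiplexing idea behind Hadamard transform imaging can be sketched with a Sylvester-type Hadamard matrix: instead of measuring each pixel separately, measure weighted sums of all pixels (rows of H), then demultiplex exactly. The pixel count and intensities below are invented for illustration.

```python
import numpy as np
from scipy.linalg import hadamard

n = 16
H = hadamard(n)                       # entries +/-1, satisfies H @ H.T = n * I
x = np.linspace(0.0, 1.0, n) ** 2     # hypothetical fluorescence intensities

y = H @ x                             # n multiplexed measurements
x_rec = (H.T @ y) / n                 # demultiplex: H^{-1} = H.T / n
print(np.allclose(x_rec, x))          # → True
```

Because every measurement collects light from all pixels at once, the demultiplexed image enjoys the multiplex (Fellgett) signal-to-noise advantage over point-by-point scanning when detector noise dominates.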

  11. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis

    OpenAIRE

    Heidmann, Iris; Schade-Kampmann, Grit; Lambalk, Joep; Ottiger, Marcel; Di Berardino, Marco

    2016-01-01

    Introduction An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a comm...

  12. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

    Full Text Available Calcium hardness of water samples has been determined using a method based upon the Energy Dispersive X-ray fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.

  13. Failure Analysis Seminar: Techniques and Teams. Seminar Notes. Volume I.

    Science.gov (United States)

    1981-01-01

    and Progress - Evaluate... FAILURE ANALYSIS STRATEGY, Augustine E. Magistro. Introduction: A primary task of management and systems... by Augustine Magistro, Picatinny Arsenal, and Lawrence R. Seggel, U.S. Army Missile Command. The report is available from the National Technical... to emphasize techniques - identification and improvement of your leadership styles. BIOGRAPHIC SKETCHES: A.E. "Gus" Magistro - Systems Evaluation

  14. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
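The transfer function technique can be sketched as FFT-based deconvolution: model the rear-side temperature as the energy flux convolved with the calorimeter's thermal impulse response, then divide in the frequency domain, with a small regularization term to control noise amplification. All signals and parameters below are synthetic stand-ins, not the diagnostic's calibration data.

```python
import numpy as np

n = 1024
t = np.arange(n) * 0.01                        # s
h = np.exp(-t / 0.5); h /= h.sum()             # assumed thermal impulse response

flux = np.zeros(n); flux[100:200] = 1.0        # square beam pulse on the front side
temp = np.real(np.fft.ifft(np.fft.fft(flux) * np.fft.fft(h)))   # rear-side response
temp += 1e-6 * np.random.default_rng(0).standard_normal(n)      # measurement noise

# deconvolution by the transfer function, with Tikhonov-style regularization
H = np.fft.fft(h)
eps = 1e-4
F_rec = np.fft.fft(temp) * np.conj(H) / (np.abs(H) ** 2 + eps)
flux_rec = np.real(np.fft.ifft(F_rec))

corr = float(np.corrcoef(flux, flux_rec)[0, 1])
print(corr > 0.9)   # → True: the pulse shape is well recovered
```

The regularization constant plays the role discussed in the abstract's "limits and restrictions": too small and measurement noise is amplified at frequencies where |H| is tiny; too large and sharp flux features (adjacent beamlets) are smeared.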

  15. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  16. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed and is stimulated by the power of computers and microprocessors which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied for trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples provided that the method is adequately validated (selectivity!). The same holds for solid electrodes which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. Actually, in order to increase the responses and improve the selectivity, solid electrodes are the focus of rapidly expanding research dedicated to surface modification. Perm-selectivity, chelation, catalysis, etc. may be considered appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single-cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the

  17. Weed Identification Using An Automated Active Shape Matching (AASM) Technique

    DEFF Research Database (Denmark)

    Swain, K C; Nørremark, Michael; Jørgensen, R N

    2011-01-01

    Weed identification and control is a challenge for intercultural operations in agriculture. As an alternative to chemical pest control, a smart weed identification technique followed by mechanical weed control system could be developed. The proposed smart identification technique works on the con......Weed identification and control is a challenge for intercultural operations in agriculture. As an alternative to chemical pest control, a smart weed identification technique followed by mechanical weed control system could be developed. The proposed smart identification technique works......-leaf growth stage model for Solanum nigrum L. (nightshade) is generated from 32 segmented training images in Matlab software environment. Using the AASM algorithm, the leaf model was aligned and placed at the centre of the target plant and a model deformation process carried out. The parameters used...

  18. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature of cyberspace and the abuse of anonymity there, it is difficult to trace criminal identities in cybercrime investigation. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG seeded GA based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed feature and technique are evaluated on a real world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, it has been shown that IGAE is more scalable in terms of the number of authors than author group level based methods.
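A character n-gram writeprint can be sketched as a normalized trigram frequency profile compared by cosine similarity. This is a deliberately simplified stand-in for the paper's IGAE ensemble, and the author texts are toy snippets.

```python
from collections import Counter
import math

def profile(text, n=3):
    # normalized character n-gram frequency vector (the "writeprint")
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    norm = math.sqrt(sum(v * v for v in grams.values()))
    return {g: v / norm for g, v in grams.items()}

def cosine(p, q):
    return sum(w * q.get(g, 0.0) for g, w in p.items())

# fabricated one-review "corpora" for two authors
authors = {
    "A": profile("i really love this product, it works great and i love it so much"),
    "B": profile("the item does not conform to the specification provided herein"),
}
test = profile("i love it, this works great, i really do love it")
best = max(authors, key=lambda a: cosine(test, authors[a]))
print(best)   # → A
```

Real systems select among thousands of such n-gram features (the IG/GA step in the paper) and train a classifier per author rather than matching single profiles.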

  19. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified as follows: (A) hand calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data; (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
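The RSS combination performed by the LEA processor can be sketched directly: square and sum the single-error-source deviations, and note that the RSS values are the square roots of the diagonal of the deviation covariance matrix. The error sources and numbers below are fabricated for illustration, not SVDS output.

```python
import numpy as np

# each row: trajectory deviations at MECO from one 3-sigma error source,
# run as a single-error-source case (fabricated numbers)
# state order: [altitude (m), velocity (m/s), flight-path angle (deg)]
deviations = np.array([
    [120.0,  3.0, 0.010],   # e.g. thrust dispersion
    [-80.0,  1.5, 0.004],   # e.g. Isp dispersion
    [ 40.0, -0.8, 0.002],   # e.g. winds
])

rss = np.sqrt((deviations ** 2).sum(axis=0))   # combined 3-sigma dispersions
cov = deviations.T @ deviations                # covariance matrix of the deviations
print(np.round(rss, 3))
print(np.allclose(np.sqrt(np.diag(cov)), rss))   # → True: diagonal consistency check
```

The second check is exactly the hand verification (A) described in the abstract: the RSS vector must equal the square root of the covariance diagonal.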

  20. Arc-length technique for nonlinear finite element analysis

    Institute of Scientific and Technical Information of China (English)

    MEMON Bashir-Ahmed; SU Xiao-zu(苏小卒)

    2004-01-01

    Nonlinear solution of reinforced concrete structures, particularly the complete load-deflection response, requires tracing of the equilibrium path and proper treatment of the limit and bifurcation points. In this regard, ordinary solution techniques lead to instability near the limit points and also have problems in cases of snap-through and snap-back. Thus they fail to predict the complete load-displacement response. The arc-length method serves the purpose well in principle, has received wide acceptance in finite element analysis, and has been used extensively. However, modifications to the basic idea are vital to meet the particular needs of the analysis. This paper reviews some of the recent developments of the method in the last two decades, with particular emphasis on nonlinear finite element analysis of reinforced concrete structures.
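A minimal Crisfield-type arc-length iteration can be sketched on a one-DOF snap-through problem: augment the equilibrium residual with an arc-length constraint and solve both by Newton's method, so the path can be traced through the limit points where load control fails. The force-displacement curve and step sizes below are illustrative.

```python
import numpy as np

def F(u):   # internal force of a 1-DOF snap-through model (illustrative)
    return u**3 - 1.5 * u**2 + 0.6 * u

def dF(u):
    return 3 * u**2 - 3.0 * u + 0.6

P, dl = 1.0, 0.05          # reference load, arc-length increment
u, lam = 0.0, 0.0
du_prev, dlam_prev = 1.0, 1.0   # initial predictor direction
path = [(u, lam)]

for _ in range(60):
    # predictor: step along the previous increment, scaled to the arc length
    nrm = np.hypot(du_prev, dlam_prev)
    u1, lam1 = u + dl * du_prev / nrm, lam + dl * dlam_prev / nrm
    # corrector: Newton on {equilibrium residual, arc-length constraint}
    for _ in range(50):
        r = np.array([lam1 * P - F(u1),
                      (u1 - u)**2 + (lam1 - lam)**2 - dl**2])
        if np.hypot(*r) < 1e-12:
            break
        J = np.array([[-dF(u1), P],
                      [2 * (u1 - u), 2 * (lam1 - lam)]])
        du, dlam = np.linalg.solve(J, -r)
        u1, lam1 = u1 + du, lam1 + dlam
    du_prev, dlam_prev = u1 - u, lam1 - lam
    u, lam = u1, lam1
    path.append((u, lam))

lams = [l for _, l in path]
print(round(max(lams[:12]), 3), round(min(lams), 3))   # load rises, then snaps down
```

The augmented Jacobian stays nonsingular at the limit point (where dF = 0), which is precisely why the arc-length method can pass through it while ordinary load incrementation cannot.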

  1. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    One of the difficulties that goal-oriented requirements analyses encounter is that the efficiency of the goal refinement depends on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systematized analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integration of the goal and the problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  2. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2015-11-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. The social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the data content available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  4. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  5. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on the web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and making an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper presents experimental work in which web log data is used. We have taken the web log data from the “NASA” web server, which is analyzed with “Web Log Explorer”. Web Log Explorer is a web usage mining tool which plays a vital role in carrying out this work.

  6. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  7. Atmospheric Deposition of Heavy Metals around the Lead and Copper-Zinc Smelters in Baia Mare, Romania, Studied by the Moss Biomonitoring Technique, Neutron Activation Analysis and Flame Atomic Absorption Spectrometry

    CERN Document Server

    Culicov, O A; Steinnes, E; Okina, O S; Santa, Z; Todoran, R

    2002-01-01

    The mosses Pleurozium schreberi, Pseudoscleropodium purum and Rhytidiadelphus squarrosus were used as biomonitors to study the atmospheric deposition of heavy metals around the lead and copper-zinc smelters in Baia Mare. Samples representing the last three years' growth of moss or its green part, collected on the ground at 28 sites located 2-17 km from the source area, were analyzed by instrumental neutron activation analysis using epithermal neutrons and by flame atomic absorption spectrometry. A total of 31 elements were determined, including most of the heavy metals characteristic of emissions from this kind of industry. The observed data for Pb, As, Cu, and Cd are all high compared with those observed in other regions of Europe with similar industries, but the concentrations in moss approach regional background levels at a distance of about 8 km from the main source area. Factor analysis of the data distinguishes two industrial components, one characterized by Pb, Cu, As, and Sb, and another one by Zn and Cd...
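    The factor-extraction step mentioned above can be illustrated on synthetic data. The sketch below invents two emission sources with different element loadings (the numbers are not the study's measurements) and extracts factors via PCA on the standardized concentration matrix, a common first step of factor analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 28  # number of sampling sites, as in the study

# Two hypothetical emission sources (synthetic, for illustration only):
# source 1 loads on Pb, Cu, As, Sb; source 2 on Zn, Cd.
s1, s2 = rng.lognormal(0, 1, n), rng.lognormal(0, 1, n)
elements = ["Pb", "Cu", "As", "Sb", "Zn", "Cd"]
loadings = np.array([[1.0, 0.0], [0.9, 0.1], [0.8, 0.1], [0.7, 0.0],
                     [0.1, 1.0], [0.0, 0.9]])
X = np.column_stack([s1, s2]) @ loadings.T + rng.normal(0, 0.05, (n, 6))

# Factor extraction via PCA on the standardized matrix
Z = (X - X.mean(0)) / X.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
explained = S**2 / np.sum(S**2)
print("variance explained by first two factors:", round(explained[:2].sum(), 3))
```

With two underlying sources, the first two factors should account for nearly all the variance.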

  8. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    Science.gov (United States)

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches like Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more general rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA), for clustering categorical data using rough set indiscernibility relations, is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344
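    The rough-set notion underlying MIA, the indiscernibility relation, admits a compact sketch: objects fall into the same equivalence class when they agree on every attribute in the chosen set. The toy information table below is hypothetical; MIA itself involves more machinery than this partition step.

```python
from collections import defaultdict

# Toy categorical information table (hypothetical objects and attributes)
table = [
    {"colour": "red",  "shape": "round",  "size": "big"},
    {"colour": "red",  "shape": "round",  "size": "small"},
    {"colour": "blue", "shape": "square", "size": "big"},
    {"colour": "blue", "shape": "round",  "size": "big"},
]

def indiscernibility_classes(table, attrs):
    """Partition object indices into equivalence classes of IND(attrs):
    two objects are indiscernible if they agree on every attribute in attrs."""
    classes = defaultdict(list)
    for i, row in enumerate(table):
        key = tuple(row[a] for a in attrs)
        classes[key].append(i)
    return list(classes.values())

print(indiscernibility_classes(table, ["colour"]))           # [[0, 1], [2, 3]]
print(indiscernibility_classes(table, ["colour", "shape"]))  # [[0, 1], [2], [3]]
```

Adding attributes refines the partition, which is exactly what rough-set attribute selection criteria score.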

  9. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    Science.gov (United States)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scales information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, on how high in frequency an accurate estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of turbulence structure. Previous studies found that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, an aliasing problem would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates, since the pre-filtering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies on spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable
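    The alias-free property of Poisson sampling noted above can be sketched with SciPy's Lomb-Scargle periodogram: here a 150 Hz sinusoid is recovered from synthetic data whose mean sampling rate is only 100 Hz, i.e. well above the even-sampling Nyquist limit. The signal parameters are illustrative, not the author's data.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)

# Poisson (random) sampling in time, as for individual-realization LV data
mean_rate = 100.0                                  # mean data rate, Hz
t = np.cumsum(rng.exponential(1.0 / mean_rate, 2000))
f0 = 150.0                                         # signal frequency above the mean rate
x = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(t.size)

freqs = np.linspace(1.0, 250.0, 4000)              # trial frequencies, Hz
pgram = lombscargle(t, x - x.mean(), 2 * np.pi * freqs)  # angular frequencies
peak = freqs[np.argmax(pgram)]
print("spectral peak at", round(peak, 1), "Hz")
```

Note that `lombscargle` expects angular frequencies, hence the 2π factor.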

  10. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, a 200 millisecond residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  11. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the result of experimental investigation using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also want to introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive that can be applied in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian Textile (Genghis Khan and Kublai Khan Period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creating a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between 12th and 13th century. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  12. The effects of communication techniques on public relation activities: A sample of hospitality business

    Directory of Open Access Journals (Sweden)

    Şirvan Şen Demir

    2011-07-01

    Nowadays, the number of firms that attach importance to public relations is increasing rapidly. Modern firms either establish a public relations department within their organization or outsource this activity to consultants in order to communicate with their target audiences. Among firms in the tourism sector, hospitality companies use public relations the most. The purpose of this study is to investigate the communication techniques used in public relations and the effects of these techniques on public relations activities. A literature review was conducted to build the research model, and a questionnaire was then developed from studies in the literature. Data were collected by the researchers in face-to-face interviews with 145 supervisors responsible for the public relations activities of hotels, and were analyzed with the SPSS statistical package. The structural and convergent validity of the data was established through exploratory factor analysis. Regression analysis was used to determine the effects of the independent variables on the dependent variables. As a result, the independent variables were found to have positive effects on the dependent variables.

  13. Dynamic Range Analysis of the Phase Generated Carrier Demodulation Technique

    Directory of Open Access Journals (Sweden)

    M. J. Plotnikov

    2014-01-01

    The dependence of the dynamic range of the phase generated carrier (PGC) technique on the low-pass filter passbands is investigated using a simulation model. A nonlinear character of this dependence, which could lead to dynamic range limitations or measurement uncertainty, is presented for the first time. A detailed theoretical analysis is provided to verify the simulation results, and the results are consistent with the calculations performed. A method for calculating the low-pass filter passbands according to the required upper limit of the dynamic range is proposed.
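    For context, the arctangent variant of PGC demodulation, in which the interference signal is mixed with the first and second carrier harmonics and low-pass filtered, can be sketched as follows. The carrier frequency, modulation depth and filter passband below are illustrative choices, not the paper's parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.special import jv

fs, dur = 200_000, 0.2
t = np.arange(int(fs * dur)) / fs
f_c = 10_000.0               # phase generated carrier frequency, Hz
C = 2.63                     # modulation depth chosen so that J1(C) ~= J2(C)
phi = 1.0 + 0.5 * np.sin(2 * np.pi * 50 * t)   # signal phase to recover

# Interferometric intensity with phase generated carrier
I = 2.0 + 1.5 * np.cos(C * np.cos(2 * np.pi * f_c * t) + phi)

# Mix with first and second carrier harmonics, then low-pass filter
b, a = butter(4, 2_000 / (fs / 2))
P1 = filtfilt(b, a, I * np.cos(2 * np.pi * f_c * t))      # -> -B*J1(C)*sin(phi)
P2 = filtfilt(b, a, I * np.cos(2 * np.pi * 2 * f_c * t))  # -> -B*J2(C)*cos(phi)

phi_rec = np.arctan2(-P1 / jv(1, C), -P2 / jv(2, C))
err = np.abs(phi_rec - phi)[fs // 100 : -fs // 100]       # discard filter edges
print("max recovery error:", err.max())
```

The low-pass cutoff must pass the signal phase bandwidth while rejecting the carrier harmonics; too narrow a passband distorts the recovered phase, which is one way the passband limits the usable dynamic range.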

  14. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light from a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide up to the point where the amplitude of the surface waves is large enough to scatter the light out of the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and statistical analysis of the jet disruption are possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  15. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme Dynamics of Structures sponsored by the Danish Technical Research Council. The planned ... contents and the requirements for the project prior to its start are described together with the results obtained during the 3-year period of the project. The project was mainly carried out as a Ph.D. project by the first author from September 1994 to August 1997 in cooperation with Associate Professor Rune...
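    The random decrement technique itself admits a compact sketch: average all response segments that begin at an up-crossing of a trigger level; for a linear structure under broadband loading the average approximates the free decay, from which modal parameters can be fitted. The oscillator below is synthetic and its parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n = 256.0, 200_000
dt = 1.0 / fs

# Simulate a lightly damped SDOF oscillator under white-noise loading
f_n, zeta = 5.0, 0.02
w_n = 2 * np.pi * f_n
noise = 50.0 * rng.standard_normal(n)
x = np.zeros(n)
v = 0.0
for k in range(1, n):
    acc = noise[k] - 2 * zeta * w_n * v - w_n**2 * x[k - 1]
    v += acc * dt                     # semi-implicit Euler integration
    x[k] = x[k - 1] + v * dt

# Random decrement: average segments starting at up-crossings of the trigger
trigger, seg = x.std(), int(2 * fs)
starts = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
starts = starts[starts + seg < n]
rd = np.mean([x[s:s + seg] for s in starts], axis=0)
print("segments averaged:", len(starts))
```

The resulting signature `rd` starts near the trigger level and decays like the free response of the structure.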

  16. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...
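    The per-pixel calibration step described above can be sketched on a tiny synthetic "image": calibration frames of known, uniform dye concentration are used to fit an intensity-to-concentration mapping for every pixel, which is then applied to a new frame. The response model and all values below are assumptions for illustration, not the experiment's calibration.

```python
import numpy as np

rng = np.random.default_rng(3)
h, w = 4, 6                                   # tiny "image" for illustration

# Calibration images: uniform known dye concentrations (hypothetical values)
concs = np.array([0.0, 0.5, 1.0, 1.5, 2.0])   # g/l
gain = rng.uniform(0.6, 1.0, (h, w))          # per-pixel sensitivity
offset = rng.uniform(0.05, 0.15, (h, w))      # per-pixel background
cal = np.stack([offset + gain * np.log1p(c) for c in concs])  # assumed response

# Per-pixel calibration: fit a quadratic intensity -> concentration mapping
coeffs = np.empty((3, h, w))
for i in range(h):
    for j in range(w):
        coeffs[:, i, j] = np.polyfit(cal[:, i, j], concs, 2)

# Apply the calibration to a new frame of (unknown) concentration 1.2 g/l
frame = offset + gain * np.log1p(1.2)
est = coeffs[0] * frame**2 + coeffs[1] * frame + coeffs[2]
print("mean estimated concentration:", round(float(est.mean()), 3))
```

Calibrating each pixel separately absorbs spatial non-uniformity in lighting and camera response, which is the point of the per-pixel procedure.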

  17. Golden glazes analysis by PIGE and PIXE techniques

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, M., E-mail: mmfonseca@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Luis, H., E-mail: heliofluis@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Franco, N., E-mail: nfranco@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Reis, M.A., E-mail: mareis@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Chaves, P.C., E-mail: cchaves@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Taborda, A., E-mail: galaviz@cii.fc.ul.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Cruz, J., E-mail: jdc@fct.unl.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Galaviz, D., E-mail: ataborda@itn.pt [Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Dept. Fisica, Faculdade de Ciencias, Universidade de Lisboa, Lisboa (Portugal); and others

    2011-12-15

    We present the analysis performed on the chemical composition of two golden glazes available in the market using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis of thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  18. BaTMAn: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
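    The merging criterion described above, joining spatial elements while they are statistically consistent with carrying the same signal, can be sketched in one dimension. This is a simplified greedy stand-in for BaTMAn's algorithm, on synthetic data, not the published implementation.

```python
import numpy as np

def merge_consistent(signal, err, nsigma=3.0):
    """Greedy 1-D tessellation: repeatedly merge adjacent segments whose
    inverse-variance weighted means agree within nsigma of their combined error."""
    segs = [[i] for i in range(len(signal))]
    merged = True
    while merged:
        merged = False
        out = [segs[0]]
        for seg in segs[1:]:
            a, b = out[-1], seg
            wa, wb = 1 / err[a]**2, 1 / err[b]**2
            ma = np.sum(wa * signal[a]) / np.sum(wa)
            mb = np.sum(wb * signal[b]) / np.sum(wb)
            sa, sb = 1 / np.sqrt(np.sum(wa)), 1 / np.sqrt(np.sum(wb))
            if abs(ma - mb) < nsigma * np.hypot(sa, sb):
                out[-1] = a + b          # identical signal within errors: merge
                merged = True
            else:
                out.append(b)
        segs = out
    return segs

rng = np.random.default_rng(4)
truth = np.repeat([0.0, 5.0, 0.0], [20, 10, 20])   # flat background + a "source"
err = np.full(truth.size, 0.5)
obs = truth + rng.normal(0, 0.5, truth.size)
segments = merge_consistent(obs, err)
print("number of segments:", len(segments))
```

Regions with identical underlying signal collapse into large segments, while the statistically significant jump survives as a boundary.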

  19. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the granite instantaneous rockburst process. Based on the PIV (Particle Image Velocimetry) technique, quantitative analysis of a rockburst is possible: images of tracer particles, displacement fields and strain fields can be obtained, and the debris trajectory described. According to the observation of on-site tests, the dynamic rockburst is actually a gas-solid high-speed flow process, caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high-speed video and PIV images, the granite rockburst failure process is shown to be composed of six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.

  20. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Decision making, both on the individual and the organizational level, is always accompanied by a search for others' opinions. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve as a benefit to the market if its semantic orientation is determined. Opinion mining and sentiment analysis are the formal disciplines for studying and interpreting opinions and sentiments. The digital ecosystem has itself paved the way for the recording of huge volumes of opinionated data. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.

  1. Recovering prehistoric woodworking skills using spatial analysis techniques

    Science.gov (United States)

    Kovács, K.; Hanke, K.

    2015-08-01

    Ancient woodworking skills can be recovered through the simultaneous documentation and analysis of tangible evidence such as the geometric parameters of prehistoric hand tools or the fine morphological characteristics of well-preserved wooden archaeological finds. During this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed within this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from the spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these objects was also inferred, and these woodworking skills could be quantified in the case of a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for related intangible heritage interpretations in the future.

  2. Pressure transient analysis for long homogeneous reservoirs using TDS technique

    Energy Technology Data Exchange (ETDEWEB)

    Escobar, Freddy Humberto [Universidad Surcolombiana, Av. Pastrana - Cra. 1, Neiva, Huila (Colombia); Hernandez, Yuly Andrea [Hocol S.A., Cra. 7 No 114-43, Floor 16, Bogota (Colombia); Hernandez, Claudia Marcela [Weatherford, Cra. 7 No 81-90, Neiva, Huila (Colombia)

    2007-08-15

    A significant number of well pressure tests are conducted in long, narrow reservoirs with closed and open extreme boundaries. It is desirable not only to appropriately identify these types of systems but also to develop an adequate and practical interpretation technique to determine their parameters and size, when possible. An accurate understanding of how the reservoir produces and of the magnitude of producible reserves can lead to competent decisions and adequate reservoir management. So far, studies found on the identification and determination of parameters for such systems are conducted by conventional techniques (semilog analysis) and by semilog and log-log type-curve matching of pressure versus time. Type-curve matching is basically a trial-and-error procedure which may provide inaccurate results, and the limited number of type curves plays a negative role. In this paper, a detailed analysis of pressure derivative behavior for a vertical well in linear reservoirs with open and closed extreme boundaries is presented for the case of constant-rate production. We studied each flow regime independently, especially the linear flow regime, since it is the most characteristic 'fingerprint' of these systems. We found that when the well is located at one of the extremes of the reservoir, a single linear flow regime develops once radial flow and/or wellbore storage effects have ended. When the well is located at a given distance from both extreme boundaries, the pressure derivative permits the identification of two linear flows toward the well; this has been called the 'dual-linear flow regime'. This is characterized by an increase of the intercept of the 1/2-slope line from π^0.5 to π, with a consequent transition between these two straight lines. The identification of intersection points, lines, and characteristic slopes allows us to develop an interpretation technique without employing type-curve matching. This technique uses
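    The TDS-style reading of characteristic slopes from the log-log pressure-derivative plot can be sketched on synthetic data. The drawdown model below (a logarithmic radial-flow term plus a square-root linear-flow term, with invented coefficients) is an assumption for illustration; the sketch only shows the derivative computation and the half-slope signature of linear flow, not the full set of TDS intercept equations.

```python
import numpy as np

# Synthetic drawdown: radial flow (log t) transitioning to linear flow (sqrt t)
t = np.logspace(-2, 3, 300)                        # time, hours
dp = 2.0 * np.log(t + 1.0) + 8.0 * np.sqrt(t)      # hypothetical delta-p, psi

# Pressure derivative t * d(dp)/dt = d(dp)/d(ln t), computed numerically
lnt = np.log(t)
deriv = np.gradient(dp, lnt)

# On log-log axes, linear flow shows a characteristic half slope at late time
late = t > 50
slope = np.polyfit(np.log(t[late]), np.log(deriv[late]), 1)[0]
print("late-time log-log slope of the derivative:", round(slope, 2))
```

In field practice the derivative is smoothed (e.g. the Bourdet three-point scheme) before slopes and intercepts are read off.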

  3. An effective technique for isolating adult activated Schwann cells

    Institute of Scientific and Technical Information of China (English)

    Jifei Zhang; Lianhong Jin; Yuzhen Zhao

    2006-01-01

    BACKGROUND: Schwann cells (SCs) are the neuroglial cells of peripheral nerve and play a key role in repairing peripheral nerve injury. Obtaining actively proliferating, high-purity adult SCs in vitro therefore provides an important basis for clinical transplantation of SCs after nerve injury and opens a new therapeutic avenue for nerve injury. OBJECTIVE: To investigate an effective technique for isolating adult activated Schwann cells. DESIGN: Controlled observational study. SETTING: Mudanjiang Medical College. MATERIALS: The experiment was completed at the Department of Medical Genetics of Harbin Medical University from March 2003 to April 2005. Healthy female Wistar rats, aged 2 months and weighing 150-160 g, were randomly divided into 3 groups with 5 in each group. METHODS: The right sciatic nerves of 15 Wistar rats were exposed and transected at mid-thigh under pentobarbital anesthesia (4 mg/kg, i.p.). Seven days later, the distal segments of the predegenerated nerves were removed and used to produce adult Schwann cell cultures. The distal segment of the predegenerated nerve, 20 mm in length, was resected. The nerve was cut into pieces 1 mm in length and incubated for 3 hours under CO2 at 37 °C with an enzyme mixture of 0.05% collagenase/dispase. Rats were divided into 3 groups: (1) Group 1: The nerve fragments were explanted in poly-L-lysine- and laminin-coated dishes with BS medium from the 1st to the 6th day. On the 6th day, the fragments were moved into a new poly-L-lysine-laminin-coated dish and the BS medium was changed to BS with 10% FBS. The nerve fragments were replaced repeatedly in the same way in new dishes on the 12th and the 18th days. (2) Group 2: For the first 3 days, the nerve fragments were fed with BS with 10% FBS. This medium was changed to BS medium on the third day. The nerve fragments were moved to another dish on day 6 and the BS medium was changed to BS with 25 mL/L FBS. Hereafter the culture method was the same as

  4. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have in the past relied on mathematical analysis that was quite capable of phenomenological prediction but was not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained using first the established equations of transport phenomena. Then, microscopic- and molecular-level modeling techniques are described using porous media theory and chemical kinetic theory, and applied to cerebrospinal fluid (CSF) dynamics. Using the techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.

  5. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    Science.gov (United States)

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded speckled intensity distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their cores' structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture in the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography over an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.
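    The pseudophase construction via the Riesz transform can be sketched in the Fourier domain, where the two Riesz components correspond to the kernels -i·u/|q| and -i·v/|q|. The speckle-like input below is synthetic, and this is a generic Riesz-transform sketch rather than the authors' exact variant of the integral transform.

```python
import numpy as np

def riesz_pseudophase(img):
    """Pseudophase map from a 2-D intensity pattern via the Riesz transform,
    computed in the Fourier domain."""
    f = img - img.mean()
    F = np.fft.fft2(f)
    u = np.fft.fftfreq(f.shape[0])[:, None]
    v = np.fft.fftfreq(f.shape[1])[None, :]
    q = np.hypot(u, v)
    q[0, 0] = 1.0                              # avoid division by zero at DC
    r1 = np.real(np.fft.ifft2(F * (-1j) * u / q))
    r2 = np.real(np.fft.ifft2(F * (-1j) * v / q))
    return np.angle(r1 + 1j * r2)              # pseudophase in (-pi, pi]

# Synthetic speckle-like pattern: low-pass filtered random field, squared modulus
rng = np.random.default_rng(5)
n = 128
k = np.hypot(*np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij"))
field = np.fft.ifft2(np.fft.fft2(rng.standard_normal((n, n))) * (k < 0.1))
speckle = np.abs(field) ** 2
phase = riesz_pseudophase(speckle)
print(phase.shape)
```

Phase vortices appear where both Riesz components vanish simultaneously; in the metrology scheme, homologous vortices in the two pseudophase maps are what get tracked.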

  6. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of the interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques, and is suitable for large-scale application to paleo-data.
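    The Gaussian-kernel correlation estimator described above can be sketched directly: products x_i·x_j are weighted by a Gaussian kernel of the mismatch between the desired lag and the actual pairwise time difference, avoiding interpolation entirely. The test series below is synthetic.

```python
import numpy as np

def kernel_acf(t, x, lags, h):
    """Gaussian-kernel ACF estimator for irregularly sampled series:
    weight products x_i*x_j by exp(-(lag - dt_ij)^2 / (2 h^2))."""
    x = (x - x.mean()) / x.std()
    dt = t[None, :] - t[:, None]          # pairwise time differences
    prod = np.outer(x, x)
    acf = []
    for lag in lags:
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
        acf.append(np.sum(w * prod) / np.sum(w))
    return np.array(acf)

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 100, 400))     # irregular sampling times
x = np.sin(2 * np.pi * t / 10) + 0.2 * rng.standard_normal(t.size)
lags = np.array([0.0, 5.0, 10.0])
r = kernel_acf(t, x, lags, h=0.5)
print(r.round(2))
```

For the period-10 signal the estimate is near 1 at lag 0, strongly negative at the half period and positive again at the full period. The kernel width h trades variance against the smoothing of high-frequency ACF structure, mirroring the paper's discussion.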

  7. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  8. Stalked protozoa identification by image analysis and multivariable statistical techniques

    OpenAIRE

    Amaral, A.L.; Ginoris, Y. P.; Nicolau, Ana; M.A.Z. Coelho; Ferreira, E. C.

    2008-01-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups and several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determinin...

  9. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real Integral-Field Spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of two. Our analysis reveals that the algorithm prioritizes conservation of all the statistically-significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, BATMAN is not intended to be used as a `black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially-resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
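The merge criterion quoted above ("identical signal within the errors") can be illustrated with a toy one-dimensional version. This is a hedged sketch, not the published BATMAN code, which operates on 2-D Integral-Field Spectroscopic data:

```python
import math

def merge_segments(values, errors, k=1.0):
    """Greedy 1-D sketch of error-aware segmentation: neighbouring elements
    are merged while their mean signals agree within k combined sigmas.
    Illustrative only -- the real algorithm tessellates 2-D spatial data."""
    segs = [([v], [e]) for v, e in zip(values, errors)]
    merged = True
    while merged and len(segs) > 1:
        merged = False
        out = [segs[0]]
        for vals, errs in segs[1:]:
            pvals, perrs = out[-1]
            m1, m2 = sum(pvals) / len(pvals), sum(vals) / len(vals)
            # standard errors of the two segment means
            s1 = math.sqrt(sum(e * e for e in perrs)) / len(perrs)
            s2 = math.sqrt(sum(e * e for e in errs)) / len(errs)
            if abs(m1 - m2) <= k * math.hypot(s1, s2):
                out[-1] = (pvals + vals, perrs + errs)  # merge into predecessor
                merged = True
            else:
                out.append((vals, errs))
        segs = out
    return [sum(v) / len(v) for v, _ in segs]
```

On `[1.0, 1.1, 5.0, 5.2]` with uniform errors of 0.2, the sketch fuses the two statistically compatible pairs and keeps the jump between them, mirroring how BATMAN adapts the tessellation to the underlying structure.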

  10. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

The aim of this paper is to introduce a new image analysis program, "Nanoannotator", developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program when analyzing nanoparticles, and at the same time compares it to more conventional nanoparticle analysis techniques. The techniques we concentrate on here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information on the particles than the manual technique. However, particle shapes that are very different from spherical proved problematic for the novel program as well. Compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the average data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample.
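For roughly spherical particles, the morphological output of such image analysis tools commonly reduces to an area-equivalent diameter computed from the segmented pixel area. The formula below is a generic convention, assumed here for illustration rather than taken from Nanoannotator's documentation:

```python
import math

def equivalent_diameter_nm(pixel_area, nm_per_pixel):
    """Area-equivalent circular diameter of one segmented particle:
    the diameter of the circle whose area matches the particle's pixel
    area, scaled by the image calibration (nm per pixel)."""
    return 2.0 * math.sqrt(pixel_area / math.pi) * nm_per_pixel
```

A particle covering an area of 25*pi square pixels at 1 nm/pixel maps back to a 10 nm equivalent diameter, which is one way a per-particle size distribution is built from TEM segmentations.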

  11. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
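The maximal Lyapunov exponent mentioned above can be illustrated on the simplest possible system, the one-dimensional logistic map, where it is just the orbit average of log|f'(x)|. This is a pedagogical stand-in, not the finite-time gait-analysis estimator (e.g. Rosenstein's method) used on movement data:

```python
import math

def lyapunov_logistic(r, x0=0.1, n=10000, burn=100):
    """Maximal Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| with f'(x) = r*(1-2x).
    A positive value indicates exponential divergence of nearby orbits."""
    x = x0
    for _ in range(burn):            # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        acc += math.log(abs(r * (1 - 2 * x)))
    return acc / n
```

At r = 4 the map is fully chaotic and the exponent converges to ln 2 (about 0.693), the kind of positive value that, in movement data, would flag local dynamic instability.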

  12. Techniques of DNA methylation analysis with nutritional applications.

    Science.gov (United States)

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from single nucleotide to global measurement depending on the study goal and scope. In addition, this study highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments concerning epigenetic technologies are showing promising results of DNA methylation levels at a single-base resolution and provide the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profile from microarray or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the best method fitting for their nutritional research interests.
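Whatever the assay, the methylation level at a locus is typically summarised as the fraction of methylated signal. The sketch below uses an Illumina-style beta value with a stabilising offset; this particular convention is an assumption for illustration, not something prescribed by the review:

```python
def beta_value(meth_signal, unmeth_signal, alpha=100.0):
    """Methylation level at one locus as the fraction of methylated
    signal: M / (M + U + alpha). The offset alpha damps noise on
    low-intensity probes (Illumina-style convention, assumed here)."""
    return meth_signal / (meth_signal + unmeth_signal + alpha)
```

Values near 0 indicate an unmethylated locus and values near 1 a fully methylated one, which is the quantity nutritional studies then relate to diet or phenotype.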

  13. A comparison between active and passive techniques for measurements of radon emanation factors

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Coto, I. [Dept. Fisica Aplicada, University of Huelva, Huelva (Spain)], E-mail: Israel.lopez@dfa.uhu.es; Mas, J.L. [Dept. de Fisica Aplicada I, E.U.P., University of Seville, Seville (Spain); San Miguel, E.G.; Bolivar, J.P. [Dept. Fisica Aplicada, University of Huelva, Huelva (Spain); Sengupta, D. [Department of Geology and Geophysics, I.I.T. Kharagpur, West Bengal (India)

    2009-05-15

Several radon-related parameters have been determined through two different techniques (passive and active) in soil and phosphogypsum samples. Emanation factors determined through these techniques show good agreement for the soil samples, while large discrepancies appear for the phosphogypsum samples. In this paper, these discrepancies are analyzed and explained by taking into account non-controlled radon leakages in the passive technique.
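The leakage explanation can be made concrete with the closed-chamber radon balance: at equilibrium, emanated production E·A(Ra) equals removal (λ + leak rate)·N, so a measurement that ignores leakage biases the inferred emanation factor low. The sketch below assumes a first-order leak term; it is an illustration of the balance, not the paper's exact model:

```python
import math

RN222_LAMBDA = math.log(2) / (3.8235 * 24 * 3600)  # 222Rn decay constant (1/s)

def emanation_factor(c_eq, ra_activity, volume, leak_rate=0.0):
    """Emanation factor inferred from the equilibrium radon concentration
    c_eq (Bq/m^3) in a closed chamber of the given volume (m^3) holding a
    sample of known 226Ra activity (Bq). Equilibrium balance:
    E * A_Ra = (lambda + leak_rate) * N, with chamber activity lambda*N."""
    lam_eff = RN222_LAMBDA + leak_rate
    return c_eq * volume * lam_eff / (RN222_LAMBDA * ra_activity)
```

For the same measured equilibrium concentration, a leak rate equal to the radon decay constant doubles the inferred emanation factor, which is the direction of the passive-technique discrepancy described above.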

  14. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

Cuesta, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  15. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H. [Center for Experimental Nuclear Physics and Astrophysics, and Department of Physics, University of Washington, Seattle, WA (United States); Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L. [Pacific Northwest National Laboratory, Richland, WA (United States); Avignone, F. T. [Department of Physics and Astronomy, University of South Carolina, Columbia, SC (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States); Baldenegro-Barrera, C. X.; Bertrand, F. E. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2015-08-17

The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in {sup 76}Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR’s germanium detectors allows for significant reduction of gamma background.

  16. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers seeking to take undue advantage: the Android platform allows developers to freely access and modify source code, but this openness also raises security concerns, and a user may well download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various exploit mitigation techniques for known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction for making the Android operating system more secure.

  17. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  18. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  19. Adolescent baseball pitching technique: lower extremity biomechanical analysis.

    Science.gov (United States)

    Milewski, Matthew D; Õunpuu, Sylvia; Solomito, Matthew; Westwell, Melany; Nissen, Carl W

    2012-11-01

Documentation of the lower extremity motion patterns of adolescent pitchers is an important part of understanding the pitching motion and the implication of lower extremity technique for upper extremity loads, injury and performance. The purpose of this study was to take the initial step in this process by documenting the biomechanics of the lower extremities during the pitching cycle in adolescent pitchers and comparing these findings with the published data for older pitchers. Three-dimensional motion analysis using a comprehensive lower extremity model was used to evaluate the fast ball pitch technique in adolescent pitchers. Thirty-two pitchers with a mean age of 12.4 years (range 10.5-14.7 years) and at least 2 years of experience were included in this study. The pitchers showed a mean of 49 ± 12° of knee flexion of the lead leg at foot contact. They tended to maintain this position through ball release, and then extended the knee during the follow-through phase (ball release to maximal internal glenohumeral rotation). The lead leg hip rapidly progressed into adduction and flexion during the arm cocking phase, with a range of motion of 40 ± 10° adduction and 30 ± 13° flexion. The lead hip mean peak adduction velocity was 434 ± 83°/s and flexion velocity was 456 ± 156°/s. Simultaneously, the trailing leg hip rapidly extended, approaching a mean peak extension of -8 ± 5° at 39% of the pitch cycle, which is close to passive range of motion constraints. Peak hip abduction of the trailing leg at foot contact was -31 ± 12°, which also approached passive range of motion constraints. Differences and similarities were also noted between the lower extremity kinematics of adolescent and adult pitchers; however, a more comprehensive analysis using similar methods is needed for a complete comparison.
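Peak joint velocities such as those quoted above are generically obtained by numerically differentiating the sampled joint-angle curves from motion capture. The finite-difference sketch below illustrates that step; it is not the study's actual processing pipeline:

```python
def peak_angular_velocity(angles_deg, dt):
    """Peak |d(theta)/dt| in deg/s from uniformly sampled joint angles,
    using central differences over the interior samples."""
    rates = [(angles_deg[i + 1] - angles_deg[i - 1]) / (2.0 * dt)
             for i in range(1, len(angles_deg) - 1)]
    return max(abs(r) for r in rates)
```

Applied to a hip adduction angle curve sampled at a typical motion-capture rate, this returns the kind of peak velocity (hundreds of deg/s) reported for the lead hip.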

  20. Techniques for active embodiment of participants in virtual environments

    Energy Technology Data Exchange (ETDEWEB)

    Hightower, R.; Stansfield, S.

    1996-03-01

This paper presents preliminary work in the development of an avatar driver. An avatar is the graphical embodiment of a user in a virtual world. In applications such as small-team, close-quarters training and mission planning and rehearsal, it is important that the user's avatar reproduce his or her motions naturally and with high fidelity. This paper presents a set of special-purpose algorithms for driving the motion of the avatar with minimal information about the posture and position of the user. These algorithms utilize information about natural human motion and posture to produce solutions quickly and accurately without the need for complex general-purpose kinematics algorithms. Several examples illustrating the successful application of these techniques are included.
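The kind of cheap, special-purpose solution favoured here over general-purpose kinematics solvers can be illustrated with closed-form planar two-link inverse kinematics, e.g. recovering shoulder and elbow angles from a single tracked hand position. This is an illustrative stand-in, not the authors' algorithm:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form planar 2-link IK: return (shoulder, elbow) angles in
    radians so that a chain with segment lengths l1, l2 reaches (x, y).
    Uses the law of cosines; clamps for unreachable targets."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))          # clamp numerical overshoot
    elbow = math.acos(c2)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Evaluating the forward kinematics with the returned angles reproduces the target point, and the whole computation is a handful of arithmetic operations per frame, which is why such closed-form solutions suit real-time avatar driving.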

  1. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. 
If inferences are to be made concerning food texture from acoustical measures of mastication

  2. The Effectiveness of Active and Traditional Teaching Techniques in the Orthopedic Assessment Laboratory

    Science.gov (United States)

    Nottingham, Sara; Verscheure, Susan

    2010-01-01

Active learning is a teaching methodology with a focus on student-centered learning that engages students in the educational process. This study implemented active learning techniques in an orthopedic assessment laboratory and examined the effects of these teaching techniques. Mean scores from written exams, practical exams, and final course evaluations…

  3. Successful Application of Active Learning Techniques to Introductory Microbiology

    Directory of Open Access Journals (Sweden)

    Elizabeth A. Hoffman

    2009-12-01

    Full Text Available While the traditional lecture format may be a successful way to teach microbiology to both medical and nursing students, it was not an effective means of learning for many prenursing and preprofessional students enrolled in either of the introductory microbiology courses at Ashland Community College, an open enrollment institution. The structure of both Medical Microbiology and Principles of Microbiology was redesigned to allow students to address the material in an active manner. Daily quizzes, student group discussions, scrapbooks, lab project presentations and papers, and extra credit projects were all added in order to allow students maximum exposure to the course material in a manner compatible with various methods of learning. Student knowledge, course evaluations, and student success rates have all improved with the active learning format.

  4. New trends in the development of "active correlations" technique

    Science.gov (United States)

    Tsyganov, Yu. S.

    2016-09-01

With heavy-ion beam intensities reaching extremely high values, new requirements for the detection system of the Dubna Gas-Filled Recoil Separator (DGFRS) will definitely be set. One of the challenges is how to apply the "active correlations" method [1-6] to suppress beam-associated background products without significant loss of efficiency over the whole long-term experiment. Different scenarios and equations for developing the method to meet this requirement are considered in the present paper.

  5. Aerial monitoring in active mud volcano by UAV technique

    Science.gov (United States)

    Pisciotta, Antonino; Capasso, Giorgio; Madonia, Paolo

    2016-04-01

UAV photogrammetry opens up new applications in the close-range domain, combining aerial and terrestrial photogrammetry, and also introduces low-cost alternatives to classical manned aerial photogrammetry. Between 2014 and 2015, three aerial surveys were carried out. Using a quadrotor drone equipped with a compact camera, it was possible to generate high-resolution elevation models and orthoimages of the "Salinelle", an active mud-volcano area located in the territory of Paternò (southern Italy). The main risks are related to the damage produced by paroxysmal events. Mud volcanoes show different cyclic phases of activity, including catastrophic events and periods of relative quiescence characterized by moderate activity. The ejected material is often a mud slurry of fine solids suspended in liquids, which may include water and hydrocarbon fluids; the bulk of the released gases is carbon dioxide, with some methane and nitrogen. The vents are usually pond-shaped and of variable dimension (from centimeters to meters in diameter). The scope of the presented work is the performance evaluation of a UAV system built to rapidly and autonomously acquire mobile three-dimensional (3D) mapping data in a volcanic monitoring scenario.

  6. Ionospheric Behaviour Analysis over Thailand Using Radio Occultation Technique.

    Directory of Open Access Journals (Sweden)

    Ahmed Wasiu Akande

    2015-11-01

Full Text Available With advances in space and atmospheric science, and in order to obtain accurate results, the radio occultation technique is used here to investigate the electron density and Total Electron Content (TEC) present in the equatorial region, particularly over Thailand. In this research, radio occultation data obtained from UCAR/CDAAC were used to observe daily, monthly, seasonal, and full-year 2013 ionospheric TEC and electron density variations caused by changes and instability in solar activity. It was observed that TEC was high (the ionosphere was more disturbed) in May and spread over a wide range of altitudes, and that summer had the highest TEC value for 2013, meaning that GNSS measurements were more prone to error during this period. Ionospheric variations and fluctuations were greatest between 200 km and 450 km altitude. The results of the study show that ionospheric perturbation effects and irregularities depend on season and solar activity.

  7. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; capillaries with smaller outer diameters can be used to form the imaging probe for finer images. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product need not have optical properties different from the substrate. UV absorption detection allows almost universal detection of organic molecules; thus, no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the light intensity change over time on an enzyme spot gives information on the reaction rate. The same microarray can be used many times; thus, high-throughput kinetic studies of hundreds of catalytic reactions are made possible. At last, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
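The rate extraction described above, analyzing the light intensity change over time on an enzyme spot, amounts to fitting a slope to the intensity-versus-time series. A generic least-squares sketch (not the dissertation's analysis code):

```python
def reaction_rate(times, intensities):
    """Least-squares slope of spot intensity versus time -- a simple
    proxy for the initial reaction rate of one CCD-monitored enzyme spot."""
    n = len(times)
    mt = sum(times) / n
    mi = sum(intensities) / n
    num = sum((t - mt) * (y - mi) for t, y in zip(times, intensities))
    den = sum((t - mt) ** 2 for t in times)
    return num / den
```

Running this per spot over the array gives one rate per catalyst, which is what makes hundreds of parallel kinetic measurements tractable.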

  8. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  9. Pattern recognition software and techniques for biological image analysis.

    Science.gov (United States)

    Shamir, Lior; Delaney, John D; Orlov, Nikita; Eckley, D Mark; Goldberg, Ilya G

    2010-11-24

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.
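The train-rather-than-tune idea described above can be reduced to its simplest form: learn per-class feature summaries from labelled examples, then assign new samples by similarity. The nearest-centroid sketch below illustrates the principle only; real pattern-recognition pipelines such as those surveyed use far richer image features and classifiers:

```python
def nearest_centroid_classify(train_features, train_labels, sample):
    """Learn a centroid per class from labelled feature vectors, then
    return the label of the centroid closest (squared Euclidean) to
    the given sample."""
    sums, counts = {}, {}
    for feats, label in zip(train_features, train_labels):
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    best_label, best_d2 = None, float("inf")
    for label, acc in sums.items():
        centroid = [s / counts[label] for s in acc]
        d2 = sum((a - b) ** 2 for a, b in zip(centroid, sample))
        if d2 < best_d2:
            best_label, best_d2 = label, d2
    return best_label
```

No per-task algorithm or parameter tuning appears anywhere: changing the imaging assay only changes the training examples, which is precisely the generality the pattern-recognition approach promises.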

  10. Efficient geometric rectification techniques for spectral analysis algorithm

    Science.gov (United States)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data in near real time throughput rates by trading the image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented in the iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known to be the fanshape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fanshape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.
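The two one-dimensional resampling steps can be sketched as below, with `range_coord` and `azimuth_coord` standing in for the actual curved SAR coordinate model (assumed monotone along each row/column). This is a hedged sketch of the two-pass idea, not the authors' algorithm:

```python
def lerp_sample(coords, vals, x):
    """Linear interpolation of vals sampled at increasing coords;
    clamps outside the covered interval."""
    if x <= coords[0]:
        return vals[0]
    for i in range(1, len(coords)):
        if x <= coords[i]:
            t = (x - coords[i - 1]) / (coords[i] - coords[i - 1])
            return vals[i - 1] * (1.0 - t) + vals[i] * t
    return vals[-1]

def rectify(img, range_coord, azimuth_coord, out_rows, out_cols):
    """Two-pass rectification sketch: pass 1 resamples each row from its
    curved iso-range coordinates onto a uniform range grid; pass 2
    resamples each resulting column onto a uniform azimuth grid."""
    # pass 1: cross-track (range) resampling, one row at a time
    pass1 = []
    for r, row in enumerate(img):
        coords = [range_coord(r, c) for c in range(len(row))]
        pass1.append([lerp_sample(coords, row, x) for x in out_cols])
    # pass 2: along-track (azimuth) resampling, one column at a time
    az = [azimuth_coord(r) for r in range(len(img))]
    out = []
    for y in out_rows:
        out.append([lerp_sample(az, [pass1[r][c] for r in range(len(img))], y)
                    for c in range(len(out_cols))])
    return out
```

Separating the 2-D warp into two 1-D passes is what keeps the operation cheap enough for near-real-time throughput; with identity coordinate models the image passes through unchanged.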

  11. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. It is a forum for the exchange of ideas among these fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. Fourteen invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop.

  12. Experimental techniques for screening of antiosteoporotic activity in postmenopausal osteoporosis.

    Science.gov (United States)

    Satpathy, Swaha; Patra, Arjun; Ahirwar, Bharti

    2015-12-01

    Postmenopausal osteoporosis, a silent epidemic, has become a major health hazard, afflicting about 50% of postmenopausal women worldwide, and is thought to be a disease with one of the highest incidences in the elderly. It is a chronic, progressive condition associated with micro-architectural deterioration of bone tissue that results in low bone mass and decreased bone strength, predisposing to an increased risk of fracture. Women are more likely to develop osteoporosis than men because of the reduction in estrogen during menopause, which leads to a decline in bone formation and an increase in bone resorption. Estrogen is able to suppress the production of proinflammatory cytokines such as interleukin (IL)-1, IL-6, IL-7 and tumor necrosis factor (TNF-α), which is why these cytokines are elevated in postmenopausal women. In this review article we have attempted to collate the various methods and parameters most frequently used for screening of antiosteoporotic activity in postmenopausal osteoporosis. Among animal models, the ovariectomized model is the most appropriate for studying the efficacy of different drugs in preventing bone loss in postmenopausal osteoporosis.

  13. Smart actuators: a novel technique for active damping

    Science.gov (United States)

    Muth, Michael; Moldovan, Klaus; Goetz, Bernt

    1995-05-01

    Sensors are important components for any automatic process. Their function is to measure physical variables and thus to allow automatic actions in a technical process, for example in a manufacturing sequence or a measurement. When selecting a sensor for a process, it is often overlooked that the actuators used in the process also have sensory properties. The reactions of actuators to the state of a process make it possible to extract relevant information from the process through the actuators themselves. By using the sensory properties of actuators, the cost of additional sensors can be saved. Even more important, under some circumstances it may not be possible to place a dedicated sensor directly at the location of interest: in that case the information about the physical variable is only accessible by analyzing the return signal of the actuator. An example of such a smart actuator, combining active and sensory properties, is demonstrated in a simple experiment featuring a steel ball suspended as a pendulum. The steel ball can be pushed off, and on swinging back it can be caught in a single pass without any bounce. The actuator uses the piezoelectric effect, which shows the underlying principle most clearly: the application of the reversibility of physical effects. In this case mechanical energy can either be produced or absorbed. The experiment is meant as a demonstration model for students. It is also used for preliminary investigations in developing a fast, actively damped tipping mechanism (optical scanner).

  14. Cooperative Experimental System Development - cooperative techniques beyond initial design and analysis

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1995-01-01

    …/design to targeted object oriented design, specification, and realisation; and design for tailorability. The emerging CESD approach is based on several years of experience in applying cooperative analysis and design techniques in projects developing general, tailorable software products. The CESD approach is… …be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques—including the Scandinavian Cooperative Design (CD) approaches—seldom go beyond the early analysis/design activities of development projects. In contrast, the CESD approach is characterized by its focus on: active user involvement throughout the entire development process; prototyping experiments closely coupled to work-situations and use-scenarios; transforming results from early cooperative analysis…

  15. Emerging techniques for soil analysis via mid-infrared spectroscopy

    Science.gov (United States)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, in particular:

    1. Attenuated total reflectance (ATR). Attenuated total reflectance is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range, as well as the absorbance of some soil constituents (e.g., calcium carbonate), interferes with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species.

    2. Photo-acoustic spectroscopy. Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are
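    The few-micron penetration depth mentioned above follows from the standard evanescent-wave relation d_p = λ / (2π n₁ √(sin²θ − (n₂/n₁)²)). The crystal material, sample refractive index, and incidence angle below are assumed illustrative values, not ones taken from the abstract.

```python
import math

# Evanescent-wave penetration depth for an ATR measurement.
wavelength = 10.0e-6          # m (≈1000 cm⁻¹, mid-IR); illustrative
n_crystal = 2.4               # high-index ATR crystal, e.g. ZnSe (assumed)
n_sample = 1.4                # wet soil paste (assumed)
theta = math.radians(45.0)    # angle of incidence at the interface (assumed)

d_p = wavelength / (
    2 * math.pi * n_crystal
    * math.sqrt(math.sin(theta) ** 2 - (n_sample / n_crystal) ** 2)
)
# d_p comes out on the order of a micron or two, consistent with the
# "few microns" quoted in the abstract.
```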

  16. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    Science.gov (United States)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activity. One prominent approach to drift measurement relies on instrumentation, e.g. an ionosonde. Drift estimation with an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices, and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even over Europe, is not dense enough to produce a global or continental drift map. To overcome these difficulties, we propose a technique for ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. The result is a close-to-real electron density profile in which ray tracing can be performed. These profiles can be constructed periodically, with a period as low as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate drift over any coordinate of interest. We test our technique by comparing the results to drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
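    The Doppler-from-path-length idea can be sketched numerically: given ray-traced propagation path lengths from two consecutive snapshots, the Doppler shift follows from the path-length rate, f_D = −(1/λ)·dL/dt. All numbers below (sounding frequency, path lengths, snapshot spacing) are invented for illustration.

```python
# Doppler shift from the change in a ray-traced propagation path length.
C = 299_792_458.0            # speed of light, m/s
f_carrier = 5.0e6            # hypothetical sounding frequency, Hz
wavelength = C / f_carrier   # ≈ 60 m

# One-way path lengths from two consecutive snapshots, 30 s apart
# (illustrative values only).
L1 = 450_000.0   # m
L2 = 450_012.0   # m
dt = 30.0        # s

dL_dt = (L2 - L1) / dt            # path-length rate, m/s
f_doppler = -dL_dt / wavelength   # Doppler shift, Hz (negative: path lengthening)
```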

  17. Monolithic active pixel radiation detector with shielding techniques

    Energy Technology Data Exchange (ETDEWEB)

    Deptuch, Grzegorz W.

    2016-09-06

    A monolithic active pixel radiation detector, including a method of fabricating it. The disclosed radiation detector can include a substrate comprising a silicon layer upon which electronics are configured. A plurality of channels can be formed on the silicon layer, wherein the channels are connected to sources of signals located in a bulk part of the substrate, and wherein the signals flow through electrically conducting vias established in an isolation oxide on the substrate. One or more nested wells can be configured in the substrate, wherein the nested wells assist in collecting charge carriers released in interaction with radiation and further separate the electronics from the sensing portion of the detector substrate. The detector can also be configured according to a thick SOA method of fabrication.

  18. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. We introduce an emerging interferometric technique for measuring motion across a two-dimensional image, which could be called a snapshot 2d-VISAR. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs. time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (≈0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.
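    The fringe-to-velocity conversion behind any VISAR can be sketched from the standard velocity-per-fringe relation, VPF = λ/(2τ), here deliberately neglecting the window and dispersion corrections used in practice. The probe wavelength and fringe count below are assumed values; only the 0.1 ns delay comes from the abstract.

```python
# Velocity per fringe for an idealized VISAR (no dispersion/window correction).
wavelength = 532e-9   # m, assumed probe wavelength
tau = 0.1e-9          # s, interferometer delay of the order quoted above

vpf = wavelength / (2 * tau)    # m/s of target velocity per fringe of shift
fringe_shift = 1.5              # measured fringe shift at one pixel (invented)
velocity = fringe_shift * vpf   # inferred surface velocity at that pixel, m/s
```

    In the 2d case this conversion is applied pixel-by-pixel to the extracted fringe-shift map, turning the interferogram into a velocity map.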

  19. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: A technique that helps management to reduce costs and improve quality is 'lean supply chain management', which focuses on the elimination of all wastes at every stage of the supply chain and is derived from 'agile production'. This research aims to assess and rank the suppliers in an auto industry, based upon the concept of 'production leanness'. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We examined the literature about leanness, and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA) technique.

    AFRIKAANSE OPSOMMING (in translation): Lean supply chain management is a technique that enables management to reduce costs and improve quality. It focuses on reducing waste at every stage of the supply chain and is derived from agile production. This research aims to evaluate suppliers in an auto industry using the concept of production leanness. The research focuses on the suppliers of a company called Touse-Omron Naein. A literature study on leanness led to the classification of criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the PCA technique.
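    A minimal sketch of PCA-based ranking, with invented supplier scores standing in for the questionnaire data: suppliers are standardized, projected onto the first principal component, and ordered by the resulting score. The number of suppliers and dimensions below are placeholders, not the study's 76 factors.

```python
import numpy as np

# Invented questionnaire scores: 8 suppliers rated on 10 leanness dimensions.
rng = np.random.default_rng(1)
scores = rng.uniform(1, 5, size=(8, 10))

# Standardize each dimension, then take the first principal component
# via the eigendecomposition of the correlation-like covariance matrix.
Z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = eigvecs[:, -1]                        # eigenvector of the largest eigenvalue

leanness_index = Z @ pc1                    # composite leanness score per supplier
ranking = np.argsort(leanness_index)[::-1]  # supplier indices, best-first
```

    One caveat: the sign of a principal component is arbitrary, so in practice the component is oriented so that higher scores correspond to leaner suppliers before ranking.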

  20. Study of organohalogens in yogurt and apple by neutron activation analysis and related techniques

    Institute of Scientific and Technical Information of China (English)

    Zhang, Hong; Chai, Zhifang; Sun, Huibin

    2008-01-01

    Twenty brands of Chinese commercial yogurt and nine different kinds of apple, collected randomly from supermarkets in Beijing and Shenzhen, China, were analyzed by instrumental neutron activation analysis (INAA) combined with gas chromatography (GC) and chemical separation methods for total halogens, extractable organohalogens (EOX), extractable persistent organohalogens (EPOX) and identified organochlorines. The INAA detection limits are 50 ng, 8 ng and 3.5 ng for Cl, Br and I, respectively. The extractable organochlorines (EOCl) accounted for 0.005% to 0.043% of the total chlorine in yogurt and 1.6% to 5.1% in apple. About 24% of the EOCl in yogurt remained undecomposed as extractable persistent organochlorines (EPOCl) after treatment with concentrated sulfuric acid, and about 34% in apple. These results indicate that chlorine in the two selected foodstuffs mainly existed as inorganic species and non-extractable organochlorines, and that most EOCl in yogurt and apple were acid-labile or acid-soluble fractions. The ratios of identified organochlorines to total EPOCl were 0.7% to 13.1% and 0.5% to 6.2% in yogurt and apple samples, respectively, implying that a major portion of the EPOCl measured in yogurt and apple consists of unknown compounds that cannot yet be identified by current gas chromatography techniques.

  1. Activated mechanisms in proteins: a multiple-temperature activation-relaxation technique study

    Science.gov (United States)

    Malek, Rachid; Mousseau, Normand; Derreumaux, Philippe

    2001-03-01

    The low-temperature dynamics of proteins is controlled by a complex activated dynamics taking place over long time-scales compared with the period of thermal oscillations. In view of the range of relevant time scales, the numerical study of these processes remains a challenge and numerous methods have been introduced to address this problem. We introduce here a mixture of two algorithms, the activation-relaxation technique (ART)^1,2 coupled with the parallel tempering method, and use it to study the structure of the energy landscape around the native state of a 38-residue polypeptide. While ART samples rapidly the local energy landscape, the parallel tempering, which sets up exchanges of configuration between simultaneous runs at multiple temperatures, generates a very efficient sampling of energy basins separated by high barriers^(3). Results show the nature of the barriers and local minima surrounding the native state of this 38-residue peptide, modeled with off-lattice OPEP-like interactions^4. (1) G.T. Barkema and N. Mousseau, PRL 77, 4358 (1996) (2) N. Mousseau and G.T. Barkema, PRE 57, 2419 (1998) (3) E. Marinari and G. Parisi, Europhys. Lett., 19 (6), 451 (1992) (4) Ph. Derreumaux, J. Chem. Phys. 111, 2301 (1999); PRB 85, 206 (2000)
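    The parallel-tempering ingredient can be illustrated on a toy landscape: a one-dimensional double well stands in for the protein energy surface, simple Metropolis steps stand in for ART moves, and adjacent replicas exchange configurations with the standard acceptance rule p = min(1, exp[(β_i − β_j)(E_i − E_j)]). Everything here (potential, temperatures, step sizes) is invented for illustration.

```python
import math
import random

def energy(x):
    return (x * x - 1.0) ** 2          # double well with minima at x = ±1

random.seed(42)
betas = [4.0, 2.0, 1.0, 0.5]           # inverse temperatures, coldest first
xs = [1.0 for _ in betas]              # one configuration per replica

for step in range(2000):
    # Local Metropolis move in each replica (stand-in for an ART event).
    for k, beta in enumerate(betas):
        trial = xs[k] + random.uniform(-0.3, 0.3)
        dE = energy(trial) - energy(xs[k])
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            xs[k] = trial
    # Attempt one swap between a random adjacent temperature pair.
    k = random.randrange(len(betas) - 1)
    d = (betas[k] - betas[k + 1]) * (energy(xs[k]) - energy(xs[k + 1]))
    if d >= 0 or random.random() < math.exp(d):
        xs[k], xs[k + 1] = xs[k + 1], xs[k]

final_energies = [energy(x) for x in xs]
```

    The swaps let configurations that crossed a barrier at high temperature percolate down to the cold replicas, which is what makes the combined sampler efficient on landscapes with high barriers.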

  2. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge of the origin of the constituent materials, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even though the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged, respectively, to a child (1 individual, medium level of preservation, 9 months ± 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by X-ray fluorescence (XRF) with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 µm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  3. Effects on hamstring muscle extensibility, muscle activity, and balance of different stretching techniques.

    Science.gov (United States)

    Lim, Kyoung-Il; Nam, Hyung-Chun; Jung, Kyoung-Sim

    2014-02-01

    [Purpose] The purpose of this study was to investigate the effects of two different stretching techniques on range of motion (ROM), muscle activation, and balance. [Subjects] For the present study, 48 adults with hamstring muscle tightness were recruited and randomly divided into three groups: a static stretching group (n=16), a PNF stretching group (n=16), and a control group (n=16). [Methods] Each stretching technique was applied to the hamstring once. Active knee extension angle, muscle activation during maximum voluntary isometric contraction (MVC), and static balance were measured before and after the application of each stretching technique. [Results] Both the static stretching and the PNF stretching groups showed significant increases in knee extension angle compared to the control group. However, there were no significant differences in muscle activation or balance between the groups. [Conclusion] Static stretching and PNF stretching techniques improved ROM without a decrease in muscle activation, but neither exerted statistically significant effects on balance.

  4. Sentiment Analysis of Twitter tweets using supervised classification technique

    Directory of Open Access Journals (Sweden)

    Pranav Waykar

    2016-05-01

    Full Text Available Making use of social media to analyze the perceptions of the masses of a product, event or person has gained momentum in recent times. Out of a wide array of social networks, we chose Twitter for our analysis, as the opinions expressed there are concise and bear a distinctive polarity. Here, we collect the most recent tweets on a user's area of interest and analyze them. The extracted tweets are then segregated as positive, negative, or neutral. We perform the classification in the following manner: we collect the tweets using the Twitter API; we then process the collected tweets, converting all letters to lowercase, eliminating special characters, etc., which makes the classification more efficient; and the processed tweets are classified using a supervised classification technique. We use a Naive Bayes classifier, trained on a set of sample tweets, to segregate the tweets as positive, negative, or neutral. The percentage of tweets in each category is then computed and the result is represented graphically. The result can be used to gain an insight into the views of the people using Twitter about a particular topic searched by the user. It can help corporate houses devise strategies on the basis of the popularity of their product among the masses, and it may help consumers make informed choices based on the general sentiment expressed by Twitter users about a product.
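    A minimal Naive Bayes sketch of the classification step, assuming a tiny hand-made training set; the tweets and labels below are invented, and a real pipeline would pull data from the Twitter API and train on a much larger labeled corpus.

```python
import math
from collections import Counter, defaultdict

# Invented training tweets with sentiment labels.
train = [
    ("i love this phone great battery", "positive"),
    ("awesome camera love it", "positive"),
    ("terrible service i hate it", "negative"),
    ("worst phone ever hate the battery", "negative"),
    ("it arrived on time", "neutral"),
    ("the package contains a phone", "neutral"),
]

class_docs = defaultdict(int)          # documents per class
word_counts = defaultdict(Counter)     # word frequencies per class
vocab = set()
for text, label in train:
    class_docs[label] += 1
    for w in text.lower().split():
        word_counts[label][w] += 1
        vocab.add(w)

def classify(text):
    """Pick the class with the highest log-posterior (Laplace smoothing)."""
    words = text.lower().split()
    best, best_lp = None, -math.inf
    for label in class_docs:
        lp = math.log(class_docs[label] / len(train))       # log prior
        total = sum(word_counts[label].values())
        for w in words:
            lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

pred = classify("i love the battery")   # → "positive"
```

    Preprocessing (lowercasing, stripping special characters) happens before counting, exactly as described in the abstract; the classifier itself only multiplies per-word likelihoods under the bag-of-words independence assumption.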

  5. An evaluation of wind turbine blade cross section analysis techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Paquette, Joshua A.; Griffith, Daniel Todd; Laird, Daniel L.; Resor, Brian Ray

    2010-03-01

    The blades of a modern wind turbine are critical components, central to capturing and transmitting most of the load experienced by the system. They are complex structural items composed of many layers of fiber-and-resin composite material and, typically, one or more shear webs. Large turbine blades being developed today are beyond the point where the trial-and-error design of the past is effective, and design for reliability is extremely important. Section analysis tools are used to reduce the three-dimensional continuum blade structure to a simpler beam representation for use in system response calculations to support full system design and certification. One model simplification approach is to analyze the two-dimensional blade cross sections to determine the properties for the beam. Another technique is to determine beam properties using static deflections of a full three-dimensional finite element model of a blade. This paper provides insight into discrepancies observed in the outputs of each approach. Simple two-dimensional geometries and three-dimensional blade models are analyzed in this investigation. Finally, a subset of computational and experimental section properties for a full turbine blade are compared.
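    The deflection-based approach can be illustrated with the textbook cantilever relation δ = FL³/(3EI): apply a known tip load to the 3-D model, read off the computed deflection, and back out an equivalent bending stiffness. The load, length, and deflection below are invented placeholders, and a real blade analysis would use distributed loads and spanwise-varying properties.

```python
# Equivalent bending stiffness from a single static tip-deflection case.
F = 1000.0      # applied tip load, N (invented)
L = 40.0        # cantilever (blade) length, m (invented)
delta = 0.5     # tip deflection computed by the 3-D model, m (invented)

# Cantilever with tip load: delta = F * L**3 / (3 * E * I)
EI = F * L ** 3 / (3 * delta)   # equivalent bending stiffness, N·m²
```

    Comparing stiffnesses recovered this way against those computed directly from 2-D cross-section analysis is precisely the kind of cross-check whose discrepancies the paper examines.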

  6. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies were developed for the seismic margin review of NPP structures: CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  7. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions, and 2. on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying the levels of inconsistency also in the thickness direction. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency were a function of manufacturing and torque levels. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
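    The correlation step can be sketched with a plain Pearson coefficient between two ink-density series, one taken under normal printing and one after smash recovery; the density readings below are invented stand-ins for measured values.

```python
import math

# Invented ink-density readings for the same patches in two conditions.
before = [1.42, 1.38, 1.45, 1.40, 1.43, 1.39]   # normal printing
after  = [1.40, 1.36, 1.44, 1.41, 1.42, 1.37]   # after smash recovery

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

consistency = pearson(before, after)   # near 1.0 indicates good consistency
```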

  8. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions, and 2. on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying the levels of inconsistency also in the thickness direction. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency were a function of manufacturing and torque levels. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  9. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    Science.gov (United States)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss, with each leg monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be bookkept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep-space RF telecommunications, is described in this paper; it provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
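    The statistical budgeting idea can be sketched as a Monte Carlo roll-up of uncertain loss terms in dB, yielding a numerical confidence of meeting a power requirement. Every loss term, launched power, and requirement below is an invented placeholder, not a SIM value, and a uniform distribution stands in for whatever distributions a real budget would assign.

```python
import random

random.seed(5)

# Each mechanism: (name, nominal loss in dB, worst-case additional dB).
# Values are illustrative only.
loss_terms = [
    ("fiber attenuation",   1.0, 0.5),
    ("coupling efficiency", 2.0, 1.0),
    ("misalignment",        0.5, 0.8),
    ("diffraction",         0.3, 0.4),
]

launched_dbm = 10.0   # power delivered to the gauge, dBm (invented)
required_dbm = 4.0    # minimum power for metrology performance, dBm (invented)

trials = 10_000
ok = 0
for _ in range(trials):
    # Sample each extra loss uniformly between zero and its worst case.
    total_loss = sum(nom + random.uniform(0.0, extra)
                     for _, nom, extra in loss_terms)
    if launched_dbm - total_loss >= required_dbm:
        ok += 1

confidence = ok / trials   # fraction of trials meeting the requirement
```

    Working in dB makes the roll-up a simple sum of terms, which is what makes the RF-telecom budgeting machinery carry over to the optical case.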

  10. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
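    The variance-reduction idea behind stratified source-sampling can be illustrated on a toy integration problem, far removed from the reactor physics itself: fixing the number of source points drawn from each subdomain removes the between-strata component of the estimator variance. The integrand and stratum layout below are invented for illustration.

```python
import random

random.seed(7)

def sample_simple(n):
    """Conventional sampling: all n points drawn from the full domain [0, 1]."""
    return sum(random.uniform(0.0, 1.0) ** 2 for _ in range(n)) / n

def sample_stratified(n):
    """Stratified sampling: exactly n/4 points forced into each quarter."""
    k = n // 4
    total = 0.0
    for s in range(4):
        lo = s / 4
        total += sum((lo + random.uniform(0.0, 0.25)) ** 2 for _ in range(k))
    return total / (4 * k)

# Repeat each estimator many times and compare the spread of the estimates.
est_simple = [sample_simple(100) for _ in range(200)]
est_strat = [sample_stratified(100) for _ in range(200)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

var_simple, var_strat = variance(est_simple), variance(est_strat)
# Both estimate the integral of x^2 on [0, 1] (exactly 1/3), but the
# stratified estimates cluster far more tightly.
```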

  11. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
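
    PCA, one of the MVA techniques named above, can be sketched in a few lines of linear algebra. The synthetic "spectra" below (18 samples by 100 channels, mirroring only the paper's sample count; the two latent factors and noise level are assumptions, not LIBS data) show how the scores and explained-variance fractions are obtained from an SVD:

```python
import numpy as np

rng = np.random.default_rng(1)
# toy stand-in for LIBS spectra: 18 samples x 100 channels,
# generated from 2 latent "chemistry" factors plus noise
latent = rng.normal(size=(18, 2))
loadings = rng.normal(size=(2, 100))
spectra = latent @ loadings + 0.05 * rng.normal(size=(18, 100))

# PCA by SVD of the mean-centred data matrix
X = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                    # sample coordinates on the principal axes
explained = s**2 / np.sum(s**2)   # fraction of variance per component

print(explained[:3])  # the first two components dominate by construction
```

In a real application the score plot of the first few components is what separates the rock types; PLS differs in that it also uses known reference compositions when constructing the latent directions.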

  12. A Review of Emerging Analytical Techniques for Objective Physical Activity Measurement in Humans.

    Science.gov (United States)

    Clark, Cain C T; Barnes, Claire M; Stratton, Gareth; McNarry, Melitta A; Mackintosh, Kelly A; Summers, Huw D

    2017-03-01

    Physical inactivity is one of the most prevalent risk factors for non-communicable diseases in the world. A fundamental barrier to enhancing physical activity levels and decreasing sedentary behavior is our limited understanding of the associated measurement and analytical techniques. The number of analytical techniques for physical activity measurement has grown significantly, and although emerging techniques may advance analyses, little consensus is presently available and further synthesis is therefore required. The objective of this review was to identify the accuracy of emerging analytical techniques used for physical activity measurement in humans. We conducted a search of electronic databases using Web of Science, PubMed, and Google Scholar. This review included studies written in English and published between January 2010 and December 2014 that assessed physical activity using emerging analytical techniques and reported technique accuracy. A total of 2064 papers were initially retrieved from three databases. After duplicates were removed and remaining articles screened, 50 full-text articles were reviewed, resulting in the inclusion of 11 articles that met the eligibility criteria. Despite the diverse nature and the range in accuracy associated with some of the analytic techniques, the rapid development of analytics has demonstrated that more sensitive information about physical activity may be attained. However, further refinement of these techniques is needed.

  13. Comparative Analysis of Vehicle Make and Model Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Faiza Ayub Syed

    2014-03-01

    Full Text Available Vehicle Make and Model Recognition (VMMR) has emerged as a significant element of vision based systems because of its application in access control systems, traffic control and monitoring systems, security systems and surveillance systems, etc. So far a number of techniques have been developed for vehicle recognition, each following a different methodology and classification approach, with evaluation results highlighting the recognition technique with the highest accuracy level. In this paper we describe the working of various vehicle make and model recognition techniques and compare these techniques on the basis of methodology, principles, classification approach, classifier and level of recognition. After comparing these factors we concluded that Locally Normalized Harris Corner Strengths (LHNS) performs best as compared to other techniques. LHNS uses Bayes and K-NN classification approaches for vehicle classification. It extracts information from the frontal view of vehicles for vehicle make and model recognition.

  14. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600-750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis and a fluorometric discrimination technique for determining phytoplankton population was developed. For laboratory mixed samples prepared from the 43 algal species (the algae of one division accounting for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the level of division were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for the samples mixed from 32 red tide algal species (the dominant species accounting for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the level of genus were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the level of division and genus, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algae fluorescence auto-analyzer.
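
    The record does not specify which wavelet family underlies the two-rank spectra database, so as a hedged illustration of the kind of transform involved, here is a single-level Haar DWT: the approximation coefficients capture the smooth spectral shape and the detail coefficients the fine structure, with perfect reconstruction (the sample signal is an arbitrary assumption):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # low-pass: smooth trend
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass: local detail
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt: interleave the reconstructed samples."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

sig = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_dwt(sig)
print(np.allclose(haar_idwt(a, d), sig))  # perfect reconstruction
```

Applying such a transform recursively and keeping only the coarse coefficients is one common way to build the compact spectral feature vectors used for discrimination.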

  15. HPLC-MS technique for radiopharmaceuticals analysis and quality control

    Science.gov (United States)

    Macášek, F.; Búriová, E.; Brúder, P.; Vera-Ruiz, H.

    2003-01-01

    Potentialities of liquid chromatography with a mass spectrometric detector (MSD) were investigated with the objective of quality control of radiopharmaceuticals, 2-deoxy-2-[18F]fluoro-D-glucose (FDG) being an example. Screening of suitable MSD analytical lines is presented. Mass-spectrometric monitoring of the acetonitrile-aqueous ammonium formate eluent by negatively charged FDG.HCO2- ions enables isotope analysis (specific activity) of the radiopharmaceutical at m/z 227 and 226. Kryptofix® 222 provides an intense MSD signal of the positive ion associated with NH4+ at m/z 394. Expired FDG injection samples contain decomposition products, of which at least one, labelled by 18F and characterised by a signal of negative ions at m/z 207, corresponds not to FDG fragments but to C5 decomposition products. A glucose chromatographic peak, characterised by the m/z 225 negative ion, is accompanied by a tail of a component giving a signal at m/z 227, which can belong to [18O]glucose; isobaric sorbitol signals were excluded, but FDG-glucose association occurs during co-elution in the separation of model mixtures. The latter can actually lead to a convoluted chromatographic peak, but the absence of 18F makes this inconsistent. Quantification and validation of the FDG component analysis is under way.

  16. A computerised morphometric technique for the analysis of intimal hyperplasia.

    OpenAIRE

    Tennant, M; McGeachie, J K

    1991-01-01

    The aim of this study was to design, develop and employ a method for the acquisition of a significant data base of thickness measurements. The integration of standard histological techniques (step serial sectioning), modern computer technology and a personally developed software package (specifically designed for thickness measurement) produced a novel technique suitable for the task. The technique allowed the elucidation of a larger data set from tissue samples. Thus a detailed and accurate ...

  17. ESTIMATION OF ACTIVATED ENERGY OF DESORPTION OF n-HEXANE ON ACTIVATED CARBONS BY TPD TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    LIZhong; WANGHongjuan; 等

    2001-01-01

    In this paper, six kinds of activated carbons, namely Ag+-activated carbon, Cu2+-activated carbon, Fe3+-activated carbon, activated carbon, Ba2+-activated carbon and Ca2+-activated carbon, were prepared. A model for estimating the activation energy of desorption was established. Temperature-programmed desorption (TPD) experiments were conducted to measure the TPD curves of n-hexanol and then estimate the activation energy for desorption of n-hexanol on the activated carbons. Results showed that the activation energies for the desorption of n-hexanol on the Ag+-activated carbon, the Cu2+-activated carbon and the Fe3+-activated carbon were higher than those of n-hexanol on the activated carbon, the Ca2+-activated carbon and the Ba2+-activated carbon.

  19. Biomechanical analysis of cross-country skiing techniques.

    Science.gov (United States)

    Smith, G A

    1992-09-01

    The development of new techniques for cross-country skiing based on skating movements has stimulated biomechanical research aimed at understanding the various movement patterns, the forces driving the motions, and the mechanical factors affecting performance. Research methods have evolved from two-dimensional kinematic descriptions of classic ski techniques to three-dimensional analyses involving measurement of the forces and energy relations of skating. While numerous skiing projects have been completed, most have focused on either the diagonal stride or the V1 skating technique on uphill terrain. Current understanding of skiing mechanics is not sufficiently complete to adequately assess and optimize an individual skier's technique.

  20. COMPARATIVE ANALYSIS OF SATELLITE IMAGE PRE-PROCESSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Sree Sharmila

    2013-01-01

    Full Text Available Satellite images are corrupted by noise during acquisition and transmission. Removing noise from an image by attenuating its high-frequency components removes some important details as well. In order to retain the useful information and improve the visual appearance, effective denoising and resolution enhancement techniques are required. In this research, a Hybrid Directional Lifting (HDL) technique is proposed to retain the important details of the image and improve its visual appearance. A Discrete Wavelet Transform (DWT) based interpolation technique is developed for enhancing the resolution of the denoised image. The performance of the proposed techniques is tested on Land Remote-Sensing Satellite (LANDSAT) images, using the quantitative performance measures Peak Signal to Noise Ratio (PSNR) and computation time to show the significance of the proposed techniques. The PSNR of the HDL technique is 1.02 dB higher than that of the standard denoising technique, and the DWT based interpolation technique gains a further 3.94 dB. The experimental results reveal that the newly developed image denoising and resolution enhancement techniques improve image visual quality with rich textures.
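
    The PSNR figure of merit quoted above has a standard definition that is worth making concrete; a minimal sketch follows (the 64x64 random image and the sigma = 10 Gaussian noise are assumed test conditions for illustration, not data from the paper):

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak**2 / mse)

rng = np.random.default_rng(2)
clean = rng.integers(0, 256, size=(64, 64)).astype(float)
noisy = np.clip(clean + rng.normal(0, 10, size=clean.shape), 0, 255)
print(psnr(clean, noisy))  # roughly 28 dB for sigma = 10 noise
```

Because PSNR is logarithmic, the reported gains of 1.02 dB and 3.94 dB correspond to roughly 21% and 60% reductions in mean squared error, respectively.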

  1. Development of active control technique for engine noise

    Energy Technology Data Exchange (ETDEWEB)

    Uchida, H.; Nakao, N.; Butsuen, T. (Mazda Motor Corp., Hiroshima (Japan))

    1994-03-31

    As a measure to reduce engine noise in a car, the active noise control (ANC) technique, which cancels noise with a second noise in antiphase, has been studied. The conventional filtered-x LMS control algorithm has generally been applied to ANC, but the large quantity of arithmetic operations used for filtering is a practical problem. This paper proposes a new algorithm whose control effects and practicability are improved by exploiting the periodicity of engine noise and by introducing the idea of error scanning. This algorithm requires only 30-50% of the arithmetic operations of the LMS method above. Concerning the actual system structure, the arrangement and number of microphones have been examined based on detailed measurements of the spatial distribution of noise in a car. As a result, a suitable arrangement of only three microphones that reduces noise in the whole interior space of the car was found. Through experiments, a maximum noise reduction of 8 dB(A) was achieved at each seat position. 7 refs., 9 figs., 1 tab.
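
    The filtered-x LMS algorithm named above extends the plain LMS update with a model of the secondary (loudspeaker-to-error-microphone) path; the sketch below omits that path and shows only the core LMS cancellation loop on a synthetic periodic "engine-order" reference. All signals, the assumed primary path, and the parameters are illustrative assumptions, not the paper's system:

```python
import numpy as np

n, taps, mu = 5000, 16, 0.01

# periodic engine-order "noise" as the reference signal; the error
# microphone hears it through an assumed primary path (a short FIR)
t = np.arange(n)
ref = np.sin(2 * np.pi * 0.05 * t) + 0.5 * np.sin(2 * np.pi * 0.1 * t)
primary = np.convolve(ref, [0.8, -0.3, 0.1], mode="full")[:n]

w = np.zeros(taps)
err = np.zeros(n)
for i in range(taps, n):
    x = ref[i - taps + 1:i + 1][::-1]  # newest-first tap-delay line
    y = w @ x                          # anti-noise output
    err[i] = primary[i] - y            # residual at the error microphone
    w += mu * err[i] * x               # LMS weight update

# residual power drops sharply once the adaptive filter converges
print(np.mean(err[:500]**2), np.mean(err[-500:]**2))
```

The per-sample cost here is dominated by the two length-`taps` inner products; the paper's contribution is precisely a scheme that cuts this arithmetic load by exploiting the periodicity of the reference.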

  2. Effects of nanosuspension and inclusion complex techniques on the in vitro protease inhibitory activity of naproxen

    OpenAIRE

    Dharmalingam, Senthil Rajan; Chidambaram, Kumarappan; Ramamurthy, Srinivasan; Nadaraju, Shamala

    2014-01-01

    This study investigated the effects of nanosuspension and inclusion complex techniques on the in vitro trypsin inhibitory activity of naproxen, a member of the propionic acid derivatives, which are a group of antipyretic, analgesic, and non-steroidal anti-inflammatory drugs. Nanosuspension and inclusion complex techniques were used to increase the solubility and anti-inflammatory efficacy of naproxen. The evaporative precipitation into aqueous solution (EPAS) technique and the kneading metho...

  3. Getting the Most Out of Dual-Listed Courses: Involving Undergraduate Students in Discussion Through Active Learning Techniques

    Science.gov (United States)

    Tasich, C. M.; Duncan, L. L.; Duncan, B. R.; Burkhardt, B. L.; Benneyworth, L. M.

    2015-12-01

    Dual-listed courses will persist in higher education because of resource limitations. The pedagogical differences between undergraduate and graduate STEM student groups and the underlying distinction in intellectual development levels between the two student groups complicate the inclusion of undergraduates in these courses. Active learning techniques are a possible remedy to the hardships undergraduate students experience in graduate-level courses. Through an analysis of both undergraduate and graduate student experiences while enrolled in a dual-listed course, we implemented a variety of learning techniques used to complement the learning of both student groups and enhance deep discussion. Here, we provide details concerning the implementation of four active learning techniques - role play, game, debate, and small group - that were used to help undergraduate students critically discuss primary literature. Student perceptions were gauged through an anonymous, end-of-course evaluation that contained basic questions comparing the course to other courses at the university and other salient aspects of the course. These were given as a Likert scale on which students rated a variety of statements (1 = strongly disagree, 3 = no opinion, and 5 = strongly agree). Undergraduates found active learning techniques to be preferable to traditional techniques with small-group discussions being rated the highest in both enjoyment and enhanced learning. The graduate student discussion leaders also found active learning techniques to improve discussion. In hindsight, students of all cultures may be better able to take advantage of such approaches and to critically read and discuss primary literature when written assignments are used to guide their reading. Applications of active learning techniques can not only address the gap between differing levels of students, but also serve as a complement to student engagement in any science course design.

  4. Digital image processing and analysis for activated sludge wastewater treatment.

    Science.gov (United States)

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    Activated sludge systems are generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters such as total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). For the measurement, tests are conducted in the laboratory, which take many hours to give the final result. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of activated sludge but also to predict its future state. Characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment, and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge. In the latter part, additional preprocessing procedures such as z-stacking and image stitching, not previously used in the context of activated sludge, are introduced. Different preprocessing and segmentation techniques are proposed, along with a survey of imaging procedures reported in the literature. Finally, image-analysis-based morphological parameters and their correlation with the monitoring and prediction of activated sludge are discussed. It is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.

  5. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  6. Structural analysis of irradiated crotoxin by spectroscopic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina C. de; Fucase, Tamara M.; Silva, Ed Carlos S. e; Chagas, Bruno B.; Buchi, Alisson T.; Viala, Vincent L.; Spencer, Patrick J.; Nascimento, Nanci do, E-mail: kcorleto@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Biotecnologia

    2013-07-01

    Snake bites are a serious public health problem, especially in subtropical countries. In Brazil the serum, the only effective treatment in case of snake bites, is produced in horses which, despite their large size, have a reduced lifespan due to the high toxicity of the antigen. Ionizing radiation has been successfully employed to attenuate the biological activity of animal toxins. Crotoxin, the main toxic compound from Crotalus durissus terrificus (Cdt), is a heterodimeric protein composed of two subunits: crotapotin and phospholipase A2. Previous data indicated that this protein, following the irradiation process, undergoes unfolding and/or aggregation, resulting in a much less toxic antigen. The exact mechanisms and structural modifications involved in the aggregation process are not yet clear. This work investigates the effects of ionizing radiation on crotoxin employing Infrared Spectroscopy, Circular Dichroism and Dynamic Light Scattering techniques. The infrared spectrum of lyophilized crotoxin showed peaks corresponding to the vibrational spectra of the secondary structure of crotoxin, including β-sheet, random coil, α-helix and β-turns. We calculated the area of these spectral regions after adjusting for baseline and normalizing using the amide I band (1590-1700 cm⁻¹), obtaining the variation of secondary structures of the toxin following irradiation. The Circular Dichroism spectra of native and irradiated crotoxin suggest a conformational change within the molecule after the irradiation process, apparently from an ordered conformation towards a random coil. The light scattering analyses indicated that the irradiated crotoxin formed multimers with an average molecular radius 100-fold larger than that of the native toxin. (author)

  7. Error Analysis for the Airborne Direct Georeferencing Technique

    Science.gov (United States)

    Elsharkawy, Ahmed S.; Habib, Ayman F.

    2016-10-01

    Direct georeferencing was shown to be an important alternative to standard indirect image orientation using classical or GPS-supported aerial triangulation. Since direct georeferencing without ground control relies on an extrapolation process only, particular focus has to be laid on the overall system calibration procedure. The accuracy performance of integrated GPS/inertial systems for direct georeferencing in airborne photogrammetric environments has been tested extensively in recent years. In this approach, the limiting factor is a correct overall system calibration including the GPS/inertial component as well as the imaging sensor itself; remaining errors in the system calibration will significantly decrease the quality of object point determination. This research paper presents an error analysis for the airborne direct georeferencing technique, where integrated GPS/IMU positioning and navigation systems are used in conjunction with aerial cameras for airborne mapping, compared with GPS/INS-supported AT, through the implementation of a certain amount of error on the EOP and boresight parameters and a study of the effect of these errors on the final ground coordinates. The data set is a block of 32 images distributed over six flight lines; the interior orientation parameters (IOP) are known through a careful camera calibration procedure, and 37 ground control points are known through a terrestrial surveying procedure. The exact location of the camera station at the time of exposure, the exterior orientation parameters (EOP), is known through the GPS/INS integration process. The preliminary results show that, firstly, DG and GPS-supported AT have similar accuracy, and compared with the conventional aerial photography method the two technologies reduce the dependence on ground control (used only for quality control purposes). Secondly, in DG the limiting factor remains a correct overall system calibration, including the GPS/inertial component as well as the imaging sensor itself.

  8. Advanced patch-clamp techniques and single-channel analysis

    NARCIS (Netherlands)

    Biskup, B; Elzenga, JTM; Homann, U; Thiel, G; Wissing, F; Maathuis, FJM

    1999-01-01

    Much of our knowledge of ion-transport mechanisms in plant cell membranes comes from experiments using voltage-clamp. This technique allows the measurement of ionic currents across the membrane, whilst the voltage is held under experimental control. The patch-clamp technique was developed to study t

  9. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available Information provided by any application system in an organization is vital in order to reach a decision. Due to this factor, the quality of data provided by a Data Warehouse (DW) is very important if the organization is to produce the best solutions to move forward. DWs are complex systems that have to deliver highly-aggregated, high quality data from heterogeneous sources to decision makers, and they involve a great deal of integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques perform comparisons between target values and current values obtained from the systems. A prototype using PHP was developed to support the Base Analysis Technique. A sample schema from an Oracle database was then used to study the differences between applying the framework or not. The prototype was demonstrated to the selected organizations to identify whether it helps to reduce DQ problems, and questionnaires were given to respondents. Results: The results show users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: Implementation of the suggested framework in a real situation needs to be conducted to obtain more accurate results.

  10. Development of HANARO Activation Analysis System and Utilization Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Y. S.; Moon, J. H.; Cho, H. J. (and others)

    2007-06-15

    1. Establishment of evaluation system using a data for a neutron activation analysis : Improvement of NAA measurement system and its identification, Development of combined data evaluation code of NAA/PGAA, International technical cooperation project 2. Development of technique for a industrial application of high precision gamma nuclide spectroscopic analysis : Analytical quality control, Development of industrial application techniques and its identification 3. Industrial application research for a prompt gamma-ray activation analysis : Improvement of Compton suppression counting system (PGAA), Development of applied technology using a PGAA system 4. Establishment of NAA user supporting system and KOLAS management : Development and validation of KOLAS/ISO accreditation testing and identification method, Cooperation researches for a industrial application, Establishment of integrated user analytical supporting system, Accomplishment of sample irradiation facility.

  11. Association of two techniques of frontal sinus radiographic analysis for human identification

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira da SILVA

    2009-09-01

    Full Text Available Introduction: The analysis of images for human identification purposes is a routine activity in departments of forensic medicine, especially when it is necessary to identify burned bodies, skeletal remains or corpses in an advanced stage of decomposition. Case report: The feasibility and reliability of the analysis of the morphoradiographic image of the frontal sinus is shown, displayed in a posteroanterior (PA) radiography of the skull produced in life and compared to another produced post mortem. Conclusion: The results obtained in the radiographic comparison, through the association of two different techniques of analysis of the frontal sinus, allowed a positive correlation of the identity of the disappeared person with the body in an advanced stage of decomposition.

  12. A new technique for fractal analysis applied to human, intracerebrally recorded, ictal electroencephalographic signals.

    Science.gov (United States)

    Bullmore, E; Brammer, M; Alarcon, G; Binnie, C

    1992-11-09

    Application of a new method of fractal analysis to human, intracerebrally recorded, ictal electroencephalographic (EEG) signals is reported. 'Frameshift-Richardson' (FR) analysis involves estimation of the fractal dimension (FD, 1 < FD < 2) of EEG data; it is suggested that this technique offers significant operational advantages over the use of algorithms for FD estimation requiring preliminary reconstruction of EEG data in phase space. FR analysis was found to reduce substantially the volume of EEG data, without loss of diagnostically important information concerning onset, propagation and evolution of ictal EEG discharges. Arrhythmic EEG events were correlated with relatively increased FD; rhythmic EEG events with relatively decreased FD. It is proposed that development of this method may lead to: (i) enhanced definition and localisation of initial ictal changes in the EEG presumed due to multi-unit activity; and (ii) synoptic visualisation of long periods of EEG data.
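
    The Frameshift-Richardson estimator itself is not spelled out in the abstract; Higuchi's curve-length method, a related waveform FD estimator that also yields 1 < FD < 2, serves here as a hedged stand-in and reproduces the pattern reported above: arrhythmic (noise-like) signals score a high FD, rhythmic ones a low FD. The test signals are assumptions for illustration, not EEG data:

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of a 1-D signal by Higuchi's
    curve-length method (a stand-in for the FR estimator)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    lk, logk = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                      # each frame offset
            idx = np.arange(m, N, k)            # subsample at scale k
            diff = np.abs(np.diff(x[idx]))
            # normalised curve length for this offset and scale
            lengths.append(diff.sum() * (N - 1) / (len(idx) - 1) / k)
        lk.append(np.log(np.mean(lengths) / k))
        logk.append(np.log(1.0 / k))
    slope, _ = np.polyfit(logk, lk, 1)          # L(k) ~ k**(-FD)
    return slope

rng = np.random.default_rng(4)
smooth = np.sin(np.linspace(0, 8 * np.pi, 2000))  # rhythmic: FD near 1
rough = rng.normal(size=2000)                     # arrhythmic: FD near 2
print(higuchi_fd(smooth), higuchi_fd(rough))
```

Like the FR method, this works directly on the time series and needs no phase-space reconstruction.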

  13. Multidimensional scaling technique for analysis of magnetic storms at Indian observatories

    Indian Academy of Sciences (India)

    M Sridharan; A M S Ramasamy

    2002-12-01

    Multidimensional scaling is a powerful technique for the analysis of data. The latitudinal dependence of geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
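
    Classical (Torgerson) MDS, the simplest variant of the technique named in the title, recovers point coordinates from a matrix of pairwise distances by double-centring and eigendecomposition. A minimal sketch on toy one-dimensional data (an assumption for illustration, not the observatory H-component data) follows:

```python
import numpy as np

def classical_mds(D, dims=2):
    """Classical (Torgerson) multidimensional scaling from a
    symmetric matrix of pairwise distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centring matrix
    B = -0.5 * J @ (D**2) @ J             # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dims] # keep the largest eigenvalues
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

# points on a line should embed on a line: recover the 1-D structure
pts = np.array([[0.0], [1.0], [3.0], [7.0]])
D = np.abs(pts - pts.T)
emb = classical_mds(D, dims=1)
Dhat = np.abs(emb - emb.T)
print(np.allclose(Dhat, D))  # distances are reproduced exactly
```

In the paper's setting the "points" would be observatories (or storms) and the dissimilarities derived from their H-variation profiles, so that the embedding exposes the latitudinal ordering.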

  14. Applications of surface analysis techniques to photovoltaic research: Grain and grain boundary studies

    Science.gov (United States)

    Kazmerski, L. L.

    Complementary surface analysis techniques (AES, SIMS, XPS) are applied to photovoltaic devices in order to assess the limiting factors of grain and grain boundary chemistry to the performance of polycrystalline solar cells. Results of these compositional and chemical studies are directly correlated with electrical measurements (EBIC) and with resulting device performance. Examples of grain boundary passivation in polycrystalline Si and GaAs solar cells are cited. The quality of the intragrain material used in these devices is shown to be equally important to the grain boundary activity in determining overall photovoltaic performance.

  15. In Vivo Imaging Techniques: A New Era for Histochemical Analysis

    Science.gov (United States)

    Busato, A.; Feruglio, P. Fumene; Parnigotto, P.P.; Marzola, P.; Sbarbati, A.

    2016-01-01

    In vivo imaging techniques can be integrated with classical histochemistry to create an actual histochemistry of water. In particular, Magnetic Resonance Imaging (MRI), an imaging technique primarily used as a diagnostic tool in clinical/preclinical research, has excellent anatomical resolution, unlimited penetration depth and intrinsic soft tissue contrast. Thanks to technological developments, MRI is capable of providing not only morphological information but also, more interestingly, functional, biophysical and molecular information. In this paper we describe the main features of several advanced imaging techniques, such as MRI microscopy, Magnetic Resonance Spectroscopy, functional MRI, Diffusion Tensor Imaging and MRI with contrast agents, as a useful support to classical histochemistry. PMID:28076937

  16. Image analysis techniques associated with automatic data base generation.

    Science.gov (United States)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.
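The 'sequential similarity detection' idea mentioned above can be sketched in a few lines: accumulate the absolute image-template error at each candidate offset, and abandon an offset as soon as the running sum exceeds a threshold. The image, template and threshold below are invented toy values, not data from the paper.

```python
# Toy sketch of sequential similarity detection (SSDA) template matching.
# All arrays and the error threshold are invented for illustration.

def ssda_match(image, template, threshold):
    """Return the offset whose accumulated absolute error stays lowest.

    The error sum for a candidate offset is abandoned as soon as it
    exceeds `threshold`, which is what makes the test 'sequential'.
    """
    th, tw = len(template), len(template[0])
    best_offset, best_error = None, float("inf")
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            error = 0
            for i in range(th):
                for j in range(tw):
                    error += abs(image[r + i][c + j] - template[i][j])
                    if error > threshold:   # early abort: candidate rejected
                        break
                else:
                    continue
                break
            if error <= threshold and error < best_error:
                best_offset, best_error = (r, c), error
    return best_offset

image = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 9, 0],
    [0, 0, 0, 0],
]
template = [[9, 8], [7, 9]]
print(ssda_match(image, template, threshold=5))  # exact match at (1, 1)
```

The early abort is the whole point of SSDA: most wrong offsets are rejected after a handful of pixel comparisons, so no full correlation is ever computed for them.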

  17. Cepstrum Analysis: An Advanced Technique in Vibration Analysis of Defects in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    M. Satyam

    1994-01-01

    Full Text Available Conventional frequency analysis of machinery vibration is not adequate for accurately finding defects in gears, bearings, and blades where sidebands and harmonics are present, and such an approach also depends on the transmission path. Cepstrum analysis, on the other hand, accurately identifies harmonic and sideband families and is a better technique for fault diagnosis in gears, bearings, and turbine blades of ships and submarines. The cepstrum represents the global power content of a whole family of harmonics and sidebands, even when more than one family of sidebands is present at the same time. It is also insensitive to transmission path effects, since source and transmission path effects are additive and can be separated in the cepstrum. The concept, the underlying theory, and the measurement and analysis involved in using the technique are briefly outlined. Two cases are presented to demonstrate the advantage of the cepstrum technique over spectrum analysis: an LP compressor was chosen to study transmission path effects, and a marine gearbox having two sets of sideband families was studied to diagnose the problematic sideband and its severity.
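The cepstrum itself is simple to state: the inverse transform of the log magnitude spectrum. The sketch below (pure-Python DFT, fine for short signals) uses an invented echo signal, an impulse plus a half-amplitude echo 8 samples later, to mimic how a periodic sideband family shows up as a single cepstral peak.

```python
import cmath
import math

# Minimal real-cepstrum sketch. The echo signal is a made-up toy example,
# not data from the paper.

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def real_cepstrum(signal):
    # cepstrum = inverse transform of the log magnitude spectrum
    log_mag = [math.log(max(abs(v), 1e-12)) for v in dft(signal)]
    return [v.real for v in idft(log_mag)]

n, delay, echo = 64, 8, 0.5
signal = [0.0] * n
signal[0] = 1.0
signal[delay] = echo          # echo 8 samples after the impulse

ceps = real_cepstrum(signal)
peak = max(range(1, n // 2), key=lambda q: ceps[q])
print(peak)  # the echo appears as a peak at quefrency 8
```

The ripples the echo puts into the log spectrum are periodic, so they collapse to one quefrency, which is why a whole harmonic or sideband family becomes a single cepstral line.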

  18. Application of neutron activation techniques and x-ray energy dispersion spectrometry, in analysis of metallic traces adsorbed by chelex-100 resin; Ativacao das tecnicas de ativacao neutronica e espectrometria por dispersao de onda e de energia de raios X, na analise de tracos metalicos adsorvidos pela resina chelex-100

    Energy Technology Data Exchange (ETDEWEB)

    Fernandes, Jair C.; Amaral, Angela M.; Magalhaes, Jesus C.; Pereira, Jose S.J.; Silva, Juliana B. da; Auler, Lucia M.L.A. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil)]. E-mail: jcf@urano.cdtn.br

    2000-07-01

    In this work, the authors investigated optimal adsorption conditions for several groups of metallic ions (heavy-metal and transition-metal cations, metallic and metalloid oxyanions, and rare-earth cations), as traces (ppb), both separately and in mixtures of groups, on chelex-100 resin. The experiments were carried out by batch techniques in 40 mM ammonium acetate buffer solution, pH 5.52, containing 0.5 g of chelex-100 resin. After magnetic stirring for two hours, the resins were dried and submitted to X-ray energy dispersion spectrometry, X-ray fluorescence spectrometry and neutron activation analysis. The results demonstrated that chelex-100 resin quantitatively adsorbs the transition-element and rare-earth groups in both cases (separate and simultaneous adsorption)

  19. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we conduct a systematic approach to explore several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that demonstrate its accuracy, proficiency and preference.

  20. RAPD analysis : a rapid technique for differentation of spoilage yeasts

    NARCIS (Netherlands)

    Baleiras Couto, M.M.; Vossen, J.M.B.M. van der; Hofstra, H.; Huis in 't Veld, J.H.J.

    1994-01-01

    Techniques for the identification of the spoilage yeasts Saccharomyces cerevisiae and members of the Zygosaccharomyces genus from food and beverage sources were evaluated. The use of identification systems based on physiological characteristics often resulted in incomplete identification or misidentification

  1. Reticle defect sizing of optical proximity correction defects using SEM imaging and image analysis techniques

    Science.gov (United States)

    Zurbrick, Larry S.; Wang, Lantian; Konicek, Paul; Laird, Ellen R.

    2000-07-01

    Sizing of programmed defects on optical proximity correction (OPC) features is addressed using high-resolution scanning electron microscope (SEM) images and image analysis techniques. A comparison and analysis of different sizing methods is made. This paper addresses the issues of OPC defect definition and discusses the experimental measurement results obtained by SEM in combination with image analysis techniques.

  2. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry is widely applied for identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of obtained peptide-spectrum matches (PSMs needs to be evaluated also by algorithms, as a manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. 
Conclusion Our approach not only enhances the computational performance, and
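The target-decoy strategy this abstract builds on can be illustrated with a short sketch: estimate the FDR at a score threshold as the ratio of decoy hits to target hits above it, then pick the most permissive threshold that keeps the estimated FDR acceptable. All scores below are invented toy values; this is not MUMAL's model, just the baseline idea it improves on.

```python
# Hedged sketch of target-decoy FDR estimation with invented PSM scores.

def fdr_at_threshold(target_scores, decoy_scores, threshold):
    targets = sum(1 for s in target_scores if s >= threshold)
    decoys = sum(1 for s in decoy_scores if s >= threshold)
    return decoys / targets if targets else 0.0

def best_threshold(target_scores, decoy_scores, max_fdr):
    """Lowest threshold (most PSMs retained) whose estimated FDR <= max_fdr."""
    chosen = None
    for t in sorted(set(target_scores), reverse=True):
        if fdr_at_threshold(target_scores, decoy_scores, t) <= max_fdr:
            chosen = t        # keep relaxing while the FDR stays acceptable
        else:
            break
    return chosen

targets = [0.9, 0.85, 0.8, 0.7, 0.6, 0.5, 0.4]
decoys = [0.55, 0.45, 0.3, 0.2]
print(best_threshold(targets, decoys, max_fdr=0.2))  # 0.5
```

Maximizing the number of retained PSMs at a fixed error rate, rather than just estimating the error, is exactly the sensitivity question that MUDE and MUMAL address with more sophisticated decision boundaries.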

  3. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics ranging from the universe in a computer and computing in Earth sciences to multivariate data analysis, automated computation in quantum field theory, and computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated us to think over these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Science. Details of committees and sponsors are available in the PDF

  4. Applying Modern Techniques and Carrying Out English Extracurricular Activities: On the Model United Nations Activity

    Institute of Scientific and Technical Information of China (English)

    Xu Xiaoyu; Wang Jian

    2004-01-01

    This paper is an introduction to the extracurricular activity of the Model United Nations at Northwestern Polytechnical University (NPU); it focuses on the application of modern techniques in the activity and the pedagogical theories applied in it. An interview and questionnaire study reveal the influence of the Model United Nations.

  5. ACTIVATION-ENERGY SPECTRA FOR STRESS-INDUCED ORDERING IN AMORPHOUS MATERIALS CALCULATED USING FOURIER TECHNIQUES

    NARCIS (Netherlands)

    KASARDOVA, A; Ocelik, Vaclav; CSACH, K; MISKUF, J

    1995-01-01

    A method for calculating the activation energy spectrum from isothermal data using Fourier techniques is used for studying the deformation processes in amorphous metals. The influence of experimental error on the calculated spectrum is discussed. The activation energy spectrum derived from the anela

  6. Reconstructing muscle activation during normal walking: a comparison of symbolic and connectionist machine learning techniques

    NARCIS (Netherlands)

    Heller, Ben W.; Veltink, Peter H.; Rijkhoff, Nico J.M.; Rutten, Wim L.C.; Andrews, Brian J.

    1993-01-01

    One symbolic (rule-based inductive learning) and one connectionist (neural network) machine learning technique were used to reconstruct muscle activation patterns from kinematic data measured during normal human walking at several speeds. The activation patterns (or desired outputs) consisted of sur

  7. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.
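As a toy illustration of the fault tree analysis (FTA) surveyed above, the sketch below evaluates the top-event probability of a small AND/OR tree, assuming independent basic events. The gate structure and failure probabilities are invented for illustration, not taken from the report.

```python
# Minimal fault-tree evaluation sketch, assuming independent basic events.
# Nodes are tuples: ("basic", name), ("and", [children]), ("or", [children]).

def evaluate(node, probs):
    kind = node[0]
    if kind == "basic":
        return probs[node[1]]
    children = [evaluate(c, probs) for c in node[1]]
    if kind == "and":            # all children must fail
        p = 1.0
        for c in children:
            p *= c
        return p
    if kind == "or":             # any child failing fails the gate
        p = 1.0
        for c in children:
            p *= (1.0 - c)
        return 1.0 - p
    raise ValueError(kind)

# Invented top event: protection fails if (sensor AND its backup) fail,
# OR the processing unit fails.
tree = ("or", [("and", [("basic", "sensor"), ("basic", "backup")]),
               ("basic", "cpu")])
probs = {"sensor": 0.01, "backup": 0.02, "cpu": 0.001}
print(evaluate(tree, probs))  # 1 - (1 - 0.01*0.02) * (1 - 0.001)
```

As the report notes, the specific numbers matter less than the insight the tree gives into which failures dominate; here the single-point CPU failure dominates the redundant sensor pair.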

  8. Hydrogeological activity of lineaments in Yaoundé Cameroon region using remote sensing and GIS techniques

    Directory of Open Access Journals (Sweden)

    William Teikeu Assatse

    2016-06-01

    Full Text Available Though the Yaoundé zone is characterized by abundant rains, access to safe drinking water is becoming difficult because of climate change and pollution caused by human activities. Lineament zones on the earth's surface are important elements in understanding the dynamics of subsurface fluid flow. However, good exposures of these features are lacking in some areas around Yaoundé that are characterized by thick alteration; during field surveys these conditions, in many cases, hinder the proper characterization of such features. Therefore, an approach that identifies regional lineaments on remote-sensing images (Landsat Thematic Mapper and shaded digital terrain models), with its large-scale synoptic coverage, is promising. This paper aims to describe the structural organization of the lineament network in the crystalline basement of Yaoundé from remote sensing data and to characterize it by statistical and geostatistical techniques. The results were validated against the geological maps, the hydrogeological maps and the outcrop data. Statistical analysis of the lineament network shows a distribution along the N0–10, N20–30, N40–60 and N140–150 directions. The correlation between the productivity of high-yield wells and the closest lineaments confirms that these lineaments are surface traces of regional discontinuities and act as main groundwater flow paths.

  9. Molecular field analysis (MFA) and other QSAR techniques in development of phosphatase inhibitors.

    Science.gov (United States)

    Nair, Pramod C

    2011-01-01

    Phosphatases are well-known drug targets for diseases such as diabetes, obesity and other autoimmune diseases. Their role in cancer stems from their unusual expression patterns in different types of cancer, and there is strong evidence for selective targeting of phosphatases in cancer therapy. Several experimental and in silico techniques have been attempted for the design of phosphatase inhibitors, with a focus on diseases such as diabetes, inflammation and obesity; their utility for cancer therapy is limited and remains to be explored. Quantitative Structure-Activity Relationship (QSAR) analysis is a well-established in silico ligand-based drug design technique, used by medicinal chemists for the prediction of ligand binding affinity and for lead design. These techniques have shown promise for the subsequent optimization of existing lead compounds, with the aim of increased potency and improved pharmacological properties for a particular drug target. Furthermore, their utility in virtual screening and scaffold hopping has been highlighted in recent years. This review focuses on recent molecular field analysis (MFA) and QSAR techniques directed at the design and development of phosphatase inhibitors and their potential use in cancer therapy. In addition, this review also addresses issues concerning the binding orientation and binding conformation of ligands for alignment-sensitive QSAR approaches.

  10. A technique for detecting antifungal activity of proteins separated by polyacrylamide gel electrophoresis.

    Science.gov (United States)

    De Bolle, M F; Goderis, I J; Terras, F R; Cammue, B P; Broekaert, W F

    1991-06-01

    A technique was developed for the detection of antifungal activity of proteins after discontinuous polyacrylamide gel electrophoresis under native conditions. The antifungal activity is detected as growth inhibition zones in a homogeneous fungal lawn, grown in an agar layer spread on top of the polyacrylamide gel. The position of proteins with antifungal activity can be determined on a diffusion blot prepared from the same gel. The technique is illustrated for three antifungal plant proteins, i.e. alpha-purothionin, Urtica dioica agglutinin, and tobacco chitinase.

  11. IN VITRO ANALYSIS OF MIGRATION ACTIVITY OF ENCEPHALITOGENIC T CELLS

    Directory of Open Access Journals (Sweden)

    M. A. Nosov

    2010-01-01

    Full Text Available Experimental autoimmune encephalomyelitis in an adoptive transfer model is caused by injecting an animal with activated T cells specific for a CNS antigen, e.g., myelin basic protein. Development of autoimmune inflammation in such a model is connected with the changed functional state of encephalitogenic (EG) T cells in the course of disease progression, as reflected by changes in their activation, proliferation and motility levels. The present work describes an original technique allowing for in vitro analysis of encephalitogenic T cell motility, and for studying the effects of certain components of the extracellular matrix upon the migration and functional activities of EG T cells.

  12. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  13. Comparative study of Authorship Identification Techniques for Cyber Forensics Analysis

    Directory of Open Access Journals (Sweden)

    Smita Nirkhi

    2013-06-01

    Full Text Available Authorship identification techniques are used to identify the most likely author of an online message from a group of potential suspects and to find evidence to support the conclusion. Cybercriminals misuse online communication, for example to send blackmail or spam email, and then attempt to hide their true identities to avoid detection. Authorship identification of online messages is a contemporary research issue for identity tracing in cyber forensics. The area is highly interdisciplinary, as it draws on machine learning, information retrieval, and natural language processing. In this paper, a study of recent techniques and automated approaches to attributing authorship of online messages is presented. The focus of this review is to summarize the existing authorship identification techniques used in the literature to identify the authors of online messages. It also discusses evaluation criteria and parameters for authorship attribution studies and lists open questions that will attract future work in this area.
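As a toy illustration of the stylometric side of authorship identification (not a specific technique from the survey), the sketch below attributes a message to the candidate author whose character-frequency profile is nearest. All training texts are invented.

```python
from collections import Counter

# Toy stylometric attribution: nearest character-frequency profile.
# Author names and texts are invented for illustration.

def profile(text):
    counts = Counter(c for c in text.lower() if c.isalpha())
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

def distance(p, q):
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

def attribute(message, authors):
    profiles = {name: profile(" ".join(texts)) for name, texts in authors.items()}
    msg = profile(message)
    return min(profiles, key=lambda name: distance(msg, profiles[name]))

authors = {
    "alice": ["aaaa bbbb aaaa", "aaab baaa"],
    "bob": ["zzzz yyyy zzzz", "zzzy yzzz"],
}
print(attribute("aaba abaa", authors))  # alice
```

Real systems replace the character frequencies with richer feature sets (function words, n-grams, syntax) and the nearest-centroid rule with trained classifiers, but the pipeline shape is the same.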

  14. MAG4 Versus Alternative Techniques for Forecasting Active-Region Flare Productivity

    Science.gov (United States)

    Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor

    2014-01-01

    MAG4 is a technique of forecasting an active region's rate of production of major flares in the coming few days from a free-magnetic-energy proxy. We present a statistical method of measuring the difference in performance between MAG4 and comparable alternative techniques that forecast an active region's major-flare productivity from alternative observed aspects of the active region. We demonstrate the method by measuring the difference in performance between the "Present MAG4" technique and each of three alternative techniques, called "McIntosh Active-Region Class," "Total Magnetic Flux," and "Next MAG4." We do this by using (1) the MAG4 database of magnetograms and major-flare histories of sunspot active regions, (2) the NOAA table of the major-flare productivity of each of 60 McIntosh active-region classes of sunspot active regions, and (3) five technique-performance metrics (Heidke Skill Score, True Skill Score, Percent Correct, Probability of Detection, and False Alarm Rate) evaluated from 2000 random two-by-two contingency tables obtained from the databases. We find that (1) Present MAG4 far outperforms both McIntosh Active-Region Class and Total Magnetic Flux, (2) Next MAG4 significantly outperforms Present MAG4, (3) the performance of Next MAG4 is insensitive to the forward and backward temporal windows used, in the range of one to a few days, and (4) forecasting from the free-energy proxy in combination with either any broad category of McIntosh active-region classes or any Mount Wilson active-region class gives no significant performance improvement over forecasting from the free-energy proxy alone (Present MAG4).
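The five performance metrics named above have standard definitions on a two-by-two contingency table with hits a, false alarms b, misses c and correct negatives d. A short sketch with invented counts; note that "false alarm rate" is sometimes defined as b/(b+d) (probability of false detection) rather than the ratio b/(a+b) computed here.

```python
# Forecast-verification metrics from a 2x2 contingency table.
# The counts below are invented toy values, not MAG4 data.

def skill_scores(a, b, c, d):
    n = a + b + c + d
    pod = a / (a + c)                      # probability of detection
    far = b / (a + b)                      # false alarm ratio
    pc = (a + d) / n                       # percent correct
    tss = pod - b / (b + d)                # true skill score (Hanssen-Kuipers)
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return {"POD": pod, "FAR": far, "PC": pc, "TSS": tss, "HSS": hss}

scores = skill_scores(a=40, b=10, c=20, d=30)
print(scores["HSS"])  # 0.4 for this table
```

Evaluating these metrics over many random tables drawn from the databases, as the abstract describes, gives a distribution for each score rather than a single point estimate, which is what allows the performance differences to be called significant.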

  15. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. Using winter data from Shanghai, we show how to use the technique to analyze the fire risk.
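A minimal sketch of a normal information-diffusion estimate, assuming a Gaussian diffusion kernel: each crisp observation spreads one unit of information over a set of monitoring points, and the normalized sums give an estimated risk distribution usable even for a small sample. The sample, monitoring points and bandwidth below are invented toy values, not the Shanghai data.

```python
import math

# Normal information-diffusion sketch with invented toy data.

def diffuse(sample, points, h):
    freq = [0.0] * len(points)
    for x in sample:
        weights = [math.exp(-((x - u) ** 2) / (2 * h * h)) for u in points]
        total = sum(weights)
        for j, w in enumerate(weights):
            freq[j] += w / total          # each observation contributes 1 unit
    total = sum(freq)
    return [f / total for f in freq]      # normalized risk distribution

fires = [2, 3, 3, 5, 8]                   # e.g. yearly incident counts (toy)
points = list(range(0, 11))               # monitoring points
p = diffuse(fires, points, h=1.0)
exceed = sum(pj for pj, u in zip(p, points) if u >= 5)
print(round(exceed, 3))                   # estimated P(count >= 5)
```

Compared with raw frequency counting on five observations, the diffusion spreads each observation's information across neighbouring points, which is what makes the small-sample estimate smoother and less brittle.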

  16. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behavior have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behavior. However, the data mining method has disadvantages as well as advantages, so it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to show how suitable data mining methods can improve on conventional methods. Moreover, in an experiment, association rules are employed to mine rules for trusted customers using sales data from a supermarket.
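The association-rule mining used in the experiment reduces, at its core, to support and confidence counts over transactions. A toy sketch with invented basket data:

```python
# Support and confidence for association rules over toy transactions.
# The baskets and the rule {bread} -> {milk} are invented examples.

def support(itemset, transactions):
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def confidence(lhs, rhs, transactions):
    # fraction of lhs-containing baskets that also contain rhs
    return support(lhs | rhs, transactions) / support(lhs, transactions)

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter", "bread"},
    {"milk", "eggs"},
]
lhs, rhs = {"bread"}, {"milk"}
print(support(lhs | rhs, transactions))    # 0.5
print(confidence(lhs, rhs, transactions))  # 2/3
```

Algorithms such as Apriori simply search the itemset lattice for all rules whose support and confidence clear user-set thresholds, pruning supersets of infrequent itemsets.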

  17. Opportunities for innovation in neutron activation analysis

    NARCIS (Netherlands)

    Bode, P.

    2011-01-01

    Neutron activation laboratories worldwide are at a turning point at which new staff has to be found for the retiring pioneers from the 1960s–1970s. A scientific career in a well-understood technique, often characterized as ‘mature’ may only be attractive to young scientists if still challenges for f

  18. Tape Stripping Technique for Stratum Corneum Protein Analysis

    DEFF Research Database (Denmark)

    Clausen, Maja-Lisa; Slotved, H-C; Krogfelt, Karen A

    2016-01-01

    The aim of this study was to investigate the amount of protein in the stratum corneum of atopic dermatitis (AD) patients and healthy controls using the tape stripping technique, and to compare two different methods for protein assessment. Tape stripping was performed in AD patients and healthy

  19. Analysis on Poe's Unique Techniques to Achieve Aestheticism

    Institute of Scientific and Technical Information of China (English)

    孔佳鸣

    2008-01-01

    Edgar Allan Poe was one of the most important poets in American poetic history for his unremitting pursuit of 'ideal beauty'. This essay proves, by various examples chosen from his poems, that his aestheticism was evident in his versification techniques. His poetic theory and practice gave an immortal example for the development of English poetry.

  20. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente;

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10^8 Pa using particle diameters of 1.7 μm. This increases the efficiency, the resolution and the speed of the separation. Four aque...

  1. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique and the IR Contrast feature imaging application, which are based on the models provided in this paper.
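The half-max idea described above can be sketched on a 1-D contrast profile: walk outward from the peak to the first samples that fall below half the maximum and interpolate the crossing positions. The profile values are invented, and this is only an illustration of the general idea, not the IR Contrast implementation.

```python
# Half-max width estimate for a 1-D contrast profile (toy values).

def half_max_width(profile):
    peak = max(profile)
    half = peak / 2.0
    p = profile.index(peak)

    def crossing(i, step):
        # walk outward from the peak while samples stay at or above half-max
        while 0 <= i + step < len(profile) and profile[i + step] >= half:
            i += step
        j = i + step
        if not 0 <= j < len(profile):
            return float(i)               # profile never drops below half-max
        # linear interpolation between last sample above and first below
        frac = (profile[i] - half) / (profile[i] - profile[j])
        return i + step * frac

    return crossing(p, 1) - crossing(p, -1)

profile = [0.0, 0.1, 0.5, 1.0, 0.5, 0.1, 0.0]
print(half_max_width(profile))  # 2.0: crossings at exactly 2.0 and 4.0
```

Applied along a line through an indication in the contrast image, this full-width-at-half-maximum distance is what serves as the width estimate.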

  2. Combination of electrochemical, spectrometric and other analytical techniques for high throughput screening of pharmaceutically active compounds.

    Science.gov (United States)

    Suzen, Sibel; Ozkan, Sibel A

    2010-08-01

    Recently, the use of electrochemistry, and its combination with spectroscopic and other analytical techniques, has become one of the important approaches in drug discovery and research, as well as in quality control, drug stability studies, determination of physiological activity, and measurement of neurotransmitters. Many fundamental physiological processes depend on oxidation-reduction reactions in the body. Therefore, it may be possible to find connections between electrochemical and biochemical reactions concerning electron transfer pathways. The application of electrochemical techniques to redox-active drug development and studies is one of the recent interests in drug discovery. This review covers the latest developments related to the use of electrochemical techniques in drug research, with emphasis on possible combinations of spectrometric methods with electrochemical techniques.

  3. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    Excerpt from the report's front matter: Table 3, HAZOP Guide Words for Software or System Interface Analysis; Table 4, Example System of Systems Architecture Table. The excerpt notes that the analysis steps are applicable to a software HAZOP, beginning with planning: establish the HAZOP analysis goals, definitions, worksheets, schedule and process, and divide the system into analysis elements. An example guide word, "Subtle," denotes an output whose value is wrong but cannot be detected.

  4. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  6. Effect of Preparation Techniques of Y-Mo/HZSM-5 on Its Activity in Methane Aromatization

    Institute of Scientific and Technical Information of China (English)

    Qiying Wang; Weiming Lin

    2004-01-01

    The production of benzene directly from methane aromatization under oxygen-free condition is currently a new focus in natural gas utilization. The influence of preparation techniques of the catalysts on their catalytic activities is studied in this paper. The influencing factors include the impregnating method, the calcination temperature, the promoter content and the acidity of the zeolite support. Optimum preparation techniques for the catalysts are obtained through this work.

  7. The composition-explicit distillation curve technique: Relating chemical analysis and physical properties of complex fluids.

    Science.gov (United States)

    Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L

    2010-04-16

    The analysis of complex fluids such as crude oils, fuels, vegetable oils and mixed waste streams poses significant challenges arising primarily from the multiplicity of components, the different properties of the components (polarity, polarizability, etc.) and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses, and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation of state development for the complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. By far, the most widely used analytical technique we have used with the ADC is gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oils streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids.

  8. Diagnostic Application of Absolute Neutron Activation Analysis in Hematology

    Energy Technology Data Exchange (ETDEWEB)

    Zamboni, C.B.; Oliveira, L.C.; Dalaqua, L. Jr.

    2004-10-03

    The Absolute Neutron Activation Analysis (ANAA) technique was used to determine the concentrations of Cl and Na in the blood of a healthy group (male and female blood donors) selected from blood banks in the city of Sao Paulo, to provide reference information that can help in the diagnosis of patients. The study also permitted a discussion of the advantages and limitations of using this nuclear methodology in hematological examinations.

  9. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences in cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems level.
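
    The core PoDA idea, that cases resemble other cases more than they resemble controls over a pathway's SNPs, can be illustrated with a toy score. This is a sketch, not the published PoDA statistic; the genotype vectors (0/1/2 minor-allele counts) and the matching-based similarity are illustrative assumptions:

```python
def distinction_score(cases, controls):
    """Toy PoDA-style score: mean within-class similarity minus mean
    between-class similarity over pathway SNP genotype vectors
    (0/1/2 = minor-allele counts). Positive values suggest the
    pathway's SNPs separate cases from controls."""
    def sim(a, b):  # fraction of SNPs with identical genotype
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def mean(vals):
        vals = list(vals)
        return sum(vals) / len(vals)

    within = mean(sim(a, b) for group in (cases, controls)
                  for i, a in enumerate(group) for b in group[i + 1:])
    between = mean(sim(a, b) for a in cases for b in controls)
    return within - between

cases = [[0, 1, 2, 1], [0, 1, 2, 2], [0, 1, 1, 1]]
controls = [[2, 0, 0, 1], [2, 0, 0, 0], [2, 1, 0, 1]]
score = distinction_score(cases, controls)  # > 0: within-class similarity dominates
```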

  10. Thermal imaging for detection of SM45C subsurface defects using active infrared thermography techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yoon Jae; Ranjit, Shrestha; Kim, Won Tae [Kongju National University, Cheonan (Korea, Republic of)]

    2015-06-15

    Active thermography techniques can inspect a broad area in a single measurement. By evaluating the phase difference between a defective area and the surrounding sound area, the technique indicates the location and approximate size of a defect. Previous work developed defect detection methods using a variety of materials and test specimens. In this study, the proposed lock-in technique is verified with artificial specimens containing subsurface defects of different sizes and depths. Finally, the defect detection capability was evaluated by comparing the phase image and the amplitude image according to the size and depth of the defects.
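
    The phase evaluation central to lock-in thermography is commonly done with the four-bucket method, sampling each pixel four times per heating cycle. A minimal per-pixel sketch (the synthetic readings below are illustrative, not data from this study):

```python
import math

def lockin_phase_amplitude(s1, s2, s3, s4):
    """Four-point lock-in demodulation for one pixel: s1..s4 are thermal
    readings taken at quarter-period intervals of the sinusoidal heating."""
    phase = math.atan2(s1 - s3, s2 - s4)  # radians; defect contrast appears here
    amplitude = 0.5 * math.sqrt((s1 - s3) ** 2 + (s2 - s4) ** 2)
    return phase, amplitude

# Synthetic pixel: T(t) = offset + A*sin(2*pi*t/T + phi), sampled at quarter periods.
A, phi, offset = 2.0, 0.6, 30.0
samples = [offset + A * math.sin(2 * math.pi * k / 4 + phi) for k in range(4)]
ph, amp = lockin_phase_amplitude(*samples)  # recovers phi and A
```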

  11. A Low Latency Electrocardiographic QRS Activity Recovery Technique for Use on the Upper Left Arm

    Directory of Open Access Journals (Sweden)

    William D. Lynn

    2014-07-01

    Empirical mode decomposition is used as a low-latency method of recovering the cardiac ventricular activity (QRS) biopotential signals recorded from the upper arm. The recovery technique is tested and compared with the industry-accepted technique of signal averaging, using a database of “normal” rhythm traces from bipolar ECG leads along the left arm, recorded from patient volunteers at a cardiology day procedure clinic. The same partial recomposition technique is applied to recordings taken using an innovative dry electrode technology supplied by Plessey Semiconductors. In each case, signal-to-noise ratio (SNR) is used as the metric for comparison.
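
    The SNR comparison metric used here is straightforward to compute from a recovered trace and its noise residual; a minimal sketch (the toy traces below are illustrative, not from the paper):

```python
import math

def snr_db(signal, residual):
    """SNR in dB from a recovered signal and its noise residual,
    using mean-square power: 10 * log10(Psignal / Pnoise)."""
    p_sig = sum(x * x for x in signal) / len(signal)
    p_noise = sum(x * x for x in residual) / len(residual)
    return 10.0 * math.log10(p_sig / p_noise)

# Toy recovered QRS-like trace and the residual left after decomposition.
recovered = [2.0, -2.0, 2.0, -2.0]
residual = [0.2, -0.2, 0.2, -0.2]
snr = snr_db(recovered, residual)  # power ratio 100 -> 20 dB
```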

  12. Cooking techniques improve the levels of bioactive compounds and antioxidant activity in kale and red cabbage.

    Science.gov (United States)

    Murador, Daniella Carisa; Mercadante, Adriana Zerlotti; de Rosso, Veridiana Vera

    2016-04-01

    The aim of this study is to investigate the effects of different home cooking techniques (boiling, steaming, and stir-frying) on the levels of bioactive compounds (carotenoids, anthocyanins and phenolic compounds) in kale and red cabbage, determined by high-performance liquid chromatography coupled with photodiode array and mass spectrometry detectors (HPLC-DAD-MS(n)), and on the antioxidant activity evaluated by ABTS, ORAC and cellular antioxidant activity (CAA) assays. The steaming technique resulted in a significant increase in the phenolic content of kale (86.1%). In kale, steaming also resulted in significant increases in antioxidant activity in all of the evaluation methods. In red cabbage, boiling resulted in a significant increase in antioxidant activity in the ABTS assay but a significant decrease in the ORAC assay. According to the CAA assay, the stir-fried sample displayed the highest antioxidant activity.

  13. Antioxidant activity of Galium mollugo L. extracts obtained by different recovery techniques

    Directory of Open Access Journals (Sweden)

    Milić Petar S.

    2013-01-01

    The yield of extractive substances, antioxidant activity, and total phenolic and total flavonoid contents of aqueous-ethanolic extracts obtained from the aerial parts of Galium mollugo L. by different extraction techniques (maceration, reflux and ultrasonic extraction) are reported. The antioxidant activity of the extracts was tested by measuring their ability to scavenge the stable DPPH free radical, while the total phenolic and total flavonoid contents were determined according to the Folin-Ciocalteu procedure and a colorimetric method, respectively. Duncan's multiple range test was used to evaluate whether there were significant differences among the yields of extractive substances, total phenolics, total flavonoids and EC50 values for the extracts obtained by the different extraction techniques. The extracts obtained by reflux extraction contained higher amounts of extractive substances, as well as phenolic and flavonoid compounds, and showed better antioxidant activity than those obtained by the two other recovery techniques.

  14. Activation analysis of meteorites. 3

    Energy Technology Data Exchange (ETDEWEB)

    Nagai, H.; Honda, M.; Sato, H. [Nihon Univ., College of Humanities and Sciences, Tokyo (Japan); Ebihara, M.; Oura, Y.; Setoguchi, M. [Tokyo Metropolitan Univ., Faculty of Science, Tokyo (Japan)

    2001-07-01

    A long-lived cosmogenic nuclide, {sup 53}Mn, has been determined in extra-terrestrial materials using the well-thermalized neutron flux in the DR-1 hole of the JRR-3M reactor. The neutron flux intensity varies with depth, whereas the fast/thermal ratio is nearly constant. By this method, {sup 53}Mn contents in iron meteorites and metal phases in general can be routinely determined in many samples. The chemical separation method has been modified, and a convenient short-cut method has been proposed to shorten the process: the activities of {sup 54}Mn are counted just after the irradiation, without further purification of manganese. (author)
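
    The activity induced by such irradiations follows the standard activation equation A = N·σ·φ·(1 − e^(−λt)). A sketch with illustrative {sup 55}Mn(n,γ){sup 56}Mn-like values (the cross-section, flux and times are examples, not the irradiation parameters of this work):

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """Induced activity (Bq): A = N * sigma * phi * (1 - exp(-lambda*t))
    for N target atoms, capture cross-section sigma (cm^2) and
    neutron flux phi (n cm^-2 s^-1)."""
    lam = math.log(2.0) / half_life_s
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

# Saturation check: for t >> half-life, A approaches N * sigma * phi.
half_life = 2.58 * 3600  # ~2.58 h, 56Mn-like
sat = induced_activity(1e20, 13.3e-24, 1e13, half_life, 100 * half_life)
```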

  15. A COMPARISON OF SOME STATISTICAL TECHNIQUES FOR ROAD ACCIDENT ANALYSIS

    NARCIS (Netherlands)

    OPPE, S INST ROAD SAFETY RES, SWOV

    1992-01-01

    At the TRRL/SWOV Workshop on Accident Analysis Methodology, held in Amsterdam in 1988, the need to establish a methodology for the analysis of road accidents was firmly stated by all participants. Data from different countries cannot be compared because there is no agreement on research methodology.

  16. A Survey of Techniques for Security Architecture Analysis

    Science.gov (United States)

    2003-05-01

    [Fragmentary record text] Glossary excerpts: "... Effects Analysis; FPG: Failure Propagation Graph; FTA: Fault Tree Analysis; HAZOP: Hazard and Operability studies; IATF: Information Assurance Technical ..." Body excerpts: "... represent logical places, within an information system, where people can perform their work by means of software acting on their behalf. People who ..." "... describes the resources used to support the DIE (including, for example, hardware, software, communication networks, applications and qualified staff)."

  17. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

    Sentiment analysis is concerned with the analysis of emotions and opinions in text; it is also referred to as opinion mining. Sentiment analysis finds and justifies the sentiment of a person with respect to a given source of content. Social media contain a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this data is very useful for gauging the opinion of the mass. Twitter sentiment analysis is tricky compared to broad sentiment analysis because of slang words, misspellings and repeated characters. The maximum length of each tweet on Twitter is 140 characters, so it is very important to identify the correct sentiment of each word. In our project, we propose a highly accurate model of sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of feature vectors and classifiers such as support vector machine and Naïve Bayes, we classify these tweets as positive, negative and neutral to give the sentiment of each tweet.
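
    A word-count Naive Bayes classifier of the kind the abstract mentions can be sketched in a few lines; the tweets and labels below are toy data, not the authors' dataset:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (tokens, label). Returns class counts, per-class
    word counts and the vocabulary for multinomial Naive Bayes."""
    labels = Counter(lbl for _, lbl in docs)
    words = defaultdict(Counter)
    vocab = set()
    for toks, lbl in docs:
        words[lbl].update(toks)
        vocab.update(toks)
    return labels, words, vocab

def classify(tokens, labels, words, vocab):
    """Pick the label maximizing log P(label) + sum log P(token|label),
    with Laplace (add-one) smoothing."""
    total = sum(labels.values())
    best, best_lp = None, float("-inf")
    for lbl, n in labels.items():
        lp = math.log(n / total)
        denom = sum(words[lbl].values()) + len(vocab)
        for t in tokens:
            lp += math.log((words[lbl][t] + 1) / denom)
        if lp > best_lp:
            best, best_lp = lbl, lp
    return best

train = [(["awesome", "movie"], "pos"), (["loved", "it"], "pos"),
         (["terrible", "movie"], "neg"), (["hated", "it"], "neg")]
model = train_nb(train)
pred = classify(["awesome", "it"], *model)
```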

  18. Facilitating the analysis of immunological data with visual analytic techniques.

    Science.gov (United States)

    Shih, David C; Ho, Kevin C; Melnick, Kyle M; Rensink, Ronald A; Kollmann, Tobias R; Fortuno, Edgardo S

    2011-01-02

    Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and flexibility of a VA approach in the analysis of biological datasets. Examples of such datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores analytical power to the hands of the analyst by allowing engagement in a real-time data exploration process. We selected the VA software Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.

  19. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software is comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we review the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. Finally, the study highlights the importance of security requirements: although they are part of the non-functional requirements, they are fundamental to secure software development.

  20. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Arsenic is a non-metallic constituent present naturally in groundwater due to certain minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may originate from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water using high-tech instruments such as an atomic absorption spectrometer with hydride generation. Because arsenic concentrations as low as 1 ppb cannot easily be determined with a simple spectrophotometric technique, the spectrophotometric method using silver diethyldithiocarbamate was modified to achieve better results, down to an arsenic concentration of 1 ppb.
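
    Spectrophotometric methods of this kind rest on a Beer-Lambert calibration curve fitted to standards and inverted for unknowns; a minimal sketch (the absorbance readings below are hypothetical, not the paper's data):

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = m*x + b for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return m, my - m * mx

# Hypothetical absorbance readings for arsenic standards (ppb).
conc = [0.0, 5.0, 10.0, 20.0]
absb = [0.002, 0.051, 0.100, 0.198]
m, b = fit_line(conc, absb)
unknown_ppb = (0.075 - b) / m  # invert the curve for a sample reading
```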

  1. Magnetic resonance elastography (MRE) in cancer: Technique, analysis, and applications

    Science.gov (United States)

    Pepin, Kay M.; Ehman, Richard L.; McGee, Kiaran P.

    2015-01-01

    Tissue mechanical properties are significantly altered with the development of cancer. Magnetic resonance elastography (MRE) is a noninvasive technique capable of quantifying tissue mechanical properties in vivo. This review describes the basic principles of MRE and introduces some of the many promising MRE methods that have been developed for the detection and characterization of cancer, evaluation of response to therapy, and investigation of the underlying mechanical mechanisms associated with malignancy. PMID:26592944
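
    A common way MRE converts images into numbers is to turn the measured shear wavelength at the driving frequency into a stiffness estimate via mu = rho * (f * lambda)^2. A sketch with illustrative soft-tissue values (not figures from this review):

```python
def shear_stiffness(density_kg_m3, freq_hz, wavelength_m):
    """Basic MRE stiffness estimate mu = rho * (f*lambda)^2 (Pa),
    from the shear-wave wavelength imaged at the driving frequency."""
    v = freq_hz * wavelength_m  # shear-wave speed (m/s)
    return density_kg_m3 * v ** 2

# Example: 60 Hz driver, 2.5 cm wavelength, tissue density ~1000 kg/m^3.
mu = shear_stiffness(1000.0, 60.0, 0.025)  # 2250 Pa = 2.25 kPa
```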

  2. Analysis of Acoustic Emission Signals using Wavelet Transformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to detect the occurrence of any crack-growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to the above phenomena. With various filtering/thresholding techniques, it was found that the original signals were filtered out along with the noise. The wavelet transformation technique is found to be more appropriate for analysing AE signals under such situations, and is used to de-noise the AE data. The de-noised signal is classified to identify a signature based on the type of phenomenon. Defence Science Journal, 2008, 58(4), pp. 559-564, DOI: http://dx.doi.org/10.14429/dsj.58.1677
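
    De-noising by wavelet transformation amounts to transforming, thresholding the detail coefficients, and inverting. A one-level Haar sketch (the Haar basis, threshold value and signal are illustrative choices, not necessarily those of the paper):

```python
import math

def haar_denoise(signal, threshold):
    """One-level Haar wavelet de-noising: forward transform, hard-threshold
    the detail coefficients, inverse transform. Length must be even."""
    s2 = math.sqrt(2.0)
    approx = [(a + b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [d if abs(d) > threshold else 0.0 for d in detail]  # drop small (noise) details
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s2, (a - d) / s2]
    return out

noisy = [1.0, 1.1, 4.0, 3.9, 1.0, 0.9, 0.0, 0.1]
clean = haar_denoise(noisy, threshold=0.5)  # small pairwise jitter averaged out
```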

  3. Infrared Spectroscopy of Explosives Residues: Measurement Techniques and Spectral Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Mark C.; Bernacki, Bruce E.

    2015-03-11

    Infrared laser spectroscopy of explosives is a promising technique for standoff and non-contact detection applications. However, the interpretation of spectra obtained in typical standoff measurement configurations presents numerous challenges. Understanding the variability in observed spectra from explosives residues and particles is crucial for design and implementation of detection algorithms with high detection confidence and low false alarm probability. We discuss a series of infrared spectroscopic techniques applied toward measuring and interpreting the reflectance spectra obtained from explosives particles and residues. These techniques utilize the high spectral radiance, broad tuning range, rapid wavelength tuning, high scan reproducibility, and low noise of an external cavity quantum cascade laser (ECQCL) system developed at Pacific Northwest National Laboratory. The ECQCL source permits measurements in configurations which would be either impractical or overly time-consuming with broadband, incoherent infrared sources, and enables a combination of rapid measurement speed and high detection sensitivity. The spectroscopic methods employed include standoff hyperspectral reflectance imaging, quantitative measurements of diffuse reflectance spectra, reflection-absorption infrared spectroscopy, microscopic imaging and spectroscopy, and nano-scale imaging and spectroscopy. Measurements of explosives particles and residues reveal important factors affecting observed reflectance spectra, including measurement geometry, substrate on which the explosives are deposited, and morphological effects such as particle shape, size, orientation, and crystal structure.

  4. Ratiometric analysis of fura red by flow cytometry: a technique for monitoring intracellular calcium flux in primary cell subsets.

    Directory of Open Access Journals (Sweden)

    Emily R Wendt

    Calcium flux is a rapid and sensitive measure of cell activation whose utility could be enhanced with better techniques for data extraction. We describe a technique to monitor calcium flux by flow cytometry, measuring the Fura Red calcium dye by ratiometric analysis. This technique has several advantages: (1) using a single calcium dye frees an additional channel for surface marker characterization; (2) it allows robust detection of calcium flux by minority cell populations within a heterogeneous population of primary T cells and monocytes; (3) it can measure total calcium flux and, additionally, the proportion of responding cells; (4) it can be applied to studying the effects of drug treatment, simultaneously stimulating and monitoring untreated and drug-treated cells. Using chemokine receptor activation as an example, we highlight the utility of this assay, demonstrating that only cells expressing a specific chemokine receptor are activated by the cognate chemokine ligand. Furthermore, we describe a technique for simultaneously stimulating and monitoring calcium flux in vehicle- and drug-treated cells, demonstrating the effects of the Gαi inhibitor pertussis toxin (PTX) on chemokine-stimulated calcium flux. The described real-time calcium flux assay provides a robust platform for characterizing cell activation within primary cells, and offers a more accurate technique for studying the effect of drug treatment on receptor activation in a heterogeneous population of primary cells.
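
    The "proportion of responding cells" readout can be computed from per-cell ratio traces; a minimal sketch (the two-channel toy traces, baseline length and 1.5-fold threshold are assumptions, not the published gating):

```python
def responding_fraction(ch1_traces, ch2_traces, n_baseline, fold=1.5):
    """Fraction of cells whose ratiometric signal (ch1/ch2) rises above
    `fold` times its own pre-stimulation baseline mean."""
    responders = 0
    for c1, c2 in zip(ch1_traces, ch2_traces):
        ratio = [a / b for a, b in zip(c1, c2)]
        baseline = sum(ratio[:n_baseline]) / n_baseline
        if max(ratio[n_baseline:]) > fold * baseline:
            responders += 1
    return responders / len(ch1_traces)

# Two toy cells: the first fluxes calcium after frame 3, the second does not.
ch1 = [[1.0, 1.0, 1.1, 3.0, 3.2], [1.0, 1.1, 1.0, 1.0, 1.1]]
ch2 = [[1.0, 1.0, 1.0, 1.0, 1.0], [1.0, 1.0, 1.0, 1.0, 1.0]]
frac = responding_fraction(ch1, ch2, n_baseline=3)  # 1 of 2 cells responds
```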

  5. Automated image analysis techniques for cardiovascular magnetic resonance imaging

    NARCIS (Netherlands)

    Geest, Robertus Jacobus van der

    2011-01-01

    The introductory chapter provides an overview of various aspects related to quantitative analysis of cardiovascular MR (CMR) imaging studies. Subsequently, the thesis describes several automated methods for quantitative assessment of left ventricular function from CMR imaging studies. Several novel

  6. Development of a Rapid Soil Water Content Detection Technique Using Active Infrared Thermal Methods for In-Field Applications

    Directory of Open Access Journals (Sweden)

    Federico Pallottino

    2011-10-01

    The aim of this study was to investigate the suitability of active infrared thermography and thermometry, in combination with multivariate statistical partial least squares (PLS) analysis, as rapid soil water content detection techniques, both in the laboratory and in the field. Such techniques allow fast soil water content measurements that are helpful in both agricultural and environmental applications. These techniques, based on the theory of heat dissipation, were tested by directly measuring the dynamic temperature variation of samples after heating. Data on the dynamic temperature variations were collected over three intervals (3, 6 and 10 s). To account for the difference in specific heat between water and soil, the analyses used slopes to linearly describe the temperature trends. For all analyses, the best model was achieved for the 10 s slope. Three different approaches were considered, two in the laboratory and one in the field. The first, laboratory-based, approach centred on active infrared thermography, used the measured temperature variation as the independent variable and reported r = 0.74. The second, also laboratory-based, focused on active infrared thermometry, added irradiation as an independent variable and reported r = 0.76. The in-field experiment was performed by active infrared thermometry, heating bare soil by solar irradiance after exposure due to primary tillage. Some meteorological parameters were included as independent variables in the prediction model, which yielded r = 0.61. In order to obtain more general and wider in-field estimates, a partial least squares discriminant analysis on three classes of percentage soil water content was performed, obtaining a high rate of correct classification in the test (88.89%). The prediction error values were lower in the field than in the laboratory analyses. Both techniques could be used in conjunction with a Geographic Information System for obtaining detailed information

  7. Automated Techniques for Rapid Analysis of Momentum Exchange Devices

    Science.gov (United States)

    2013-12-01

    [Fragmentary record text] "Contiguousness: At this point, it is necessary to introduce the concept of contiguousness. In this thesis, a state space analysis representation is ..." "... the concept of contiguousness was established to ensure that the results of the analysis would allow for the CMGs to reach every state in the defined ..." "... forces at the attachment points of the RWs and CMGs throughout a spacecraft maneuver. Current pedagogy on this topic focuses on the transfer of ..."

  8. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made over the last years in the field of electrophysiological data analysis. Most of the work was done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) ...

  9. Neutron Activation Analysis of Water - A Review

    Science.gov (United States)

    Buchanan, John D.

    1971-01-01

    Recent developments in this field are emphasized. After a brief review of basic principles, topics discussed include sources of neutrons, pre-irradiation physical and chemical treatment of samples, neutron capture and gamma-ray analysis, and selected applications. Applications of neutron activation analysis of water have increased rapidly within the last few years and may be expected to increase in the future.

  10. Evaluation of Wellness Detection Techniques using Complex Activities Association for Smart Home Ambient

    Directory of Open Access Journals (Sweden)

    Farhan Sabir Ujager

    2016-08-01

    Wireless sensor network based smart homes have the potential to meet the growing challenges of independent living for elderly people. However, wellness detection of elderly people in smart homes is still a challenging research domain. Many researchers have proposed techniques for it; however, the majority of these techniques do not provide a comprehensive solution, because complex activities cannot be determined easily and comprehensive wellness is difficult to diagnose. In this study's critical review, it has been observed that strong associations lie among the vital wellness determination parameters. In this paper, after analyzing existing techniques, an association rules based model is proposed for simple and complex (overlapped) activity recognition and a comprehensive wellness detection mechanism. It considers vital wellness detection parameters (temporal association of sub-activity location and sub-activity, time gaps between two adjacent activities, and temporal association of inter- and intra-activities). Activity recognition and wellness detection are performed on the basis of extracted temporal association rules and an expert knowledgebase. A learning component is an important module of the proposed model, accommodating changing trends in the frequent-pattern behavior of an elderly person and recommending that a caregiver/expert adjust the expert knowledgebase according to the abnormalities found.
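
    Extracted association rules of the kind proposed are typically scored by support and confidence; a sketch over toy activity logs (the activity tags and the rule are invented for illustration):

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support and confidence of the rule antecedent -> consequent
    over a list of per-day activity sets."""
    n = len(transactions)
    both = sum(1 for t in transactions if antecedent <= t and consequent <= t)
    ante = sum(1 for t in transactions if antecedent <= t)
    support = both / n
    confidence = both / ante if ante else 0.0
    return support, confidence

# Toy daily logs; each tag abstracts a (location, sub-activity) observation.
days = [{"kitchen", "cooking", "eating"},
        {"kitchen", "cooking", "eating"},
        {"kitchen", "cooking"},
        {"bedroom", "sleeping"}]
sup, conf = rule_metrics(days, {"kitchen", "cooking"}, {"eating"})
```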

  11. Analysis of the changes in keratoplasty indications and preferred techniques.

    Directory of Open Access Journals (Sweden)

    Stefan J Lang

    Recently, novel techniques introduced to the field of corneal surgery, e.g. Descemet membrane endothelial keratoplasty (DMEK) and corneal crosslinking, have extended the therapeutic options; additionally, contact lens fitting has developed new alternatives. We herein investigated whether these techniques have affected the volume and spectrum of indications of keratoplasties, both in a center more specialized in treating Fuchs' dystrophy (center 1) and in a second center more specialized in treating keratoconus (center 2). We retrospectively reviewed the waiting lists for indication, transplantation technique and the patients' travel distances to the hospital at both centers. We reviewed a total of 3778 procedures. Fuchs' dystrophy increased at center 1 from 17% (42) to 44% (150), and from 13% (27) to 23% (62) at center 2. In center 1, DMEK increased from zero percent in 2010 to 51% in 2013. In center 2, DMEK was not performed until 2013. The percentage of patients with keratoconus slightly decreased in center 1, from 15% (36) in 2009 to 12% (40) in 2013; the respective percentages in center 2 were 28% (57) and 19% (51). In both centers, the patients' travel distances increased. The results from center 1 suggest that DMEK might increase the total number of keratoplasties. The increase in travel distance suggests that this cannot be fully attributed to recruiting less advanced patients from the hospital's proximity; the increase is rather due to more referrals from other regions. The decrease of keratoconus patients in both centers is surprising and may be attributed to optimized contact lens fitting or even to the effect of the corneal crosslinking procedure.

  12. Improved Tandem Measurement Techniques for Aerosol Particle Analysis

    Science.gov (United States)

    Rawat, Vivek Kumar

    Non-spherical, chemically inhomogeneous (complex) nanoparticles are encountered in a number of natural and engineered environments, including combustion systems (which produces highly non-spherical aggregates), reactors used in gas-phase materials synthesis of doped or multicomponent materials, and in ambient air. These nanoparticles are often highly diverse in size, composition and shape, and hence require determination of property distribution functions for accurate characterization. This thesis focuses on development of tandem mobility-mass measurement techniques coupled with appropriate data inversion routines to facilitate measurement of two dimensional size-mass distribution functions while correcting for the non-idealities of the instruments. Chapter 1 provides the detailed background and motivation for the studies performed in this thesis. In chapter 2, the development of an inversion routine is described which is employed to determine two dimensional size-mass distribution functions from Differential Mobility Analyzer-Aerosol Particle Mass analyzer tandem measurements. Chapter 3 demonstrates the application of the two dimensional distribution function to compute cumulative mass distribution function and also evaluates the validity of this technique by comparing the calculated total mass concentrations to measured values for a variety of aerosols. In Chapter 4, this tandem measurement technique with the inversion routine is employed to analyze colloidal suspensions. Chapter 5 focuses on application of a transverse modulation ion mobility spectrometer coupled with a mass spectrometer to study the effect of vapor dopants on the mobility shifts of sub 2 nm peptide ion clusters. These mobility shifts are then compared to models based on vapor uptake theories. Finally, in Chapter 6, a conclusion of all the studies performed in this thesis is provided and future avenues of research are discussed.

  13. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  14. Combined Technique Analysis of Punic Make-up Materials

    Energy Technology Data Exchange (ETDEWEB)

    Huq,A.; Stephens, P.; Ayed, N.; Binous, H.; Burgio, L.; Clark, R.; Pantos, E.

    2006-01-01

    Ten archaeological Punic make-up samples from Tunisia dating from the 4th to the 1st centuries BC were analyzed by several techniques including Raman microscopy and synchrotron X-ray diffraction in order to determine their compositions. Eight samples were red and found to contain either quartz and cinnabar or quartz and haematite. The remaining two samples were pink, the main diffracting phase in them being quartz. Examination of these two samples by optical microscopy and by illumination under a UV lamp suggest that the pink dye is madder. These findings reveal the identities of the materials used by Carthaginians for cosmetic and/or ritual make-up purposes.

  15. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...

  16. Design and Performance Analysis of Various Adders and Multipliers Using GDI Technique

    Directory of Open Access Journals (Sweden)

    Simran kaur

    2015-10-01

    Full Text Available With the active development of portable electronic devices, the need for low power dissipation, high speed and compact implementation gives rise to several research directions. There are several design techniques used for circuit configuration in VLSI systems, but very few design techniques give the required extensibility. This paper describes the implementation of various adders and multipliers. The design approach proposed in the article is based on the GDI (Gate Diffusion Input) technique. The paper also includes a comparative analysis of this low-power method against the CMOS design style with respect to power consumption, area complexity and delay. In this paper, new GDI-based cell designs are proposed and are found to be efficient in terms of power consumption and area in comparison with existing CMOS-based cells of the same functionality. Power and delay have been calculated using the Cadence Virtuoso tool at 45nm CMOS technology. The results obtained show better power and delay performance of the proposed designs at 1.3V supply voltage.

  17. Comparative Analysis of Data Mining Techniques for Malaysian Rainfall Prediction

    Directory of Open Access Journals (Sweden)

    Suhaila Zainudin

    2016-12-01

    Full Text Available Climate change prediction analyses the behaviour of weather over a specific period. Rainfall forecasting is a climate prediction task in which specific features such as humidity and wind are used to predict rainfall in specific locations. Rainfall prediction can be framed as a classification task in data mining. Different techniques lead to different performances depending on the rainfall data representation, including representations of long-term (monthly) patterns and short-term (daily) patterns. Selecting an appropriate technique for a specific duration of rainfall is a challenging task. This study analyses multiple classifiers such as Naïve Bayes, Support Vector Machine, Decision Tree, Neural Network and Random Forest for rainfall prediction using Malaysian data. The dataset has been collected from multiple stations in Selangor, Malaysia. Several pre-processing tasks have been applied in order to resolve missing values and eliminate noise. The experimental results show that even with small training data (10% of 1581 instances), Random Forest correctly classified 1043 instances. This is the strength of an ensemble of trees in Random Forest, where a group of classifiers can jointly beat a single classifier.
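
    The "ensemble of trees" idea credited with the result above can be illustrated with a toy bagged ensemble of decision stumps in pure Python; the humidity/wind rows are synthetic, and the real study used full decision trees on Selangor station records.

```python
import random

def train_stump(rows):
    """Fit a one-split decision stump on a random subset of features.
    rows: list of ((f1, f2, ...), label)."""
    best = None
    n_feat = len(rows[0][0])
    for f in random.sample(range(n_feat), max(1, n_feat // 2)):
        for x, _ in rows:
            t = x[f]
            left = [lab for xx, lab in rows if xx[f] <= t]
            right = [lab for xx, lab in rows if xx[f] > t]
            if not left or not right:
                continue
            lmaj = max(set(left), key=left.count)
            rmaj = max(set(right), key=right.count)
            acc = (left.count(lmaj) + right.count(rmaj)) / len(rows)
            if best is None or acc > best[0]:
                best = (acc, f, t, lmaj, rmaj)
    return best

def stump_predict(stump, x):
    _, f, t, lmaj, rmaj = stump
    return lmaj if x[f] <= t else rmaj

def forest(rows, n_trees=25):
    """Bootstrap-sample the training rows for each tree (bagging)."""
    return [train_stump([random.choice(rows) for _ in rows]) for _ in range(n_trees)]

def forest_predict(trees, x):
    votes = [stump_predict(t, x) for t in trees]
    return max(set(votes), key=votes.count)

random.seed(0)
# synthetic weather-like rows: it "rains" when humidity + wind crosses a threshold
data = [((h, w), int(h + 0.5 * w > 0.9))
        for h, w in ((random.random(), random.random()) for _ in range(220))]
train, test = data[:20], data[20:]   # deliberately small training set
trees = forest(train)
accuracy = sum(forest_predict(trees, x) == y for x, y in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

    Each stump sees a bootstrap resample and a random feature subset, so the majority vote tends to generalise better than any single stump trained on the same 20 rows.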

  18. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, and the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper were developed in the process industries, where they have been shown to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  19. Microscopy Techniques for Analysis of Nickel Metal Hydride Batteries Constituents.

    Science.gov (United States)

    Carpenter, Graham J C; Wronski, Zbigniew

    2015-12-01

    With the need for improvements in the performance of rechargeable batteries has come the necessity to better characterize cell electrodes and their component materials. Electron microscopy has been shown to reveal many important features of microstructure that are becoming increasingly important for understanding the behavior of the components during the many charge/discharge cycles required in modern applications. The aim of this paper is to present an overview of how the full suite of techniques available using transmission electron microscopy (TEM) and scanning transmission electron microscopy was applied to the case of materials for the positive electrode in nickel metal hydride rechargeable battery electrodes. Embedding and sectioning of battery-grade powders with an ultramicrotome was used to produce specimens that could be readily characterized by TEM. Complete electrodes were embedded after drying, and also after dehydration from the original wet state, for examination by optical microscopy and using focused ion beam techniques. Results of these studies are summarized to illustrate the significance of the microstructural information obtained.

  20. Techniques of EMG signal analysis: detection, processing, classification and applications

    Science.gov (United States)

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694

  1. Nonlinear techniques for forecasting solar activity directly from its time series

    Science.gov (United States)

    Ashrafi, S.; Roszman, L.; Cooley, J.

    1993-01-01

    This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
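
    The phase-space reconstruction step can be sketched with a simple time-delay embedding and nearest-neighbour analogue forecasting; the noisy sine below is a stand-in for the solar-flux record, and the embedding parameters are illustrative choices, not the paper's.

```python
import numpy as np

def embed(x, dim, tau):
    """Time-delay embedding: each row is a point in the reconstructed phase space."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def predict_next(x, dim=3, tau=2, k=3):
    """Forecast the next value as the mean successor of the k nearest
    phase-space neighbours of the current state (method of analogues)."""
    pts = embed(x, dim, tau)
    current, history = pts[-1], pts[:-1]
    dist = np.linalg.norm(history - current, axis=1)
    idx = np.argsort(dist)[:k]
    succ = idx + (dim - 1) * tau + 1   # index of each neighbour's successor in x
    return x[succ].mean()

# toy "activity" series: a noisy sine standing in for a solar-flux record
t = np.arange(400)
series = np.sin(0.2 * t) + 0.05 * np.random.default_rng(1).standard_normal(400)
forecast = predict_next(series[:-1])
print(f"forecast error: {abs(forecast - series[-1]):.3f}")
```

    Nothing in the predictor refers to the physics generating the series; all structure is taken from the reconstructed attractor, which is the point made in the abstract.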

  2. Finite Element Modeling Techniques for Analysis of VIIP

    Science.gov (United States)

    Feola, Andrew J.; Raykin, J.; Gleason, R.; Mulugeta, Lealem; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.; Ethier, C. Ross

    2015-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build a FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP.

  3. Analysis of active islanding detection methods for grid-connected microinverters for renewable energy processing

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo, C.L. [Grupo de Sistemas Electronicos Industriales del Departamento de Ingenieria Electronica, Universidad Politecnica de Valencia, Camino de Vera S/N, C.P. 46022, Valencia (Spain); Departamento de Ingenieria Electronica, Universidad Distrital Francisco Jose de Caldas, Carrera 7 N 40-53 Piso 5, Bogota (Colombia); Velasco, D.; Figueres, E.; Garcera, G. [Grupo de Sistemas Electronicos Industriales del Departamento de Ingenieria Electronica, Universidad Politecnica de Valencia, Camino de Vera S/N, C.P. 46022, Valencia (Spain)

    2010-11-15

    This paper presents the analysis and comparison of the main active techniques for islanding detection used in grid-connected microinverters for power processing of renewable energy sources. These techniques can be classified into two classes: techniques introducing positive feedback in the control of the inverter and techniques based on harmonics injection. Accurate PSIM trademark simulations have been carried out in order to perform a comparative analysis of the techniques under study and to establish their advantages and disadvantages according to IEEE standards. (author)

  4. Multi-scale statistical analysis of coronal solar activity

    Science.gov (United States)

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-01

    Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.

  5. The Analysis of a Phobic Child: Some Problems of Theory and Technique in Child Analysis.

    Science.gov (United States)

    Bornstein, Berta

    2014-01-01

    This paper attempts to clarify some theoretical and technical aspects of child analysis by correlating the course of treatment, the structure of the neurosis, and the technique employed in the case of a phobic boy who was in analysis over a period of three years. The case was chosen for presentation: (1) because of the discrepancy between the clinical simplicity of the symptom and the complicated ego structure behind it; (2) because of the unusual clearness with which the patient brought to the fore the variegated patterns of his libidinal demands; (3) because of the patient's attempts at transitory solutions, oscillations between perversions and symptoms, and processes of new symptom formation; (4) because the vicissitudes and stabilization of character traits could be clearly traced; (5) and finally, because of the rare opportunity to witness during treatment the change from grappling with reality by means of pathological mechanisms, to dealing with reality in a relatively conflict-free fashion.

  6. Digital methods of photopeak integration in activation analysis.

    Science.gov (United States)

    Baedecker, P. A.

    1971-01-01

    A study of the precision attainable by several methods of gamma-ray photopeak integration has been carried out. The 'total peak area' method, the methods proposed by Covell, Sterlinski, and Quittner, and some modifications of these methods have been considered. A modification by Wasson of the total peak area method is considered to be the most advantageous due to its simplicity and the relatively high precision obtainable with this technique. A computer routine for the analysis of spectral data from nondestructive activation analysis experiments employing a Ge(Li) detector-spectrometer system is described.
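
    A minimal sketch of the total peak area method with linear baseline subtraction follows; the spectrum, peak window and number of background channels are synthetic illustrations, not Baedecker's data.

```python
import numpy as np

def total_peak_area(counts, lo, hi, nbg=3):
    """'Total peak area' estimate: sum the counts in channels [lo, hi] and
    subtract a linear (trapezoidal) baseline taken from nbg channels
    on each side of the peak window."""
    left = counts[lo - nbg : lo].mean()
    right = counts[hi + 1 : hi + 1 + nbg].mean()
    nchan = hi - lo + 1
    baseline = 0.5 * (left + right) * nchan
    return counts[lo : hi + 1].sum() - baseline

# synthetic spectrum: flat background of 50 counts plus a Gaussian photopeak
chan = np.arange(200)
spectrum = 50.0 + 1000.0 * np.exp(-0.5 * ((chan - 100) / 3.0) ** 2)
area = total_peak_area(spectrum, 90, 110)
print(f"net peak area: {area:.0f}")   # close to the true area 1000*3*sqrt(2*pi) ~ 7520
```

    The method's appeal, as the abstract notes, is exactly this simplicity: one window sum and one baseline estimate, with precision governed by counting statistics in both.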

  7. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... of the offeror's cost trends, on the basis of current and historical cost or pricing data; (C... the FAR looseleaf edition), Cost Accounting Standards. (v) Review to determine whether any cost data... required. (2) Price analysis shall be used when certified cost or pricing data are not required...

  8. Sentiment analysis of Arabic tweets using text mining techniques

    Science.gov (United States)

    Al-Horaibi, Lamia; Khan, Muhammad Badruddin

    2016-07-01

    Sentiment analysis has become a flourishing field of text mining and natural language processing. Sentiment analysis aims to determine whether a text is written to express positive, negative, or neutral emotions about a certain domain. Most sentiment analysis researchers focus on English texts, with very limited resources available for other complex languages, such as Arabic. In this study, the target was to develop an initial model that performs satisfactorily and measures Arabic Twitter sentiment using a machine learning approach with the Naïve Bayes and Decision Tree classification algorithms. The dataset used contains more than 2,000 Arabic tweets collected from Twitter. We performed several experiments to check the performance of the two classifiers using different combinations of text-processing functions. We found that available facilities for Arabic text processing need to be built from scratch or improved to develop accurate classifiers. The small functionalities we developed in a Python environment helped improve the results and showed that sentiment analysis in the Arabic domain needs a lot of work on the lexicon side.
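
    A minimal multinomial Naïve Bayes classifier of the kind used in the study can be sketched in pure Python; the toy labelled tweets below are English stand-ins for the study's Arabic data, and the study additionally evaluated Decision Trees.

```python
from collections import Counter
import math

def train_nb(docs):
    """docs: list of (tokens, label). Returns class priors and word counts
    for Laplace-smoothed multinomial Naive Bayes."""
    labels = [lab for _, lab in docs]
    prior = {c: labels.count(c) / len(labels) for c in set(labels)}
    words = {c: Counter() for c in prior}
    for toks, lab in docs:
        words[lab].update(toks)
    vocab = {w for c in words for w in words[c]}
    return prior, words, vocab

def classify(model, tokens):
    prior, words, vocab = model
    def logp(c):
        total = sum(words[c].values())
        return math.log(prior[c]) + sum(
            math.log((words[c][w] + 1) / (total + len(vocab)))  # Laplace smoothing
            for w in tokens if w in vocab)
    return max(prior, key=logp)

# toy English stand-ins for labelled tweets (illustrative, not the study's data)
train = [("great service very happy".split(), "pos"),
         ("love this excellent".split(), "pos"),
         ("terrible experience very sad".split(), "neg"),
         ("hate this awful".split(), "neg")]
model = train_nb(train)
print(classify(model, "happy excellent service".split()))  # -> pos
```

    The lexicon dependence the authors highlight shows up directly here: any token outside the training vocabulary contributes nothing to the decision, which is why Arabic preprocessing quality dominates accuracy.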

  9. Radio & Optical Interferometry: Basic Observing Techniques and Data Analysis

    CERN Document Server

    Monnier, John D

    2012-01-01

    Astronomers usually need the highest angular resolution possible, but the blurring effect of diffraction imposes a fundamental limit on the image quality from any single telescope. Interferometry allows light collected at widely-separated telescopes to be combined in order to synthesize an aperture much larger than an individual telescope thereby improving angular resolution by orders of magnitude. Radio and millimeter wave astronomers depend on interferometry to achieve image quality on par with conventional visible and infrared telescopes. Interferometers at visible and infrared wavelengths extend angular resolution below the milli-arcsecond level to open up unique research areas in imaging stellar surfaces and circumstellar environments. In this chapter the basic principles of interferometry are reviewed with an emphasis on the common features for radio and optical observing. While many techniques are common to interferometers of all wavelengths, crucial differences are identified that will help new practi...

  10. Analysis of ultrasonic techniques for monitoring milk coagulation during cheesemaking

    Science.gov (United States)

    Budelli, E.; Pérez, N.; Lema, P.; Negreira, C.

    2012-12-01

    Experimental determination of time of flight and attenuation has been proposed in the literature as an alternative way of monitoring the evolution of milk coagulation during cheese manufacturing. However, only laboratory-scale procedures have been described. In this work, the use of ultrasonic time of flight and attenuation to determine cutting time, and the feasibility of applying them at industrial scale, were analyzed. Limitations to implementing these techniques at industrial scale are shown experimentally. The main limitation of the time-of-flight approach is its strong dependence on temperature. Attenuation monitoring is affected by a thin layer of milk skin covering the transducer, which modifies the signal in a non-repetitive way. The results of this work can be used to develop alternative ultrasonic systems suitable for application in the dairy industry.
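
    The temperature sensitivity identified as the main limitation can be illustrated numerically. The sound-speed polynomial below is a simplified published fit for pure water (milk behaves similarly but not identically), and the 5 cm acoustic path is a hypothetical vat geometry.

```python
def speed_of_sound_water(temp_c):
    """Approximate speed of sound in pure water in m/s (simplified
    Lubbers & Graaff fit); milk differs somewhat from pure water."""
    return 1405.03 + 4.624 * temp_c - 0.0383 * temp_c ** 2

def time_of_flight(path_m, temp_c):
    return path_m / speed_of_sound_water(temp_c)

path = 0.05   # hypothetical 5 cm acoustic path in the vat
t30 = time_of_flight(path, 30.0)
t31 = time_of_flight(path, 31.0)
drift_pct = (t30 - t31) / t30 * 100
print(f"ToF shift per degC near 30 degC: {drift_pct:.2f}%")
```

    A temperature drift of a fraction of a degree therefore shifts the time of flight by an amount comparable to the change caused by gelation itself, which is why uncontrolled vat temperatures defeat the laboratory procedure.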

  11. An Active Damping Technique for Small DC-Link Capacitor Based Drive System

    DEFF Research Database (Denmark)

    Maheshwari, Ram Krishan; Munk-Nielsen, Stig; Lu, Kaiyuan

    2013-01-01

    A small dc-link capacitor based drive system shows instability when it is operated with large input line inductance at operating points with high power. This paper presents a simple, new active damping technique that can stabilize effectively the drive system at unstable operating points, offering...

  12. Status of the Usage of Active Learning and Teaching Method and Techniques by Social Studies Teachers

    Science.gov (United States)

    Akman, Özkan

    2016-01-01

    The purpose of this study was to determine the active learning and teaching methods and techniques which are employed by the social studies teachers working in state schools of Turkey. This usage status was assessed using different variables. This was a case study, wherein the research was limited to 241 social studies teachers. These teachers…

  13. Regulation on the Appraisal Activities of The Products,Techniques and Applications Projects of BIRTV

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    I General Provisions 1. The Appraisal Activities of the Products, Techniques and Applications Projects of BIRTV are held for the purpose of strengthening the technical advisory work and pilot role, and providing positive guidance and effective assistance in the field of new technology

  14. Determination of Volatile Organic Compounds in the Atmosphere Using Two Complementary Analysis Techniques.

    Science.gov (United States)

    Alonso, L; Durana, N; Navazo, M; García, J A; Ilardia, J L

    1999-08-01

    During a preliminary field campaign of volatile organic compound (VOC) measurements carried out in an urban area, two complementary analysis techniques were applied to establish the technical and scientific bases for a strategy to monitor and control VOCs and photochemical oxidants in the Autonomous Community of the Basque Country. Integrated sampling was conducted using Tenax sorbent tubes with laboratory analysis by gas chromatography, and grab sampling with in situ analysis was also conducted using a portable gas chromatograph. With the first technique, monocyclic aromatic hydrocarbons appeared as the compounds with the highest mean concentrations. The second technique allowed the systematic analysis of eight chlorinated and aromatic hydrocarbons. Results of comparing both techniques, as well as the additional information obtained with the second technique, are included.

  15. Transient analysis techniques in performing impact and crash dynamic studies

    Science.gov (United States)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that evolved from the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  16. Biomechanical analysis technique of choreographic movements (for example, "grand battman jete")

    Directory of Open Access Journals (Sweden)

    Batieieva N.P.

    2015-04-01

    Full Text Available Purpose: biomechanical analysis of the execution of the choreographic movement "grand battman jete". Material: the study involved students (n = 7) of the department of classical choreography, faculty of choreography. Results: biomechanical analysis of the choreographic movement "grand battman jete" (a classic exercise) yielded the kinematic characteristics (path, velocity, acceleration, force) of the centre of mass (CM) of the performer's body segments (foot, shin, thigh). A biokinematic (phase) model was built. The energy characteristics, mechanical work and kinetic energy, of the leg segments when performing the choreographic movement "grand battman jete" were obtained. Conclusions: it was found that the ability of an athlete and coach-choreographer to analyze the biomechanics of movement has a positive effect on the improvement of choreographic training of qualified athletes in gymnastics (sport, art), figure skating and dance sports.

  17. Elemental analysis of silver coins by PIXE technique

    Energy Technology Data Exchange (ETDEWEB)

    Tripathy, B.B. [Department of Physics, Silicon Institute of Technology, Patia, Bhubaneswar 751 024 (India); Rautray, Tapash R. [Department of Dental Biomaterials, School of Dentistry, Kyungpook National University, 2-188-1 Samduk -dong, Jung-gu, Daegu 700 412 (Korea, Republic of); ARASMIN, G. Udayagiri, Kandhamal, Orissa 762 100 (India)], E-mail: tapash.rautray@gmail.com; Rautray, A.C. [ARASMIN, G. Udayagiri, Kandhamal, Orissa 762 100 (India); Vijayan, V. [Praveen Institute of Radiation Technology, Flat No. 9A, Avvai Street, New Perungalathur, Chennai 600 063 (India)

    2010-03-15

    Elemental analysis of nine Indian silver coins during British rule was carried out by proton induced X-ray emission spectroscopy. Eight elements, namely Cr, Fe, Ni, Cu, Zn, As, Ag, and Pb were determined in the present study. Ag and Cu were found to be the major elements, Zn was the only minor element and all other elements are present at the trace level. The variation of the elemental concentration may be due to the use of different ores for making coins.

  18. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  19. Modelling, analysis and validation of microwave techniques for the characterisation of metallic nanoparticles

    Science.gov (United States)

    Sulaimalebbe, Aslam

    In the last decade, the study of nanoparticle (NP) systems has become a large and interesting research area due to their novel properties and functionalities, which differ from those of the bulk materials, and also their potential applications in different fields. It is vital to understand the behaviour and properties of nano-materials with the aim of implementing nanotechnology, controlling their behaviour and designing new material systems with superior performance. Physical characterisation of NPs falls into two main categories, property and structure analysis, where the properties of the NPs cannot be studied without knowledge of their size and structure. The direct measurement of the electrical properties of metal NPs presents a key challenge and necessitates the use of innovative experimental techniques. There have been numerous reports of two/four-point resistance measurements of NP films, and of electrical conductivity measurements of NP films using the interdigitated microarray (IDA) electrode. However, microwave techniques such as the open-ended coaxial probe (OCP) and the microwave dielectric resonator (DR) are much more accurate and effective for electrical characterisation of metallic NPs than the traditional techniques, because they are inexpensive, convenient, non-destructive, contactless, hazardless (i.e. at low power) and require no special sample preparation. This research is the first attempt to determine the microwave properties of Pt and Au NP films, which are appealing materials for nano-scale electronics, using the aforementioned microwave techniques. The ease of synthesis, relatively low cost, unique catalytic activities and control over size and shape were the main considerations in choosing Pt and Au NPs for the present study.
The initial phase of this research was to implement and validate the aperture admittance model for the OCP measurement through experiments and 3D full wave simulation using the commercially available Ansoft

  20. NOS/NGS activities to support development of radio interferometric surveying techniques

    Science.gov (United States)

    Carter, W. E.; Dracup, J. F.; Hothem, L. D.; Robertson, D. S.; Strange, W. E.

    1980-01-01

    National Geodetic Survey activities towards the development of operational geodetic survey systems based on radio interferometry are reviewed. Information about the field procedures, data reduction and analysis, and the results obtained to date is presented.

  1. Skills and Vacancy Analysis with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Izabela A. Wowczko

    2015-11-01

    Full Text Available Through recognizing the importance of a qualified workforce, skills research has become one of the focal points in economics, sociology, and education. Great effort is dedicated to analyzing labor demand and supply, and actions are taken at many levels to match one with the other. In this work we concentrate on skills needs, a dynamic variable dependent on many aspects such as geography, time, or the type of industry. Historically, skills in demand were easy to evaluate since transitions in that area were fairly slow, gradual, and easy to adjust to. In contrast, current changes are occurring rapidly and might take an unexpected turn. Therefore, we introduce a relatively simple yet effective method of monitoring skills needs straight from the source—as expressed by potential employers in their job advertisements. We employ open source tools such as RapidMiner and R as well as easily accessible online vacancy data. We demonstrate selected techniques, namely classification with k-NN and information extraction from a textual dataset, to determine effective ways of discovering knowledge from a given collection of vacancies.
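
    The classification step named above (k-NN over vacancy text) can be sketched in Python with a bag-of-words cosine similarity; the study itself used RapidMiner and R, and the vacancy snippets and skill labels below are hypothetical.

```python
import math
from collections import Counter

def vectorize(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_classify(labelled, text, k=3):
    """Majority vote over the k labelled vacancies most similar to `text`."""
    query = vectorize(text)
    ranked = sorted(labelled, key=lambda ad: cosine(vectorize(ad[0]), query),
                    reverse=True)[:k]
    votes = [label for _, label in ranked]
    return max(set(votes), key=votes.count)

# hypothetical vacancy snippets labelled by skill area
ads = [("java spring backend developer", "software"),
       ("python data analysis sql", "software"),
       ("javascript react frontend", "software"),
       ("nurse patient care hospital ward", "healthcare"),
       ("clinical nurse icu patient", "healthcare"),
       ("care assistant hospital", "healthcare")]
print(knn_classify(ads, "senior python backend developer sql"))
```

    Because the features are the advertisement's own words, new skills appearing in the market enter the model as soon as they appear in vacancies, which is the monitoring property the paper is after.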

  2. Manure management and greenhouse gas mitigation techniques : a comparative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Langmead, C.

    2003-09-03

    Alberta is the second largest agricultural producer in Canada, ranking just behind Ontario. Approximately 62 per cent of the province's farm cash receipts are attributable to the livestock industry. Farmers today maintain large numbers of a single animal type. The drivers for more advanced manure management systems include: the trend towards confined feeding operations (CFO) is creating large, concentrated quantities of manure; public perception of CFO; implementation of provincial legislation regulating the expansion and construction of CFO; ratification of the Kyoto Protocol raised interest in the development of improved manure management systems capable of reducing greenhouse gas (GHG) emissions; and rising energy costs. The highest methane emissions factors are found with liquid manure management systems. They contribute more than 80 per cent of the total methane emissions from livestock manure in Alberta. The author identified and analyzed three manure management techniques to mitigate GHG emissions. They were: bio-digesters, gasification systems, and composting. Three recommendations were made to establish a strategy to support emissions offsets and maximize the reduction of methane emissions from the livestock industry. The implementation of bio-digesters, especially for the swine industry, was recommended. It was suggested that a gasification pilot project for poultry manure should be pursued by Climate Change Central. Public outreach programs promoting composting of cattle manure for beef feedlots and older style dairy barns should also be established. 19 refs., 11 tabs., 3 figs.

  3. Chromatographic finger print analysis of Naringi crenulata by HPTLC technique

    Institute of Scientific and Technical Information of China (English)

    Subramanian Sampathkumar; Ramakrishnan N

    2011-01-01

    Objective: To establish the fingerprint profile of Naringi crenulata (N. crenulata) (Roxb.) Nicols. using the high performance thin layer chromatography (HPTLC) technique. Methods: Preliminary phytochemical screening was done and HPTLC studies were carried out. A CAMAG HPTLC system equipped with a Linomat V applicator, TLC scanner 3, Reprostar 3 and WIN CATS-4 software was used. Results: The results of preliminary phytochemical studies confirmed the presence of protein, lipid, carbohydrate, reducing sugar, phenol, tannin, flavonoid, saponin, triterpenoid, alkaloid, anthraquinone and quinone. HPTLC fingerprinting of the ethanolic extract of stem revealed 10 spots with Rf values in the range of 0.08 to 0.65; bark showed 8 peaks with Rf values in the range of 0.07 to 0.63; and the ethanol extract of leaf revealed 8 peaks with Rf values in the range of 0.09 to 0.49. The purity of the sample was confirmed by comparing the absorption spectra at the start, middle and end position of the band. Conclusions: It can be concluded that HPTLC fingerprinting of N. crenulata may be useful in differentiating the species from adulterants and may act as a biochemical marker for this medicinally important plant in the pharmaceutical industry and plant systematic studies.

  4. Comparative Analysis of Automatic Vehicle Classification Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Kanwal Yousaf

    2012-09-01

    Full Text Available Vehicle classification has emerged as a significant field of study because of its importance in a variety of applications such as surveillance, security systems, traffic congestion avoidance and accident prevention. So far, numerous algorithms have been implemented for classifying vehicles, each following a different procedure for detecting vehicles in videos. By evaluating some of the commonly used techniques, we highlight the most beneficial methodology for classifying vehicles. In this paper we describe the working of several video-based vehicle classification algorithms and compare them on the basis of different performance metrics such as classifiers, classification methodology or principles, and vehicle detection ratio. After comparing these parameters we conclude that the Hybrid Dynamic Bayesian Network (HDBN) classification algorithm is far better than the other algorithms due to its way of estimating the simplest features of vehicles from different videos. HDBN detects vehicles by following the important stages of feature extraction, selection and classification. It extracts the rear-view information of vehicles rather than other information such as the distance between the wheels and the height of the wheels.

  5. Preliminary Analysis of ULPC Light Curves Using Fourier Decomposition Technique

    CERN Document Server

    Ngeow, Chow-Choong; Kanbur, Shashi; Barrett, Brittany; Lin, Bin

    2013-01-01

    Recent work on Ultra Long Period Cepheids (ULPCs) has suggested their usefulness as distance indicators, but has not commented on their relationship to other types of variable stars. In this work, we use Fourier analysis to quantify the structure of ULPC light curves and compare them to Classical Cepheids and Mira variables. Our preliminary results suggest that the low-order Fourier parameters of ULPCs follow a continuation of the trend defined by Classical Cepheids after the resonance around 10 days. However, their Fourier parameters also overlap with those of Miras, which makes the classification of long-period variable stars difficult based on light-curve information alone.
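
    The low-order Fourier parameters referred to above (amplitude ratios R_k1 and phase differences phi_k1) can be obtained from a least-squares fit of a truncated Fourier series; the light curve below is synthetic with a known second harmonic, not ULPC data, and the phase convention is one common choice.

```python
import numpy as np

def fourier_params(phase, mag, order=4):
    """Least-squares fit of mag(p) = A0 + sum_k [a_k cos(2*pi*k*p) + b_k sin(2*pi*k*p)];
    returns amplitude ratios R_k1 = A_k/A_1 and phase differences
    phi_k1 = phi_k - k*phi_1 (one common convention)."""
    cols = [np.ones_like(phase)]
    for k in range(1, order + 1):
        cols += [np.cos(2 * np.pi * k * phase), np.sin(2 * np.pi * k * phase)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), mag, rcond=None)
    amp = np.hypot(coef[1::2], coef[2::2])        # harmonic amplitudes A_k
    phi = np.arctan2(-coef[2::2], coef[1::2])     # harmonic phases phi_k
    R = amp / amp[0]
    dphi = (phi - np.arange(1, order + 1) * phi[0]) % (2 * np.pi)
    return R, dphi

# synthetic light curve with a known second harmonic (not ULPC data)
rng = np.random.default_rng(0)
phase = rng.uniform(0.0, 1.0, 300)
mag = 10 + 0.5 * np.cos(2 * np.pi * phase) + 0.15 * np.cos(4 * np.pi * phase + 1.0)
R, dphi = fourier_params(phase, mag)
print(f"R21 = {R[1]:.2f}, phi21 = {dphi[1]:.2f}")  # recovers 0.30 and 1.00
```

    It is the low-order pairs (R21, phi21) and (R31, phi31) computed this way that are compared across ULPCs, Classical Cepheids and Miras in the abstract.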

  6. [THE COMPARATIVE ANALYSIS OF TECHNIQUES OF IDENTIFICATION OF CORYNEBACTERIUM NON DIPHTHERIAE].

    Science.gov (United States)

    Kharseeva, G G; Voronina, N A; Mironov, A Yu; Alutina, E L

    2015-12-01

    The comparative analysis was carried out concerning the effectiveness of three techniques of identification of Corynebacterium non diphtheriae: bacteriological, molecular genetic (16S rRNA sequencing) and mass-spectrometric (MALDI-ToF MS). The analysis covered 49 strains of Corynebacterium non diphtheriae (C. pseudodiphtheriticum, C. amycolatum, C. propinquum, C. falsenii) and 2 strains of Corynebacterium diphtheriae isolated under various pathologies from the urogenital tract and upper respiratory ways. The corynebacteria were identified using the bacteriological technique, 16S rRNA sequencing and the mass-spectrometric technique (MALDI-ToF MS). Full concordance of the results of species identification was observed in 26 (51%) of the strains of Corynebacterium non diphtheriae when all three analysis techniques were used; in 43 (84.3%) strains when the bacteriological technique was compared with 16S rRNA sequencing; and in 29 (57%) when mass-spectrometric analysis was compared with 16S rRNA sequencing. The bacteriological technique is effective for identification of Corynebacterium diphtheriae. For precise establishment of the species of corynebacteria with variable biochemical characteristics, the molecular genetic technique of analysis is to be applied. The mass-spectrometric technique (MALDI-ToF MS) requires further updating of its databases for identifying a larger spectrum of representatives of the genus Corynebacterium.

  7. Performance Evaluation a Developed Energy Harvesting Interface Circuit in Active Technique

    Directory of Open Access Journals (Sweden)

    Ramizi Mohamed

    2014-10-01

    Full Text Available This study presents the performance evaluation of a developed energy harvesting interface circuit using the active technique. An energy harvesting interface circuit for micro-power applications, driven by the equivalent voltage of piezoelectric materials, has been developed and simulated. Circuit designs and simulation results are presented for a conventional diode rectifier with a voltage doubler in the passive technique. Most existing techniques are mainly passive-based energy harvesting circuits, and the power harvesting capability of the passive technique is generally very low. To increase the harvested energy, the active technique and its components, such as the MOSFET, thyristor and transistor, were chosen for the proposed energy harvesting interface circuit. In this study, both the conventional passive circuit and the developed active harvester were simulated. The developed interface circuit consists of a piezoelectric element with a vibration input source, an AC-DC thyristor doubler rectifier circuit, and a DC-DC boost converter using a thyristor with a storage device. In the developed circuit, thyristors were chosen in place of the diodes used in conventional circuits, because the diode forward voltage (0.7 V) is higher than the incoming input voltage (0.2 V). Finally, the complete energy harvester was designed and simulated using PSPICE software. The proposed circuit generates a boosted DC voltage of up to 2 V. The overall simulated efficiency of the developed circuit is 70%, which is greater than the conventional circuit efficiency of 20%. It is concluded that the developed circuit output voltage can be used to power autonomous devices.

  8. Structural Analysis of Composite Laminates using Analytical and Numerical Techniques

    Directory of Open Access Journals (Sweden)

    Sanghi Divya

    2016-01-01

    Full Text Available A laminated composite material consists of different layers of matrix and fibres. Its properties can vary greatly with each layer's or ply's orientation, material properties and the number of layers itself. The present paper focuses on a novel approach of using an analytical method to arrive at a preliminary ply layup order of a composite laminate, which acts as feeder data for the further detailed analysis done with FEA tools. The equations used in our MATLAB code are based on analytical study and supply results that are, with a high degree of probability, remarkably close to the final optimized layup found through extensive FEA analysis. This reduces computing time significantly and saves considerable FEA processing, obtaining efficient results quickly. The result output by our method also provides the user with conditions that predict the successive failure sequence of the composite plies, an output option that is not available even in popular FEM tools. The predicted results are further verified by testing the laminates in the laboratory, and the results are found to be in good agreement.
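
The paper's MATLAB code is not reproduced in the abstract. As a hedged illustration of the classical-lamination-theory step such an analytical approach rests on, the following sketch computes the extensional stiffness matrix A of a candidate layup from ply properties; the material values and layup are hypothetical, not the authors' data.

```python
import numpy as np

def q_bar(E1, E2, G12, nu12, theta_deg):
    """Transformed reduced stiffness matrix of one ply (plane stress CLT)."""
    nu21 = nu12 * E2 / E1
    d = 1 - nu12 * nu21
    Q11, Q22, Q12, Q66 = E1 / d, E2 / d, nu12 * E2 / d, G12
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    Qb = np.empty((3, 3))
    Qb[0, 0] = Q11 * c**4 + 2 * (Q12 + 2 * Q66) * s**2 * c**2 + Q22 * s**4
    Qb[1, 1] = Q11 * s**4 + 2 * (Q12 + 2 * Q66) * s**2 * c**2 + Q22 * c**4
    Qb[0, 1] = Qb[1, 0] = (Q11 + Q22 - 4 * Q66) * s**2 * c**2 + Q12 * (s**4 + c**4)
    Qb[2, 2] = (Q11 + Q22 - 2 * Q12 - 2 * Q66) * s**2 * c**2 + Q66 * (s**4 + c**4)
    Qb[0, 2] = Qb[2, 0] = (Q11 - Q12 - 2 * Q66) * s * c**3 + (Q12 - Q22 + 2 * Q66) * s**3 * c
    Qb[1, 2] = Qb[2, 1] = (Q11 - Q12 - 2 * Q66) * s**3 * c + (Q12 - Q22 + 2 * Q66) * s * c**3
    return Qb

def a_matrix(layup_deg, t_ply, E1, E2, G12, nu12):
    """Extensional stiffness A = sum over plies of Qbar_k * t_k."""
    return sum(q_bar(E1, E2, G12, nu12, th) * t_ply for th in layup_deg)

# Hypothetical carbon/epoxy ply, quasi-isotropic [0/45/-45/90]s layup.
layup = [0, 45, -45, 90, 90, -45, 45, 0]
A = a_matrix(layup, 0.125e-3, E1=140e9, E2=10e9, G12=5e9, nu12=0.3)
```

A quasi-isotropic layup such as this one gives A11 = A22 and vanishing shear-extension coupling terms A16 and A26, which is a convenient sanity check on any layup-search code.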

  9. Analysis of compressive fracture in rock using statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.
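
The clusters of cracked sites mentioned above can be analyzed with standard percolation tools. A minimal sketch (not the author's field-theory model) labels 4-connected clusters of cracked sites on a 2-D lattice and tests for a top-to-bottom spanning cluster; the lattice size and crack probability are illustrative.

```python
import numpy as np
from collections import deque

def cracked_clusters(cracked):
    """Label 4-connected clusters of cracked sites on a 2-D lattice.
    Returns the label array and a flag for a top-to-bottom spanning cluster."""
    n, m = cracked.shape
    labels = np.zeros((n, m), dtype=int)
    current = 0
    for i in range(n):
        for j in range(m):
            if cracked[i, j] and labels[i, j] == 0:
                current += 1                      # start a new cluster
                labels[i, j] = current
                queue = deque([(i, j)])
                while queue:                      # breadth-first flood fill
                    a, b = queue.popleft()
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        x, y = a + da, b + db
                        if 0 <= x < n and 0 <= y < m and cracked[x, y] and labels[x, y] == 0:
                            labels[x, y] = current
                            queue.append((x, y))
    top = set(labels[0][labels[0] > 0])
    bottom = set(labels[-1][labels[-1] > 0])
    return labels, bool(top & bottom)

# Illustrative random lattice above the site-percolation threshold (~0.593).
rng = np.random.default_rng(1)
lattice = rng.random((50, 50)) < 0.65
labels, spans = cracked_clusters(lattice)
```

In a fracture simulation of this kind, the appearance of a spanning cluster of cracked sites is the percolation-theory analogue of a through-going fracture surface.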

  10. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    Science.gov (United States)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who already have mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly-learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at

  11. Multidimensional Analysis of Quenching: Comparison of Inverse Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, K.J.

    1998-11-18

    Understanding the surface heat transfer during quenching can be beneficial. Analysis to estimate the surface heat transfer from internal temperature measurements is referred to as the inverse heat conduction problem (IHCP). Function specification and gradient adjoint methods, which use a gradient search method coupled with an adjoint operator, are widely used methods to solve the IHCP. In this paper the two methods are presented for the multidimensional case. The focus is not a rigorous comparison of numerical results. Instead, after formulating the multidimensional solutions, issues associated with the numerical implementation and practical application of the methods are discussed. In addition, an experiment that measured the surface heat flux and temperatures for a transient experimental case is analyzed. Transient temperatures are used to estimate the surface heat flux, which is compared to the measured values. The estimated surface fluxes are comparable for the two methods.
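
Neither method's implementation is given in the abstract. As a hedged illustration of the underlying linear IHCP, the following sketch estimates an unknown surface heat flux from a simulated interior temperature sensor by whole-domain Tikhonov-regularized least squares (a simpler relative of function specification); the 1-D finite-difference model and all parameter values are invented for illustration.

```python
import numpy as np

def forward(q, nt, nx=20, dx=0.1, dt=0.01, alpha=0.1):
    """Explicit 1-D conduction with unknown surface flux q[t] at x=0 and an
    insulated back face; returns the near-surface sensor history (rho*c = 1)."""
    r = alpha * dt / dx**2                    # explicit scheme needs r <= 0.5
    T = np.zeros(nx)
    hist = np.empty(nt)
    for t in range(nt):
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
        Tn[0] = T[0] + 2 * r * (T[1] - T[0]) + 2 * dt * q[t] / dx  # flux BC
        Tn[-1] = T[-1] + 2 * r * (T[-2] - T[-1])                   # insulated
        T = Tn
        hist[t] = T[1]                        # sensor one node below surface
    return hist

nt = 60
# Sensitivity matrix: column j is the sensor response to a unit flux pulse.
X = np.column_stack([forward(np.eye(nt)[j], nt) for j in range(nt)])

q_true = np.sin(np.linspace(0.0, np.pi, nt)) ** 2      # assumed "true" flux
rng = np.random.default_rng(2)
T_meas = X @ q_true + rng.normal(0.0, 1e-4, nt)        # noisy sensor data

lam = 1e-5                                             # Tikhonov parameter
q_est = np.linalg.solve(X.T @ X + lam * np.eye(nt), X.T @ T_meas)
```

Because conduction smooths and delays the surface signal, the inversion is ill-conditioned; the regularization parameter plays the same stabilizing role that the number of future time steps plays in the function specification method.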

  12. Nonlinear systems techniques for dynamical analysis and control

    CERN Document Server

    Lefeber, Erjen; Arteaga, Ines

    2017-01-01

    This treatment of modern topics related to the control of nonlinear systems is a collection of contributions celebrating the work of Professor Henk Nijmeijer and honoring his 60th birthday. It addresses several topics that have been the core of Professor Nijmeijer’s work, namely: the control of nonlinear systems, geometric control theory, synchronization, coordinated control, convergent systems and the control of underactuated systems. The book presents recent advances in these areas, contributed by leading international researchers in systems and control. In addition to the theoretical questions treated in the text, particular attention is paid to a number of applications including (mobile) robotics, marine vehicles, neural dynamics and mechanical systems generally. This volume provides a broad picture of the analysis and control of nonlinear systems for scientists and engineers with an interest in the interdisciplinary field of systems and control theory. The reader will benefit from the expert participan...

  13. A Generalized Lanczos-QR Technique for Structural Analysis

    DEFF Research Database (Denmark)

    Vissing, S.

    Within the field of solid mechanics, such as structural dynamics and linearized as well as non-linear stability, the eigenvalue problem plays an important role. In the class of finite element and finite difference discretized problems these engineering problems are characterized by large matrix systems with very special properties. Due to the finite discretization the matrices are sparse, and a relatively large number of problems also has real and symmetric matrices. The matrix equation for an undamped vibration contains two matrices describing tangent stiffness and mass distributions. Alternatively, in a stability analysis, tangent stiffness and geometric stiffness matrices are introduced into an eigenvalue problem used to determine possible bifurcation points. The common basis for these types of problems is that the matrix equation describing the problem contains two real, symmetric matrices.
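
The two-matrix problem described above is the generalized symmetric eigenvalue problem K x = λ M x. A Lanczos-QR solver is beyond a snippet, but a minimal dense sketch (assuming M is positive definite) reduces it to standard form via the Cholesky factor of M:

```python
import numpy as np

def generalized_eig(K, M):
    """Solve K x = lambda M x for symmetric K and s.p.d. M by reduction
    to a standard symmetric problem via the Cholesky factor of M."""
    L = np.linalg.cholesky(M)          # M = L L^T
    Linv = np.linalg.inv(L)
    A = Linv @ K @ Linv.T              # standard symmetric problem A y = lambda y
    lam, Y = np.linalg.eigh(A)
    X = Linv.T @ Y                     # back-transform eigenvectors x = L^-T y
    return lam, X

# Two-DOF spring-mass example: natural frequencies are omega = sqrt(lambda).
K = np.array([[2.0, -1.0], [-1.0, 1.0]])   # tangent stiffness
M = np.array([[2.0, 0.0], [0.0, 1.0]])     # (lumped) mass
lam, X = generalized_eig(K, M)
```

For large sparse systems one would keep the matrices unfactored and use a shift-invert Lanczos iteration instead; the dense reduction above only illustrates the structure of the problem.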

  14. Denial of Service Attack Techniques: Analysis, Implementation and Comparison

    Directory of Open Access Journals (Sweden)

    Khaled Elleithy

    2005-02-01

    Full Text Available A denial of service attack (DOS is any type of attack on a networking structure to disable a server from servicing its clients. Attacks range from sending millions of requests to a server in an attempt to slow it down, flooding a server with large packets of invalid data, to sending requests with an invalid or spoofed IP address. In this paper we show the implementation and analysis of three main types of attack: Ping of Death, TCP SYN Flood, and Distributed DOS. The Ping of Death attack will be simulated against a Microsoft Windows 95 computer. The TCP SYN Flood attack will be simulated against a Microsoft Windows 2000 IIS FTP Server. Distributed DOS will be demonstrated by simulating a distribution zombie program that will carry the Ping of Death attack. This paper will demonstrate the potential damage from DOS attacks and analyze the ramifications of the damage.

  15. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The method developed by the author, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is 'on the table', how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant.

  16. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information for the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  17. Connectomic analysis of brain networks: novel techniques and future directions

    Directory of Open Access Journals (Sweden)

    Leonie Cazemier

    2016-11-01

    Full Text Available Brain networks, localized or brain-wide, exist only at the cellular level, i.e. between specific pre- and postsynaptic neurons, which are connected through functionally diverse synapses located at specific points of their cell membranes. Connectomics is the emerging subfield of neuroanatomy explicitly aimed at elucidating the wiring of brain networks with cellular resolution and quantified accuracy. Such data are indispensable for realistic modeling of brain circuitry and function. A connectomic analysis therefore needs to identify and measure the soma, dendrites, axonal path and branching patterns, together with the synapses and gap junctions, of the neurons involved in any given brain circuit or network. However, because of the submicron caliber, 3D complexity and high packing density of most such structures, as well as the fact that axons frequently extend over long distances to make synapses in remote brain regions, creating connectomic maps is technically challenging and requires multi-scale approaches. Such approaches involve the combination of the most sensitive cell labeling and analysis methods available, as well as the development of new ones able to resolve individual cells and synapses with increasingly high throughput. In this review, we provide an overview of recently introduced high-resolution methods which researchers wanting to enter the field of connectomics may consider. It includes several molecular labeling tools, some of which specifically label synapses, and covers a number of novel imaging tools such as brain clearing protocols and microscopy approaches. Apart from describing the tools, we also provide an assessment of their qualities. The criteria we use assess the qualities that tools need in order to contribute to deciphering the key levels of circuit organization.
We conclude with a brief future outlook for neuroanatomic research, computational methods and network modeling, where we also point out several outstanding

  18. Multifractal detrended fluctuation analysis of human EEG: preliminary investigation and comparison with the wavelet transform modulus maxima technique.

    Directory of Open Access Journals (Sweden)

    Todd Zorick

    Full Text Available Recently, many lines of investigation in neuroscience and statistical physics have converged to raise the hypothesis that the underlying pattern of neuronal activation which results in electroencephalography (EEG) signals is nonlinear, with self-affine dynamics, while scalp-recorded EEG signals themselves are nonstationary. Therefore, traditional methods of EEG analysis may miss many properties inherent in such signals. Similarly, fractal analysis of EEG signals has shown scaling behaviors that may not be consistent with pure monofractal processes. In this study, we hypothesized that scalp-recorded human EEG signals may be better modeled as an underlying multifractal process. We utilized Physionet, a publicly available online database of human EEG signals, as a standardized reference database for this study. Herein, we report the use of multifractal detrended fluctuation analysis on human EEG signals derived from waking and different sleep stages, and show evidence that supports the use of multifractal methods. Next, we compare multifractal detrended fluctuation analysis to a previously published multifractal technique, wavelet transform modulus maxima, using EEG signals from waking and sleep, and demonstrate that multifractal detrended fluctuation analysis has lower indices of variability. Finally, we report a preliminary investigation into the use of multifractal detrended fluctuation analysis as a pattern classification technique on human EEG signals from waking and different sleep stages, and demonstrate its potential utility for automatic classification of different states of consciousness. Therefore, multifractal detrended fluctuation analysis may be a useful pattern classification technique to distinguish among different states of brain function.
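
A compact sketch of multifractal detrended fluctuation analysis, assuming the standard MFDFA formulation with linear detrending (not the authors' code): the signal's profile is divided into windows of scale s, each window is detrended, and the q-th order fluctuation function is fit against scale on a log-log plot. For monofractal white noise every generalized Hurst exponent h(q) should sit near 0.5.

```python
import numpy as np

def mfdfa(x, scales, qs):
    """Return generalized Hurst exponents h(q) from linear-detrended
    fluctuation functions F_q(s) (forward segments only, for brevity)."""
    profile = np.cumsum(x - np.mean(x))
    logF = np.empty((len(qs), len(scales)))
    for si, s in enumerate(scales):
        nseg = len(profile) // s
        segs = profile[:nseg * s].reshape(nseg, s)
        t = np.arange(s)
        f2 = np.empty(nseg)
        for k in range(nseg):
            c = np.polyfit(t, segs[k], 1)                 # local linear trend
            f2[k] = np.mean((segs[k] - np.polyval(c, t)) ** 2)
        for qi, q in enumerate(qs):
            if q == 0:                                    # logarithmic average
                logF[qi, si] = 0.5 * np.mean(np.log(f2))
            else:
                logF[qi, si] = np.log(np.mean(f2 ** (q / 2))) / q
    # h(q) is the slope of log F_q(s) versus log s
    return np.array([np.polyfit(np.log(scales), logF[qi], 1)[0]
                     for qi in range(len(qs))])

rng = np.random.default_rng(3)
noise = rng.normal(size=8192)                             # monofractal test signal
hq = mfdfa(noise, scales=[16, 32, 64, 128, 256], qs=[-2, 0, 2])
```

A genuinely multifractal signal would instead show h(q) decreasing with q; the spread of h(q) over the q range is one of the "indices of variability" one can compare across methods.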

  19. Refolding techniques for recovering biologically active recombinant proteins from inclusion bodies.

    Science.gov (United States)

    Yamaguchi, Hiroshi; Miyazaki, Masaya

    2014-02-20

    Biologically active proteins are useful for studying the biological functions of genes and for the development of therapeutic drugs and biomaterials in the biotechnology industry. Overexpression of recombinant proteins in bacteria, such as Escherichia coli, often results in the formation of inclusion bodies, which are protein aggregates with non-native conformations. As inclusion bodies contain relatively pure and intact proteins, protein refolding is an important process for obtaining active recombinant proteins from inclusion bodies. However, conventional refolding methods, such as dialysis and dilution, are time consuming; recovered yields of active proteins are often low; and a trial-and-error process is required to achieve success. Recently, several approaches have been reported for refolding these aggregated proteins into an active form. The strategies largely aim at reducing protein aggregation during the refolding procedure. This review focuses on protein refolding techniques using chemical additives and laminar flow in microfluidic chips for the efficient recovery of active proteins from inclusion bodies.

  20. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    Energy Technology Data Exchange (ETDEWEB)

    Stankunas, Gediminas, E-mail: gediminas.stankunas@lei.lt [Lithuanian Energy Institute, Laboratory of Nuclear Installation Safety, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Batistoni, Paola [ENEA, Via E. Fermi, 45, 00044 Frascati, Rome (Italy); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Sjöstrand, Henrik; Conroy, Sean [Department of Physics and Astronomy, Uppsala University, PO Box 516, SE-75120 Uppsala (Sweden); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-07-11

    The neutron activation technique is routinely used in fusion experiments to measure neutron yields. This paper investigates the uncertainty in these measurements due to the uncertainties in the dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, both for DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of the fusion neutron yields.
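
The folding of a group-wise neutron spectrum with activation cross-sections, and first-order propagation of the cross-section covariance, can be sketched as follows. The 3-group numbers are invented for illustration and are not IRDFF or JET data.

```python
import numpy as np

def reaction_rate(flux, sigma, cov_sigma):
    """Collapse R = sum_g phi_g * sigma_g over energy groups and propagate
    the cross-section covariance: var(R) = phi^T C phi (flux held fixed)."""
    R = flux @ sigma
    var = flux @ cov_sigma @ flux
    return R, np.sqrt(var)

# Hypothetical 3-group example (all values illustrative only).
flux = np.array([1.0e10, 5.0e9, 2.0e9])          # n/cm^2/s per group
sigma = np.array([0.10, 0.50, 2.00]) * 1e-24     # barns converted to cm^2
rel_unc = np.array([0.05, 0.03, 0.04])           # relative 1-sigma per group
corr = np.eye(3)                                 # uncorrelated-groups assumption
C = np.outer(sigma * rel_unc, sigma * rel_unc) * corr
R, dR = reaction_rate(flux, sigma, C)
```

In a real evaluation the correlation matrix between groups comes from the library's covariance files and is far from diagonal, which is precisely why group-wise uncertainties cannot simply be added in quadrature.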

  1. "LEMPEL-ZIV-WELCH & HUFFMAN" - THE LOSSLESS COMPRESSION TECHNIQUES (IMPLEMENTATION ANALYSIS AND COMPARISON THEREOF)

    OpenAIRE

    Kapil Kapoor*, Dr. Abhay Sharma

    2016-01-01

    This paper is about the implementation analysis and comparison of lossless compression techniques, viz. Lempel-Ziv-Welch and Huffman. The LZW technique assigns fixed-length code words and requires no prior information about the probability of occurrence of the symbols to be encoded. The basic idea in the Huffman technique is that different gray levels occur with different probability (a non-uniform histogram). It uses shorter code words for the more common gray levels and longer code words for the l...
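
A minimal LZW round trip illustrating the fixed-length-code idea described above (dictionary growth only; the variable code-width bit packing of practical implementations is omitted for clarity):

```python
def lzw_encode(data: bytes):
    """LZW with a dictionary seeded by all 256 single bytes; emits
    integer codes, growing the dictionary on each new phrase."""
    table = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                       # extend the current phrase
        else:
            out.append(table[w])         # emit code for the known prefix
            table[wc] = len(table)       # register the new phrase
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

def lzw_decode(codes):
    """Inverse of lzw_encode, rebuilding the dictionary on the fly."""
    table = {i: bytes([i]) for i in range(256)}
    w = table[codes[0]]
    out = [w]
    for code in codes[1:]:
        entry = table[code] if code in table else w + w[:1]  # KwKwK case
        out.append(entry)
        table[len(table)] = w + entry[:1]
        w = entry
    return b"".join(out)

msg = b"TOBEORNOTTOBEORTOBEORNOT"
codes = lzw_encode(msg)
restored = lzw_decode(codes)
```

On this classic repetitive string the 24 input bytes compress to 16 dictionary codes; Huffman coding, by contrast, would first need the symbol histogram to build its code tree.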

  2. Diffusion of point defects in crystalline silicon using the kinetic activation-relaxation technique method

    Science.gov (United States)

    Trochet, Mickaël; Béland, Laurent Karim; Joly, Jean-François; Brommer, Peter; Mousseau, Normand

    2015-06-01

    We study point-defect diffusion in crystalline silicon using the kinetic activation-relaxation technique (k-ART), an off-lattice kinetic Monte Carlo method with on-the-fly catalog building capabilities based on the activation-relaxation technique (ART nouveau), coupled to the standard Stillinger-Weber potential. We focus more particularly on the evolution of crystalline cells with one to four vacancies and one to four interstitials in order to provide a detailed picture of both the atomistic diffusion mechanisms and overall kinetics. We show formation energies, activation barriers for the ground state of all eight systems, and migration barriers for those systems that diffuse. Additionally, we characterize diffusion paths and special configurations such as dumbbell complex, di-interstitial (IV-pair+2I) superdiffuser, tetrahedral vacancy complex, and more. This study points to an unsuspected dynamical richness even for this apparently simple system that can only be uncovered by exhaustive and systematic approaches such as the kinetic activation-relaxation technique.

  3. Experimental Study of Active Techniques for Blade/Vortex Interaction Noise Reduction

    Science.gov (United States)

    Kobiki, Noboru; Murashige, Atsushi; Tsuchihashi, Akihiko; Yamakawa, Eiichi

    This paper presents the experimental results of the effect of Higher Harmonic Control (HHC) and an Active Flap on Blade/Vortex Interaction (BVI) noise. Wind tunnel tests were performed with a 1-bladed rotor system to evaluate the simplified BVI phenomenon while avoiding the complicated aerodynamic interference which is characteristically and inevitably caused by a multi-bladed rotor. Another merit of this 1-bladed rotor system is that several active techniques can be evaluated under the same conditions when installed in the same rotor system. The effects of the active techniques on BVI noise reduction were evaluated comprehensively by the sound pressure, the blade/vortex miss distance obtained by Laser Light Sheet (LLS), the blade surface pressure distribution and the tip vortex structure by Particle Image Velocimetry (PIV). A good correlation is obtained among these quantities describing the effect of the active techniques on the BVI conditions. The experiments show that the blade/vortex miss distance is more dominant for BVI noise than the other two BVI governing factors, blade lift and vortex strength at the moment of BVI.

  4. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1978-July 31, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Finston, H. L.; Williams, E. T.

    1979-07-01

    Studies of homogeneous liquid-liquid extraction have been extended to include (1) a detailed determination of the phase diagram of the system propylene carbonate-water, (2) the extraction of a large variety of both monodentate and bidentate iron complexes, (3) the solvent extraction characteristics of analogues of propylene carbonate, (4) the behavior under pressure of the propylene carbonate-water system, and (5) the extraction behavior of alkaline earth-TTA chelates. One consequence of these studies was the observation that the addition of ethanol to propylene carbonate-water or to isobutylene carbonate-water yields a single homogeneous phase; subsequent evaporation of the ethanol restores the two immiscible phases. Fast neutron activation analysis has been attempted for the heavy elements Pb, Bi, Tl at the Brookhaven HFBR (in- or near-core position) and at the Brookhaven CLIF facility. The latter appears more promising and we have initiated a collaborative program to use the CLIF facility. A milking system which can provide ca. 16 μCi of carrier-free /sup 212/Pb was developed for use in an isotope dilution technique for lead. Collaboration with laboratories already determining trace lead by flameless atomic absorption, or by concentration by electrodeposition into a hanging drop followed by anodic stripping, will be proposed. The proton X-ray emission system has undergone marked improvement with the acquisition of a new high-resolution Si(Li) detector and a new multi-channel analyzer system. Various techniques have been explored to dissolve and prepare samples for PIXE analysis and also for verification by atomic absorption analysis.

  5. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and also will be used for the formal qualification and certification of our reactor inspection system.

  6. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    Science.gov (United States)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  7. Financial planning and analysis techniques of mining firms: a note on Canadian practice

    Energy Technology Data Exchange (ETDEWEB)

    Blanco, H.; Zanibbi, L.R. (Laurentian University, Sudbury, ON (Canada). School of Commerce and Administration)

    1992-06-01

    This paper reports on the results of a survey of the financial planning and analysis techniques in use in the mining industry in Canada. The study was undertaken to determine the current status of these practices within mining firms in Canada and to investigate the extent to which the techniques are grouped together within individual firms. In addition, tests were performed on the relationship between these groups of techniques and both organizational size and price volatility of end product. The results show that a few techniques are widely utilized in this industry but that the techniques used most frequently are not as sophisticated as reported in previous, more broadly based surveys. The results also show that firms tend to use 'bundles' of techniques and that the relative use of some of these groups of techniques is weakly associated with both organizational size and type of end product. 19 refs., 7 tabs.

  8. Analysis and interpretation of dynamic FDG PET oncological studies using data reduction techniques

    Directory of Open Access Journals (Sweden)

    Santos Andres

    2007-10-01

    Background: Dynamic positron emission tomography studies produce a large amount of image data, from which clinically useful parametric information can be extracted using tracer kinetic methods. Data reduction methods can facilitate the initial interpretation and visual analysis of these large image sequences while preserving important information and allowing basic feature characterization. Methods: We applied principal component analysis to provide high-contrast parametric image sets of lower dimension than the original data set, separating structures on the basis of their kinetic characteristics. Our method has the potential to constitute an alternative quantification method, independent of any kinetic model, and is particularly useful when retrieval of the arterial input function is complicated. In independent component analysis images, structures with different kinetic characteristics are assigned opposite values and are readily discriminated. Furthermore, novel similarity mapping techniques are proposed, which can summarize in a single image the temporal properties of the entire image sequence relative to a reference region. Results: Using our new cubed sum coefficient similarity measure, we showed that structures with similar time-activity curves can be identified, facilitating the detection of lesions that are not easily discriminated using the conventional method employing standardized uptake values.
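
    As a rough sketch of the data-reduction idea (not the authors' implementation), principal component analysis of the voxel time-activity matrix can be done with a plain SVD; the frame count, voxel layout, and kinetics below are invented for illustration:

```python
import numpy as np

def pet_pca(frames, n_components=2):
    """Reduce a dynamic PET sequence (T frames of V voxels) to a few
    high-contrast parametric images via principal component analysis.

    frames : array of shape (T, V), one row per time frame.
    Returns an (n_components, V) array of component images.
    """
    # Center each voxel's time-activity curve.
    X = frames - frames.mean(axis=0)
    # SVD of the (T, V) data matrix; rows of Vt are principal-axis images.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:n_components]

# Synthetic example: two tissue classes with different kinetics.
t = np.linspace(0, 60, 24)                      # 24 frames over 60 min
fast = np.exp(-t / 10.0)                        # washout kinetics
slow = 1.0 - np.exp(-t / 30.0)                  # accumulation kinetics
voxels = np.concatenate([np.outer(fast, np.ones(50)),
                         np.outer(slow, np.ones(50))], axis=1)
components = pet_pca(voxels, n_components=2)
print(components.shape)                          # (2, 100)
```

    Each component image separates voxels by kinetic behavior without any kinetic model or arterial input function, which is the attraction the abstract describes.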

  9. An Analysis on the Discourse Cohesion Techniques in Children's English Books

    Institute of Scientific and Technical Information of China (English)

    罗春燕

    2014-01-01

    The analysis of discourse cohesion techniques attracts much attention both at home and abroad, and many scholars have conducted research in this field; however, few of them focus on children's English books, which have their own characteristics and cohesion techniques and deserve study.

  10. STUDY ON MODULAR FAULT TREE ANALYSIS TECHNIQUE WITH CUT SETS MATRIX METHOD

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    A new fault tree analysis (FTA) computation method, based on a modularization technique with a cut sets matrix, is put forth; it effectively reduces the NP (nondeterministic polynomial) difficulty of the computation. The software runs on IBM-PC under DOS 3.0 and later. The method provides a theoretical basis and a computation tool for applying the FTA technique to common engineering systems.
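
    The abstract does not give the computation itself, but the standard starting point for such methods, a MOCUS-style top-down expansion followed by cut-set minimization, can be sketched as follows (the gate names and example tree are hypothetical):

```python
from itertools import product

def minimal_cut_sets(gates, top):
    """Expand a fault tree top-down (MOCUS-style) and minimize the result.

    gates maps a gate name to ('AND' | 'OR', [inputs]); any name not in
    gates is treated as a basic event.
    """
    def expand(name):
        if name not in gates:
            return [frozenset([name])]
        kind, inputs = gates[name]
        child_sets = [expand(i) for i in inputs]
        if kind == 'OR':                     # union of the children's cut sets
            return [cs for sets in child_sets for cs in sets]
        # AND: merge one cut set from each child, in every combination.
        return [frozenset().union(*combo) for combo in product(*child_sets)]

    sets = expand(top)
    # Keep only minimal cut sets (discard any proper superset).
    minimal = [s for s in sets if not any(o < s for o in sets)]
    return sorted(set(minimal), key=sorted)

# TOP fails if A fails, or if both B and C fail.
tree = {'TOP': ('OR', ['A', 'G2']), 'G2': ('AND', ['B', 'C'])}
print(minimal_cut_sets(tree, 'TOP'))   # -> cut sets {A} and {B, C}
```

    The modularization idea in the paper goes further, solving independent subtrees separately to tame the combinatorial blow-up that this naive expansion suffers on large trees.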

  11. Cross- And Up-selling Techniques In E-Commerce Activities

    Directory of Open Access Journals (Sweden)

    Bernard F. Kubiak

    2010-12-01

    In this article, cross- and up-selling marketing techniques are presented in the context of e-commerce. Their aim is to raise both the value of a single sales transaction and customer confidence, as well as to lower the risk of customers being taken by competitors. An iterative client-service model is presented, indicating the particular roles of cross- and up-selling techniques in the automation and integration of marketing activities. In the context of customer relationship management these techniques are of great importance, as they provide grounds for conducting effective advertising, promotion, and loyalty campaigns. The article closes with a summary in which the authors draw attention to the economic advantages of using the presented techniques.

  12. Ecophysiological Analysis of Microorganisms in Complex Microbial Systems by Combination of Fluorescence In Situ Hybridization with Extracellular Staining Techniques

    Science.gov (United States)

    Nielsen, Jeppe Lund; Kragelund, Caroline; Nielsen, Per Halkjær

    Ecophysiological analysis and functions of single cells in complex microbial systems can be examined by simple combinations of Fluorescence in situ hybridization (FISH) for identification with various staining techniques targeting functional phenotypes. In this chapter, we describe methods and protocols optimized for the study of extracellular enzymes, surface hydrophobicity and specific surface structures. Although primarily applied to the study of microbes in wastewater treatment (activated sludge and biofilms), the methods may also be used with minor modifications in several other ecosystems.

  13. Prompt gamma-ray activation analysis (PGAA)

    Energy Technology Data Exchange (ETDEWEB)

    Kern, J. [Fribourg Univ. (Switzerland). Inst. de Physique

    1996-11-01

    The paper briefly describes the principles of prompt gamma-ray activation analysis (PGAA), the detection of gamma rays, the PGAA project at SINQ, and the expected performance. 8 figs., 3 tabs., 10 refs.

  14. Transforming Teacher Education, An Activity Theory Analysis

    Science.gov (United States)

    McNicholl, Jane; Blake, Allan

    2013-01-01

    This paper explores the work of teacher education in England and Scotland. It seeks to locate this work within conflicting sociocultural views of professional practice and academic work. Drawing on an activity theory framework that integrates the analysis of these seemingly contradictory discourses with a study of teacher educators' practical…

  15. Using Link Analysis Technique with a Modified Shortest-Path Algorithm to Fight Money Laundering

    Institute of Scientific and Technical Information of China (English)

    CHEN Yunkai; MAI Quanwen; LU Zhengding

    2006-01-01

    Effective link analysis techniques are needed to help law enforcement and intelligence agencies fight money laundering. This paper presents a link analysis technique that uses a modified shortest-path algorithm to identify the strongest association paths between entities in a money laundering network. Based on the two-tree Dijkstra and Priority-First-Search (PFS) algorithms, a modified algorithm is presented. To apply the algorithm, a network representation transformation is performed first.
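
    The core of the representation transformation is that a "strongest association path" (maximum product of link strengths) becomes an ordinary shortest-path problem once each strength is replaced by its negative logarithm. A minimal sketch with plain Dijkstra (the paper uses a two-tree/PFS variant; the network below is invented):

```python
import heapq, math

def strongest_association_path(edges, source, target):
    """edges maps (u, v) to an association strength in (0, 1]; the
    strength of a path is the product of its edge strengths.  Mapping
    each strength s to a length -log(s) turns maximizing the product
    into minimizing a sum, solvable by standard Dijkstra."""
    graph = {}
    for (u, v), s in edges.items():          # undirected association links
        w = -math.log(s)
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, math.inf):        # stale queue entry
            continue
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, math.inf):
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (d + w, v))
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return path[::-1], math.exp(-dist[target])

links = {('A', 'B'): 0.9, ('B', 'C'): 0.8, ('A', 'C'): 0.5}
path, strength = strongest_association_path(links, 'A', 'C')
print(path, round(strength, 2))   # ['A', 'B', 'C'] 0.72
```

    The indirect path A-B-C (0.9 × 0.8 = 0.72) beats the direct link (0.5), which is exactly the kind of hidden association an investigator would want surfaced.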

  16. Data Analysis Techniques for Resolving Nonlinear Processes in Plasmas : a Review

    OpenAIRE

    de Wit, T. Dudok

    1996-01-01

    The growing need for a better understanding of nonlinear processes in plasma physics has in the last decades stimulated the development of new and more advanced data analysis techniques. This review lists some of the basic properties one may wish to infer from a data set and then presents appropriate analysis techniques with some recent applications. The emphasis is put on the investigation of nonlinear wave phenomena and turbulence in space plasmas.

  17. Low level radioactivity measurements with phoswich detectors using coincident techniques and digital pulse processing analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fuente, R. de la [University of Leon, Escuela de Ingenieria Industrial, Leon 24071 (Spain); Celis, B. de [University of Leon, Escuela de Ingenieria Industrial, Leon 24071 (Spain)], E-mail: bcelc@unileon.es; Canto, V. del; Lumbreras, J.M. [University of Leon, Escuela de Ingenieria Industrial, Leon 24071 (Spain); Celis, Alonso B. de [King's College London, IoP, De Crespigny Park, SE58AF (United Kingdom); Martin-Martin, A. [Laboratorio LIBRA, Edificio I-D, Paseo Belen 3. 47011 Valladolid (Spain); Departamento de Fisica Teorica, Atomica y Optica, Facultad de Ciencias. Po Prado de la Magdalena, s/n. 47005 Valladolid (Spain)], E-mail: alonsomm@libra.uva.es; Gutierrez-Villanueva, J.L. [Laboratorio LIBRA, Edificio I-D, Paseo Belen 3. 47011 Valladolid (Spain); Departamento de Fisica Teorica, Atomica y Optica, Facultad de Ciencias. Po Prado de la Magdalena, s/n. 47005 Valladolid (Spain)], E-mail: joselg@libra.uva.es

    2008-10-15

    A new system has been developed for the detection of low radioactivity levels of fission products and actinides using coincidence techniques. The device combines a phoswich detector for α/β/γ-ray recognition with a fast digital card for electronic pulse analysis. The phoswich can be used in a coincident mode by identifying the composite signal produced by the simultaneous detection of α/β particles and X-rays/γ particles. The technique of coincidences with phoswich detectors was proposed recently to verify the Nuclear Test Ban Treaty (NTBT), which established the necessity of monitoring low levels of gaseous fission products produced by underground nuclear explosions. With the device proposed here it is possible to identify the coincidence events and determine the energy and type of the coincident particles. The sensitivity of the system has been improved by employing liquid scintillators and a high-resolution low-energy germanium detector. In this case it is possible to identify, simultaneously and by α/γ coincidence, transuranic nuclides present in environmental samples without the necessity of performing radiochemical separation. The minimum detectable activity was estimated to be 0.01 Bq kg⁻¹ for 0.1 kg of soil and 1000 min counting.
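
    The minimum-detectable-activity figure quoted above comes from this family of calculation: Currie's a-priori detection limit, L_D = 2.71 + 4.65·√B counts, scaled by efficiency, counting time, and sample mass. A minimal sketch with illustrative parameter values (not those of the paper, whose coincidence gating suppresses B far more aggressively):

```python
import math

def minimum_detectable_activity(background_counts, efficiency,
                                count_time_s, sample_mass_kg,
                                gamma_yield=1.0):
    """Currie's detection limit L_D = 2.71 + 4.65 * sqrt(B) counts,
    converted to an activity per unit mass (Bq/kg)."""
    ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return ld_counts / (efficiency * gamma_yield * count_time_s * sample_mass_kg)

# Illustrative: 1000 min count, 2% absolute efficiency, 0.1 kg sample,
# 900 background counts under the peak of interest.
mda = minimum_detectable_activity(900, 0.02, 1000 * 60, 0.1)
print(round(mda, 3))   # 1.185  (Bq/kg)
```

    Dropping the background B, which is what the α/γ coincidence requirement achieves, is the lever that brings the MDA down toward the 0.01 Bq kg⁻¹ reported in the abstract.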

  18. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting recent advances in the field of Big Data analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data analysis and to recent techniques and environments for Big Data analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting parallel, grid, and cloud computing environments.

  19. Publishing nutrition research: a review of multivariate techniques--part 2: analysis of variance.

    Science.gov (United States)

    Harris, Jeffrey E; Sheean, Patricia M; Gleason, Philip M; Bruemmer, Barbara; Boushey, Carol

    2012-01-01

    This article is the eighth in a series exploring the importance of research design, statistical analysis, and epidemiology in nutrition and dietetics research, and the second in a series focused on multivariate statistical analytical techniques. The purpose of this review is to examine the statistical technique analysis of variance (ANOVA), from its simplest to its multivariate applications. Many dietetics practitioners are familiar with basic ANOVA but are less familiar with multivariate applications such as multiway ANOVA, repeated-measures ANOVA, analysis of covariance, multiple ANOVA, and multiple analysis of covariance. The article addresses all these applications and includes hypothetical and real examples from the field of dietetics.
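
    The simplest case the review starts from, one-way ANOVA, partitions total variability into between-group and within-group sums of squares and forms an F statistic. A from-first-principles sketch (the dietetics data below are invented for illustration):

```python
import numpy as np

def one_way_anova(*groups):
    """One-way ANOVA: F = (SS_between / df_between) / (SS_within / df_within)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = np.mean(np.concatenate(groups))
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Hypothetical example: daily energy intake (kcal) under three diets.
diet_a = [2100, 2050, 2200, 2150]
diet_b = [1900, 1850, 1950, 1880]
diet_c = [2300, 2350, 2280, 2400]
f, df1, df2 = one_way_anova(diet_a, diet_b, diet_c)
print(df1, df2, round(f, 1))   # 2 9 65.1
```

    A large F on (2, 9) degrees of freedom indicates that at least one diet's mean intake differs; the multivariate extensions the article covers generalize exactly this variance partition.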

  20. The effects of communication techniques on public relation activities: A sample of hospitality business

    OpenAIRE

    Şirvan Şen Demir

    2011-01-01

    Nowadays, firms that attach importance to public relations have been increasing rapidly in number. All modern firms either establish a public relations department within their organization to deal with public relations operations or outsource this activity to consultants in order to communicate with target populations. Among firms in the tourism sector, hospitality companies are the ones that use public relations the most. The purpose of this study is to investigate the communication techniques in public relatio...

  1. Two-Port Noise Measurement of Active Microwave Devices Using the Modified F50 Technique

    Institute of Scientific and Technical Information of China (English)

    WANG Jun; CHEN Huilian; TANG Gaodi

    2003-01-01

    An overview of traditional microwave noise measurement techniques indicates that the reflectometric and source-pull tuner methods are both expensive and time-consuming because of the use of broadband tuners and frequent calibration. Moreover, with these two techniques a complicated algorithm is usually needed to accurately extract the two-port noise parameters from the over-determined measured data. Recently, a novel technique was proposed to measure the noise figure at a single source impedance (50 Ω). To improve the accuracy, a modified F50 technique is presented here, together with a method for extracting the four noise parameters from the single measured F50 value, using the Pospieszalski transistor model and two-port noise analysis models as additional information. The experimental results demonstrate the practicability of the presented method: the four noise parameters extracted from the single measurement of F50 agree with the results obtained from the source-pull tuner technique with 13 source admittances.

  2. Effects of nanosuspension and inclusion complex techniques on the in vitro protease inhibitory activity of naproxen

    Energy Technology Data Exchange (ETDEWEB)

    Dharmalingam, Senthil Rajan; Chidambaram, Kumarappan; Srinivasan, Ramamurthy; Nadaraju, Shamala, E-mail: dsenthilrajan@yahoo.co.in [School of Pharmacy, International Medical University, Bukit Jalil, Kuala Lumpur (Malaysia)

    2014-01-15

    This study investigated the effects of nanosuspension and inclusion complex techniques on in vitro trypsin inhibitory activity of naproxen—a member of the propionic acid derivatives, which are a group of antipyretic, analgesic, and non-steroidal anti-inflammatory drugs. Nanosuspension and inclusion complex techniques were used to increase the solubility and anti-inflammatory efficacy of naproxen. The evaporative precipitation into aqueous solution (EPAS) technique and the kneading methods were used to prepare the nanosuspension and inclusion complex of naproxen, respectively. We also used an in vitro protease inhibitory assay to investigate the anti-inflammatory effect of modified naproxen formulations. Physiochemical properties of modified naproxen formulations were analyzed using UV, IR spectra, and solubility studies. Beta-cyclodextrin inclusion complex of naproxen was found to have a lower percentage of antitryptic activity than a pure nanosuspension of naproxen did. In conclusion, nanosuspension of naproxen has a greater anti-inflammatory effect than the other two tested formulations. This is because the nanosuspension formulation reduces the particle size of naproxen. Based on these results, the antitryptic activity of naproxen nanosuspension was noteworthy; therefore, this formulation can be used for the management of inflammatory disorders. (author)

  3. Enhancement of bimetallic Fe-Mn/CNTs nano catalyst activity and product selectivity using microemulsion technique

    Institute of Scientific and Technical Information of China (English)

    Zahra Zolfaghari; Ahmad Tavasoli; Saber Tabyar; Ali Nakhaei Pour

    2014-01-01

    Bimetallic Fe-Mn nano catalysts supported on carbon nanotubes (CNTs) were prepared using a microemulsion technique with water-to-surfactant ratios of 0.4-1.6. The nano catalysts were extensively characterized by different methods, and their activity and selectivity in Fischer-Tropsch synthesis (FTS) were assessed in a fixed-bed microreactor. The physicochemical properties and performance of the nanocatalysts were compared with a catalyst prepared by the impregnation method. A very narrow particle size distribution was produced by the microemulsion technique at relatively high loading of active metal. TEM images showed that small metal nano particles in the range of 3-7 nm were not only confined inside the CNTs but also located on the outer surface of the CNTs. Using the microemulsion technique with a water-to-surfactant ratio of 0.4 decreased the average iron particle size to 5.1 nm, and the reduction and dispersion percentages were almost doubled. Activity and selectivity were found to depend on the catalyst preparation method and average iron particle size. CO conversion and FTS rate increased from 49.1% to 71.0% and from 0.144 to 0.289 gHC/(gcat h), respectively, while the WGS rate decreased from 0.097 to 0.056 gCO2/(gcat h). C5+ liquid hydrocarbon selectivity decreased slightly and olefin selectivity almost doubled.

  4. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills developed include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.

  5. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Matthew W. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques achieved by applying evanescent fields, standing waves (waveguides), and surface enhancements to increase the generated mean-square electric field, which is directly related to the intensity of Raman scattering. These techniques are realized by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin; extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles to Raman spectroscopy for chemical analysis, namely the inherently weak Raman cross section and susceptibility to fluorescence interference.

  6. A NOVEL GRAPH MODEL FOR E-MAIL FORENSICS: EVIDENCE ACTIVITY ANALYSIS GRAPH

    Directory of Open Access Journals (Sweden)

    Sridhar Neralla

    2013-10-01

    This work puts forward a novel technique for evidence analysis that assists the cyber forensics expert in analyzing a cyber crime episode in an effective manner. The novelty of the Evidence Activity Analysis Graph is that the activities involved in the cyber crime are represented as nodes, complemented by stylometric analysis. In this piece of work an email-based cyber crime incident is considered as a case study, and the incident is represented as an Evidence Activity Analysis Graph in order to identify the suspect. Comparisons between the various graphs used in cyber forensics are also discussed in this paper. The model establishes relationships between the various activities that occurred in the cyber crime.
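
    A toy sketch of the underlying data structure (activities as nodes, causal links as directed edges, stylometric scores as node attributes); the activity names and score are hypothetical, not taken from the paper:

```python
from collections import defaultdict

class EvidenceActivityGraph:
    """Minimal activity graph: nodes are observed activities, directed
    edges record which activity led to which, and nodes carry attributes
    such as a stylometric score."""
    def __init__(self):
        self.attrs = {}
        self.edges = defaultdict(list)

    def add_activity(self, name, **attrs):
        self.attrs[name] = attrs

    def link(self, cause, effect):
        self.edges[cause].append(effect)

    def chain_from(self, start):
        """Follow the first recorded consequence of each activity."""
        chain, node = [start], start
        while self.edges[node]:
            node = self.edges[node][0]
            chain.append(node)
        return chain

g = EvidenceActivityGraph()
g.add_activity("account_created", ip="10.0.0.5")          # hypothetical
g.add_activity("email_sent", stylometry_score=0.87)       # hypothetical
g.add_activity("account_deleted")
g.link("account_created", "email_sent")
g.link("email_sent", "account_deleted")
print(g.chain_from("account_created"))
# ['account_created', 'email_sent', 'account_deleted']
```

    Walking such chains, and weighing them with the stylometric evidence attached to each node, is the kind of reasoning the graph model is meant to support.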

  7. Artifact suppression and analysis of brain activities with electroencephalography signals

    Institute of Scientific and Technical Information of China (English)

    Md. Rashed-Al-Mahfuz; Md. Rabiul Islam; Keikichi Hirose; Md. Khademul Islam Molla

    2013-01-01

    A brain-computer interface is a communication system that connects the brain with a computer (or other device) without depending on the normal output pathways of the brain (i.e., peripheral nerves and muscles). The electro-oculogram is a dominant artifact that has a significant negative influence on further analysis of real electroencephalography data. This paper presents a data-adaptive technique for artifact suppression and brain-wave extraction from electroencephalography signals to detect regional brain activities. An empirical mode decomposition based adaptive thresholding approach was employed to suppress the electro-oculogram artifact; fractional Gaussian noise was used to determine the threshold level, derived from the analyzed data without any training. The purified electroencephalography signal is composed of brain waves, also called rhythmic components, which represent brain activities. The rhythmic components were extracted from each electroencephalography channel using an adaptive Wiener filter at the original scale. The regional brain activities were mapped on the basis of the spatial distribution of the rhythmic components, and the results showed that different regions of the brain are activated in response to different stimuli. This research analyzed the activity of a single rhythmic component, alpha, with respect to different motor imaginations. The experimental results showed that the proposed method is very efficient in artifact suppression and in identifying individual motor imagery from the activities of the alpha component.
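
    The full pipeline (EMD-based thresholding plus adaptive Wiener filtering) is beyond a short sketch, but the final step, isolating a rhythmic component such as alpha from a cleaned channel, can be illustrated with a simple Fourier band mask. This is a crude stand-in for the paper's method, on an invented synthetic signal:

```python
import numpy as np

def extract_band(signal, fs, lo=8.0, hi=13.0):
    """Extract one rhythmic component (here alpha, 8-13 Hz) from an EEG
    channel by zeroing all Fourier coefficients outside the band."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return np.fft.irfft(spectrum * mask, n=len(signal))

fs = 250.0                                   # sampling rate in Hz
t = np.arange(0, 4, 1 / fs)                  # 4 s of synthetic "EEG"
eeg = (np.sin(2 * np.pi * 10 * t)            # alpha activity at 10 Hz
       + 0.8 * np.sin(2 * np.pi * 2 * t)     # slow EOG-like drift
       + 0.3 * np.sin(2 * np.pi * 40 * t))   # gamma-range interference
alpha = extract_band(eeg, fs)

# Dominant frequency of the extracted component should be ~10 Hz.
peak = np.fft.rfftfreq(len(alpha), 1 / fs)[np.argmax(np.abs(np.fft.rfft(alpha)))]
print(round(peak, 1))   # 10.0
```

    Mapping the power of the extracted component channel by channel gives the kind of regional activity map described in the abstract.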

  8. Search for accuracy in activation analysis of trace elements in different matrices. [Neutrons]

    Energy Technology Data Exchange (ETDEWEB)

    Meloni, S.; Ganzerli-Valentini, M.T.; Caramella-Crespi, V.; Maxia, V.; Maggi, L.; Pisani, U.; Soma, R.; Borroni, P.

    1976-01-01

    Different factors may affect accuracy in activation analysis of trace elements. The evaluation of these factors often requires a number of time-consuming experiments, but a statement of accuracy in activation analysis is of great value in casting light on the overall reliability of the method itself. It can be pointed out that accuracy is often inversely proportional to the number of steps in the whole analytical procedure, from sampling to calculation of results. Several activation analysis techniques were developed and applied to the determination of trace element content in standard reference materials and in samples chosen for intercomparison among laboratories. Emphasis was put on limiting the number of steps to improve accuracy and on achieving the best precision. Results are presented and discussed, together with the criteria for choosing the most appropriate separation technique. Other sources of systematic error, such as the reliability of the stated content of the reference standards and dead-time corrections when short-lived isotopes are involved, were also taken into account and discussed.
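
    The dead-time correction mentioned above, critical for short-lived isotopes counted at high rates, is a textbook calculation. A minimal sketch of the non-paralyzable model (shown only to illustrate the kind of systematic error discussed; the numbers are invented):

```python
def deadtime_correct(measured_rate, tau):
    """Non-paralyzable dead-time correction: recover the true count rate
    n from the measured rate m and detector dead time tau via
    n = m / (1 - m * tau)."""
    loss = measured_rate * tau
    if loss >= 1.0:
        raise ValueError("measured rate inconsistent with dead time")
    return measured_rate / (1.0 - loss)

# 50 000 counts/s observed with a 2 microsecond dead time:
true_rate = deadtime_correct(50_000.0, 2e-6)
print(round(true_rate))   # 55556
```

    At this rate the detector misses roughly 10% of events; skipping the correction would bias every short-lived-isotope result low by the same fraction.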

  9. Coronary bifurcation lesions treated with double kissing crush technique compared to classical crush technique: serial intravascular ultrasound analysis

    Institute of Scientific and Technical Information of China (English)

    SHAN Shou-jie; YE Fei; LIU Zhi-zhong; TIAN Nai-liang; ZHANG Jun-jie; CHEN Shao-liang

    2013-01-01

    Background: The double kissing (DK) crush technique is a modified version of the crush technique. It is specifically designed to increase the success rate of the final kissing balloon post-dilatation, but its efficacy and safety remain unclear. Methods: Data were obtained from the DKCRUSH-I trial, a prospective, randomized, multi-center study to evaluate safety and efficacy. Post-procedural and eight-month follow-up intravascular ultrasound (IVUS) analysis was available in 61 cases. Volumetric analysis using Simpson's method within the Taxus stent, and cross-sectional analysis at five sites of the main vessel (MV) and three sites of the side branch (SB), were performed. The impact of the bifurcation angle on stent expansion at the carina was also evaluated. Results: Stent expansion in the SB ostium was significantly less in the classical crush group ((53.81 ± 13.51)%) than in the DK crush group ((72.27 ± 11.46)%) (P = 0.04). For the MV, the incidence of incomplete crush was 41.9% in the DK group and 70.0% in the classical group (P = 0.03). The percentage of neointimal area at the ostium tended to be smaller in the DK group than in the classical group ((16.4 ± 19.2)% vs. (22.8 ± 27.1)%, P = 0.06). The optimal threshold of post-procedural minimum stent area (MSA) to predict a follow-up minimum lumen area (MLA) < 4.0 mm² at the SB ostium was 4.55 mm², yielding an area under the curve of 0.80 (95% confidence interval: 0.61 to 0.92). Conclusion: Our data suggest that the DK crush technique is associated with improved quality of the final kissing balloon inflation (FKBI) and a smaller optimal cutoff value of post-procedural MSA at the SB ostium.

  10. Message Structures: a modelling technique for information systems analysis and design

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2011-01-01

    Despite the increasing maturity of model-driven software development (MDD), some research challenges remain open in the field of information systems (IS). For instance, there is a need to improve modelling techniques so that they cover several development stages in an integrated way and facilitate the transition from analysis to design. This paper presents Message Structures, a technique for the specification of communicative interactions between the IS and organisational actors. The technique can be used both in the analysis stage and in the design stage. During analysis, it allows abstracting from the technology that will support the IS and complementing business process diagramming techniques with a specification of the communicational needs of the organisation. During design, Message Structures serves two purposes: (i) it allows a specification of the IS memory (e.g. a UML class diagram) to be derived systematically, and (ii) it allows the user interface design to be reasoned about using abstract patterns. Thi...

  11. Activity Analysis and Cost Analysis in Medical Schools.

    Science.gov (United States)

    Koehler, John E.; Slighton, Robert L.

    There is no unique answer to the question of what an ongoing program costs in medical schools. The estimates of program costs generated by classical methods of cost accounting are unsatisfactory because such accounting cannot deal with the joint production or joint cost problem. Activity analysis models aim at calculating the impact of alternative…

  12. Determination of moisture content and water activity in algae and fish by thermoanalytical techniques

    Directory of Open Access Journals (Sweden)

    Vilma Mota da Silva

    2008-01-01

    The water content in seafoods is very important since it affects their sensorial quality, microbiological stability, physical characteristics and shelf life. In this study, thermoanalytical techniques were employed to develop a simple and accurate method to determine water content (moisture) by thermogravimetry (TG) and water activity from moisture content values and freezing point depression using differential scanning calorimetry (DSC). The precision of the results suggests that TG is a suitable technique to determine moisture content in biological samples. The average water content values for fish samples of the Lutjanus synagris and Ocyurus chrysurus species were 76.4 ± 5.7% and 63.3 ± 3.9%, respectively, while that of the Ulva lactuca marine algae species was 76.0 ± 4.4%. The method presented here was also successfully applied to determine water activity in two species of fish and six species of marine algae collected in the Atlantic coastal waters of Bahia, in Brazil. Water activity determined in fish samples ranged from 0.946 to 0.960 and was consistent with values reported in the literature, i.e., 0.9 - 1.0. The water activity values determined in marine algae samples lay within the interval 0.974 - 0.979.
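
    The DSC step infers water activity from the measured freezing point depression. A minimal sketch of that conversion using the textbook Clausius-Clapeyron approximation with a constant enthalpy of fusion (not the paper's calibration; the sample freezing point below is invented):

```python
import math

R = 8.314          # J mol^-1 K^-1
DH_FUS = 6009.5    # J mol^-1, enthalpy of fusion of water (assumed constant)
T0 = 273.15        # K, freezing point of pure water

def water_activity(freezing_point_c):
    """Estimate water activity from the sample's freezing point via
    ln(aw) = (dH_fus / R) * (1/T0 - 1/Tf)."""
    tf = freezing_point_c + 273.15
    return math.exp((DH_FUS / R) * (1.0 / T0 - 1.0 / tf))

# A hypothetical fish sample freezing at about -4.2 C:
aw = water_activity(-4.2)
print(round(aw, 3))
```

    A depression of a few degrees maps to an activity just below 1, consistent with the 0.946 - 0.979 range the study reports for fish and algae.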

  13. Determination of moisture content and water activity in algae and fish by thermoanalytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Vilma Mota da; Silva, Luciana Almeida; Andrade, Jailson B. de [Universidade Federal da Bahia (UFBA), Salvador, BA (Brazil). Inst. de Quimica]. E-mail: jailsong@ufba.br; Veloso, Marcia C. da Cunha [Centro Federal de Educacao Tecnologica da Bahia (CEFET-BA), Salvador, BA (Brazil)]; Santos, Gislaine Vieira [Universidade Federal da Bahia (UFBA), Salvador, BA (Brazil). Inst. de Biologia]

    2008-07-01

    The water content in seafoods is very important since it affects their sensorial quality, microbiological stability, physical characteristics and shelf life. In this study, thermoanalytical techniques were employed to develop a simple and accurate method to determine water content (moisture) by thermogravimetry (TG) and water activity from moisture content values and freezing point depression using differential scanning calorimetry (DSC). The precision of the results suggests that TG is a suitable technique to determine moisture content in biological samples. The average water content values for fish samples of Lutjanus synagris and Ocyurus chrysurus species were 76.4 ± 5.7% and 63.3 ± 3.9%, respectively, while that of Ulva lactuca marine algae species was 76.0 ± 4.4%. The method presented here was also successfully applied to determine water activity in two species of fish and six species of marine algae collected in the Atlantic coastal waters of Bahia, in Brazil. Water activity determined in fish samples ranged from 0.946 - 0.960 and was consistent with values reported in the literature, i.e., 0.9 - 1.0. The water activity values determined in marine algae samples lay within the interval of 0.974 - 0.979. (author)

  14. A multi coding technique to reduce transition activity in VLSI circuits

    Science.gov (United States)

    Vithyalakshmi, N.; Rajaram, M.

    2014-02-01

    Advances in VLSI technology have enabled the implementation of complex digital circuits in a single chip, reducing system size and power consumption. In deep submicron low power CMOS VLSI design, the main cause of energy dissipation is the charging and discharging of internal node capacitances due to transition activity; transition activity is thus one of the major factors affecting dynamic power dissipation. This paper analyzes power reduction at the algorithm and logic-circuit levels. At the algorithm level, the key to reducing power dissipation is minimizing transition activity, achieved by introducing a data coding technique: a novel multi coding technique is introduced that reduces transition activity on the bus lines by up to 52.3%, which automatically reduces dynamic power dissipation. In addition, 1-bit full adders are introduced in the Hamming distance estimator block, which reduces the device count. The coding method is implemented in Verilog HDL, and the overall performance is analyzed using ModelSim and Xilinx tools. In total, a 38.2% power saving is achieved compared to other existing methods.
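
    The flavor of such bus-coding schemes can be shown with the classic bus-invert code, a simple transition-reduction technique (the paper's multi coding method differs in its details): if more than half of the bus lines would toggle, transmit the complemented word and assert an extra invert line.

```python
def bus_invert_encode(words, width=8):
    """Bus-invert coding: returns (sent_word, invert_bit) pairs."""
    encoded, prev = [], 0
    for w in words:
        toggles = bin(w ^ prev).count('1')   # Hamming distance to the bus state
        if toggles > width // 2:
            w ^= (1 << width) - 1            # transmit the complement instead
            encoded.append((w, 1))
        else:
            encoded.append((w, 0))
        prev = w                             # the bus now holds the sent value
    return encoded

def transitions(stream, width=8):
    """Count bit toggles on a bus that starts at all zeros."""
    prev, total = 0, 0
    for w in stream:
        total += bin(w ^ prev).count('1')
        prev = w
    return total

raw = [0x00, 0xFF, 0x01, 0xFE]
coded = bus_invert_encode(raw)
sent = [w for w, _ in coded]
inv_line = [i for _, i in coded]
inv_toggles = sum(a ^ b for a, b in zip([0] + inv_line[:-1], inv_line))
print(transitions(raw), transitions(sent) + inv_toggles)   # 23 4
```

    The receiver simply re-inverts any word whose invert bit is set. On this worst-case pattern the toggle count drops from 23 to 4 even after charging the extra invert line, which is exactly the dynamic-power saving mechanism the abstract describes.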

  15. Quantitation of microbicidal activity of mononuclear phagocytes: an in vitro technique.

    Directory of Open Access Journals (Sweden)

    Rege N

    1993-01-01

    An in vitro assay technique was set up to determine the phagocytic and microbicidal activity of a monocyte-macrophage cell line using Candida species as test organisms. Norms were determined for the activity of peritoneal macrophages of rats (24.69 ± 2.6% phagocytosis and 35.4 ± 5.22% ICK) and humans (27.89 ± 3.63% phagocytosis and 50.91 ± 6.3% ICK). The assay technique was used to test the degree of macrophage activation induced by metronidazole, Tinospora cordifolia and Asparagus racemosus, and to compare their effects with the standard immunomodulator muramyl dipeptide (MDP). All three test agents increased the phagocytic and killing capacity of macrophages in a dose-dependent manner up to a certain dose, beyond which these activities either plateaued or decreased. The optimal doses for MDP, metronidazole, Asparagus racemosus and Tinospora cordifolia were found to be 100 micrograms, 300 mg/kg, 200 mg/kg and 100 mg/kg, respectively. Patients with cirrhosis were screened for defects in monocyte function; depressed monocyte function (20.58 ± 5% phagocytosis and 41.24 ± 12.19% ICK; P < 0.05) was observed, indicating a compromised host defense. The utility of this candidacidal assay in experimental and clinical studies is discussed.

  16. Analysis of the Nucleoside Content of Cordyceps sinensis Using the Stepwise Gradient Elution Technique of Thin-Layer Chromatography

    Institute of Scientific and Technical Information of China (English)

    MA, King-Wah (马敬桦); CHAU, Foo-Tim (周福添); WU, Jian-Yong (吴建勇)

    2004-01-01

    Nucleosides are the main class of active components in Cordyceps sinensis. Thin-layer chromatography (TLC) is one of the most commonly used methods in pharmacopoeias for analyzing the chemical components of herbal medicines. Since the isocratic elution method cannot successfully separate all the nucleoside components in TLC analysis, a stepwise gradient elution has been developed in this work that separates eight nucleoside standards with success. In this way, quantitative analyses of samples of Cordyceps sinensis were achieved via the proposed TLC procedure coupled with the CAMAG scanning densitometric techniques and TLCQA methods for qualitative and quantitative analysis.

  17. Operational modal analysis via image based technique of very flexible space structures

    Science.gov (United States)

    Sabatini, Marco; Gasbarri, Paolo; Palmerini, Giovanni B.; Monti, Riccardo

    2013-08-01

    Vibrations are one of the most important topics in the engineering design of flexible structures. The importance of this problem increases when a very flexible system is considered, and this is often the case for space structures. In order to identify the modal characteristics, in terms of natural frequencies and the relevant modal parameters, ground tests are performed. However, these parameters can vary under the operating conditions of the system. In order to continuously monitor the modal characteristics during the satellite lifetime, an operational modal analysis is mandatory. This kind of analysis is usually performed by using classical accelerometers or strain gauges and by properly analyzing the acquired output. In this paper a different approach to vibration data acquisition is taken, via an image-based technique. In order to simulate a flexible satellite, a free-flying platform is used; the problem is further complicated by the fact that the overall system, constituted by a highly rigid bus and very flexible panels, must necessarily be modeled as a multibody system. In the experimental campaign, a camera placed on the bus is used to identify the eigenfrequencies of the vibrating structure; in this case thin aluminum plates simulate very flexible solar panels. The structure is excited by a hammer or studied during a fast attitude maneuver. The results of the experimental activity are investigated and compared with the numerical simulation obtained via a FEM-multibody software, and the relevant results are presented and discussed.
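    Whatever the sensor (camera-tracked displacement or accelerometer), the identification step ends with a time history whose spectral peaks give the eigenfrequencies. A minimal, hypothetical sketch of that last step (simple FFT peak picking on a synthetic tip-displacement record; real operational modal analysis uses more robust output-only estimators):

    ```python
    import numpy as np

    def dominant_frequencies(signal, fs, n_peaks=3, min_sep=0.5):
        """Pick the strongest spectral peaks of a vibration record, a
        minimal stand-in for output-only eigenfrequency identification."""
        sig = np.asarray(signal, float)
        sig = sig - sig.mean()                     # drop the DC offset
        spec = np.abs(np.fft.rfft(sig))
        freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
        picked = []
        for i in np.argsort(spec)[::-1]:           # strongest bins first
            if freqs[i] > 0 and all(abs(freqs[i] - f) > min_sep for f in picked):
                picked.append(freqs[i])
            if len(picked) == n_peaks:
                break
        return sorted(picked)

    fs = 200.0
    t = np.arange(0, 10, 1 / fs)
    # synthetic tip displacement with two structural modes at 3 Hz and 11 Hz
    x = np.sin(2 * np.pi * 3 * t) + 0.4 * np.sin(2 * np.pi * 11 * t)
    print(dominant_frequencies(x, fs, n_peaks=2))  # [3.0, 11.0]
    ```

    The 10 s window gives 0.1 Hz frequency resolution, so both synthetic modes land exactly on FFT bins; with camera data the frame rate plays the role of `fs`.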

  18. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
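    Of the three reductions compared, principal component analysis is the easiest to show compactly. A minimal sketch (synthetic feature vectors, not the paper's object-code features), which feeds the reduced vectors to whatever simple classifier follows:

    ```python
    import numpy as np

    def pca_reduce(X, k):
        """Project feature vectors onto the top-k principal components.

        X: (n_samples, n_features) array; returns (n_samples, k) array.
        """
        Xc = X - X.mean(axis=0)  # center each feature
        # right singular vectors of the centered data are the
        # eigenvectors of the covariance matrix
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:k].T

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))
    X[:, 0] *= 10            # one artificially high-variance direction
    Z = pca_reduce(X, 2)
    print(Z.shape)           # (100, 2)
    ```

    Sweeping `k` and plotting classification accuracy against it reproduces the kind of accuracy-versus-dimensions comparison the paper describes.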

  19. Auditing Information Structures in Organizations: A Review of Data Collection Techniques for Network Analysis

    NARCIS (Netherlands)

    Zwijze-Koning, Karen H.; Jong, de Menno D.T.

    2005-01-01

    Network analysis is one of the current techniques for investigating organizational communication. Despite the amount of how-to literature about using network analysis to assess information flows and relationships in organizations, little is known about the methodological strengths and weaknesses of network analysis techniques.

  20. Qualitative and quantitative analysis of lignocellulosic biomass using infrared techniques: A mini-review

    Science.gov (United States)

    Current wet chemical methods for biomass composition analysis using two-step sulfuric acid hydrolysis are time-consuming, labor-intensive, and unable to provide structural information about biomass. Infrared techniques provide fast, low-cost analysis, are non-destructive, and have shown promising re...

  1. Use of fuzzy techniques for analysis of dynamic loads in power systems

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Presents the use of fuzzy techniques for the analysis of dynamic load characteristics of power systems, to identify the voltage stability (collapse) of a weak bus, and concludes from the consistent results obtained that this is a useful tool for the analysis of the load characteristics of sophisticated power systems and their components.

  2. Automation of the verneuil technique on the basis of a stability analysis

    Science.gov (United States)

    Borodin, V. A.; Brener, E. A.; Tatarchenko, V. A.; Gusev, V. I.; Tsigler, I. N.

    1981-04-01

    This paper presents a stability analysis for the crystallization of large-size crystals grown by Verneuil techniques with variable powder feed rate. The laws of powder feed regulation are found, ensuring automatic maintenance of constant cross section of the growing crystal. Experimental verification of the results of the theoretical analysis is obtained.

  3. Advances in the Chemical Analysis and Biological Activities of Chuanxiong

    Directory of Open Access Journals (Sweden)

    Jin-Ao Duan

    2012-09-01

    Full Text Available Chuanxiong Rhizoma (Chuan-Xiong, CX, the dried rhizome of Ligusticum chuanxiong Hort. (Umbelliferae, is one of the most popular plant medicines in the world. Modern research indicates that organic acids, phthalides, alkaloids, polysaccharides, ceramides and cerebrosides are the main components responsible for the bioactivities and properties of CX. Because of its complex constituents, multidisciplinary techniques are needed to validate the analytical methods that support CX’s use worldwide. In the past two decades, rapid development of technology has advanced many aspects of CX research. The aim of this review is to illustrate the recent advances in the chemical analysis and biological activities of CX, and to highlight new applications and challenges. Emphasis is placed on recent trends and emerging techniques.

  4. Meso-scale characterization of lithium distribution in lithium-ion batteries using ion beam analysis techniques

    Science.gov (United States)

    Gonzalez-Arrabal, R.; Panizo-Laiz, M.; Fujita, K.; Mima, K.; Yamazaki, A.; Kamiya, T.; Orikasa, Y.; Uchimoto, Y.; Sawada, H.; Okuda, C.; Kato, Y.; Perlado, J. M.

    2015-12-01

    The performance of a Li-ion battery (LIB) is mainly governed by the diffusion capabilities of lithium in the electrodes. Thus, for LIB improvement it is essential to characterize the lithium distribution. Most of the traditionally used techniques for lithium characterization give information either at the local scale or at the macroscopic scale. However, the lithium behavior at the local scale is not mirrored at the macroscopic scale. Therefore, lithium characterization at the mesoscopic scale helps to understand and to connect the mechanisms taking place at the two spatial scales. In this paper, we give a general description of the capabilities and limitations of ion beam analysis techniques for studying the distributions of lithium and other elements present in the electrodes at the mesoscopic scale. The potential of the 7Li(p,α0)4He nuclear reaction to non-invasively examine the lithium distribution as a function of depth is illustrated. The lithium spatial distribution is characterized using particle-induced γ-ray emission (μ-PIGE) spectroscopy. This technique allows estimation of the density of the active particles in the electrode that effectively contribute to Li intercalation and/or de-intercalation. The advantages of ion beam analysis techniques in comparison with more traditional techniques for electrode characterization are discussed.

  5. Technique and experiment of active direct gas pressure measurement in coal roadway

    Institute of Scientific and Technical Information of China (English)

    CHEN Xue-xi; MA Shang-quan; QI Li-ming

    2009-01-01

    An active measurement method and its principle are introduced, considering the low success rate, special difficulty, and long measurement time of the direct gas pressure measurement currently used in coal roadways. The technology of drilling, borehole sealing depth, borehole sealing length, sealing control of the measuring process, compensatory computation of gas loss quantity and other key techniques are discussed. Finally, based on the latest instrument the authors developed, a series of experiments of direct gas pressure measurement in the coal roadways of the Jincheng and Tongchuan mine districts were carried out. The experimental results show that the active gas pressure measurement technique has the following advantages: (1) the application scope of the direct gas pressure measurement technique is wide, and it is not restricted by coal hardness, coal seam fissures or other conditions; (2) the measured results are credible, as they can be verified by the same gas pressure value acquired from a different borehole in the same place; (3) the measurement process is convenient and quick; it takes about 2 to 3 days to acquire the gas pressure value in a coal seam.

  7. Salient Feature Identification and Analysis using Kernel-Based Classification Techniques for Synthetic Aperture Radar Automatic Target Recognition

    Science.gov (United States)

    2014-03-27

    SALIENT FEATURE IDENTIFICATION AND ANALYSIS USING KERNEL-BASED CLASSIFICATION TECHNIQUES FOR SYNTHETIC APERTURE RADAR AUTOMATIC TARGET RECOGNITION (thesis)

  8. Tools and techniques to study ligand-receptor interactions and receptor activation by TNF superfamily members.

    Science.gov (United States)

    Schneider, Pascal; Willen, Laure; Smulski, Cristian R

    2014-01-01

    Ligands and receptors of the TNF superfamily are therapeutically relevant targets in a wide range of human diseases. This chapter describes assays based on ELISA, immunoprecipitation, FACS, and reporter cell lines to monitor interactions of tagged receptors and ligands in both soluble and membrane-bound forms using unified detection techniques. A reporter cell assay that is sensitive to ligand oligomerization can identify ligands with high probability of being active on endogenous receptors. Several assays are also suitable to measure the activity of agonist or antagonist antibodies, or to detect interactions with proteoglycans. Finally, self-interaction of membrane-bound receptors can be evidenced using a FRET-based assay. This panel of methods provides a large degree of flexibility to address questions related to the specificity, activation, or inhibition of TNF-TNF receptor interactions in independent assay systems, but does not substitute for further tests in physiologically relevant conditions.

  9. A new chromosome fluorescence banding technique combining DAPI staining with image analysis in plants.

    Science.gov (United States)

    Liu, Jing Yu; She, Chao Wen; Hu, Zhong Li; Xiong, Zhi Yong; Liu, Li Hua; Song, Yun Chun

    2004-08-01

    In this study, a new chromosome fluorescence banding technique was developed in plants. The technique combined 4',6-diamidino-2-phenylindole (DAPI) staining with software analysis including three-dimensional imaging after deconvolution. Clear multiple and adjacent DAPI bands like G-bands were obtained by this technique in the tested species including Hordeum vulgare L., Oryza officinalis, Wall & Watt, Triticum aestivum L., Lilium brownii, Brown, and Vicia faba L. During mitotic metaphase, the numbers of bands for the haploid genomes of these species were about 185, 141, 309, 456 and 194, respectively. Reproducibility analysis demonstrated that banding patterns within a species were stable at the same mitotic stage and they could be used for identifying specific chromosomes and chromosome regions. The band number fluctuated: the earlier the mitotic stage, the greater the number of bands. The technique enables genes to be mapped onto specific band regions of the chromosomes by only one fluorescence in situ hybridisation (FISH) step with no chemical banding treatments. In this study, the 45S and 5S rDNAs of some tested species were located on specific band regions of specific chromosomes and they were all positioned at the interbands with the new technique. Because no chemical banding treatment was used, the banding patterns displayed by the technique should reflect the natural conformational features of chromatin. Thus it could be expected that this technique should be suitable for all eukaryotes and would have widespread utility in chromosomal structure analysis and physical mapping of genes.

  10. Application of a sensitivity analysis technique to high-order digital flight control systems

    Science.gov (United States)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. The SVA is a fully public domain program, running on the NASA/Dryden Elxsi computer.
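    The core quantities translate directly into code: at each frequency the singular values of the return difference matrix I + L(jω) measure relative stability, and their gradients with respect to a design parameter measure sensitivity. A hedged numerical sketch (toy loop matrix at a single frequency; the paper derives analytic singular-value gradients rather than the finite differences used here):

    ```python
    import numpy as np

    def return_difference_sv(L):
        """Singular values of the return difference matrix I + L(jw);
        the smallest one is a measure of relative stability."""
        return np.linalg.svd(np.eye(L.shape[0]) + L, compute_uv=False)

    def min_sv_sensitivity(build_L, p, eps=1e-6):
        """Finite-difference gradient of the smallest singular value
        with respect to a scalar design parameter p."""
        s0 = return_difference_sv(build_L(p)).min()
        s1 = return_difference_sv(build_L(p + eps)).min()
        return (s1 - s0) / eps

    # toy loop transfer matrix evaluated at one frequency,
    # parameterized by a controller gain k (illustrative values)
    def build_L(k):
        return np.array([[k * (0.5 + 0.2j), 0.1],
                         [0.0, -0.3 + 0.4j]])

    print(return_difference_sv(build_L(1.0)).min())
    print(min_sv_sensitivity(build_L, 1.0))
    ```

    Scanning frequency and taking the minimum of the smallest singular value over the scan gives the multiloop stability margin the technique is built around; large gradients flag the parameters that margin is most sensitive to.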

  11. The Immersive Virtual Reality Experience: A Typology of Users Revealed Through Multiple Correspondence Analysis Combined with Cluster Analysis Technique.

    Science.gov (United States)

    Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz

    2016-03-01

    Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, even though users get actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, individual factors, such as age, video game knowledge, and the predisposition to immersion, may be associated with the quality of the virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the user best suited to be involved in a VRE through a head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with a cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is associated with higher levels of virtual reality effectiveness, that is, a higher predisposition to be immersed and reduced cybersickness symptoms in the VRE, than the console gamer and nongamer profiles. These findings can be a useful orientation in clinical practice and future research, as they help identify which users are more predisposed to benefit from immersive VREs.
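    The pipeline behind such a typology — indicator-code the categorical survey answers, embed them, then cluster — can be sketched minimally. The variables and answers below are hypothetical, and full multiple correspondence analysis also rescales the indicator matrix before extracting dimensions, so treat this as a simplified stand-in:

    ```python
    import numpy as np

    def one_hot(column):
        """Indicator (disjunctive) coding of one categorical variable."""
        cats = sorted(set(column))
        return np.array([[1.0 if v == c else 0.0 for c in cats] for v in column])

    def kmeans(X, k, iters=50, seed=0):
        """Plain k-means on row vectors; returns cluster labels."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                if (labels == j).any():
                    centers[j] = X[labels == j].mean(axis=0)
        return labels

    # hypothetical categorical answers: gamer type and cybersickness level
    gamer = ["pc", "console", "none", "pc", "none", "console"]
    sick = ["low", "mid", "high", "low", "high", "mid"]
    X = np.hstack([one_hot(gamer), one_hot(sick)])
    labels = kmeans(X, 3)
    print(labels)
    ```

    Respondents with identical answer patterns (here rows 0 and 3) necessarily land in the same cluster, which is what produces interpretable profiles such as "PC gamer, low cybersickness".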

  12. Technique for continuous high-resolution analysis of trace substances in firn and ice cores

    Energy Technology Data Exchange (ETDEWEB)

    Roethlisberger, R.; Bigler, M.; Hutterli, M.; Sommer, S.; Stauffer, B.; Junghans, H.G.; Wagenbach, D.

    2000-01-15

    The very successful application of a CFA (continuous flow analysis) system in the GRIP project (Greenland Ice Core Project) for high-resolution ammonium, calcium, hydrogen peroxide, and formaldehyde measurements along a deep ice core led to further development of this analysis technique. The authors included methods for the continuous analysis of sodium, nitrate, sulfate, and electrolytic conductivity, while the existing methods have been improved. The melting device has been optimized to allow the simultaneous analysis of eight components. Furthermore, a new melter was developed for analyzing firn cores. The system has been used in the framework of the European Project for Ice Coring in Antarctica (EPICA) for in-situ analysis of several firn cores from Dronning Maud Land, Antarctica, and for the new ice core drilled at Dome C, Antarctica.

  13. Study of the magnetospheres of active regions on the sun by radio astronomy techniques

    Science.gov (United States)

    Bogod, V. M.; Kal'tman, T. I.; Peterova, N. G.; Yasnov, L. V.

    2017-01-01

    In the 1990s, based on detailed studies of the structure of active regions (AR), the concept of the magnetosphere of the active region was proposed. It includes almost all known structures present in the active region, ranging from radio granulation up to noise storms, whose radiation manifests itself at radio wavelengths. The magnetosphere concept, which considers the manifestations of the radio emission of the active region from a common point of view, as a single active complex, allows one to shed light on the relation between stable and active processes and their interrelations. It is especially important to identify the basic ways in which nonthermal energy is transformed into thermal energy. A dominant role in all processes is attributed to the magnetic field, the measurement of which at coronal levels can be performed by radio-astronomical techniques. The extension of the wavelength range, the introduction of new tools, and advanced modeling capabilities make it possible to analyze the physical properties of plasma structures in the AR magnetosphere and to evaluate the coronal magnetic fields at the levels of the chromosphere-corona transition zone and the lower corona. The features and characteristics of the transition region from the S component to the B component have been estimated.

  14. Induced modifications on algae photosynthetic activity monitored by pump-and-probe technique

    Energy Technology Data Exchange (ETDEWEB)

    Barbini, R.; Colao, F.; Fantoni, R.; Palucci, A.; Ribezzo, S. [ENEA, Centro Ricerche Frascati, Rome (Italy). Dip. Innovazione; Tarzillo, G.; Carlozzi, P.; Pelosi, E. [CNR, Florence (Italy). Centro Studi Microorganismi Autotrofi

    1995-12-01

    The lidar fluorosensor system available at ENEA Frascati has been used for a series of laboratory measurements on brackish-water and marine phytoplankton grown in the laboratory in appropriate saline solutions. The system, already used to measure the laser-induced fluorescence spectra of different algae species and their detection limits, has been upgraded with a short-pulse Nd:YAG laser and rearranged to test a new technique based on laser pump-and-probe excitation. Results of this new technique for remote monitoring of the in-vivo photosynthetic activity are presented, as measured during a field campaign carried out in Florence in autumn 1993, during which the effects of an actinic saturating light and of different chemicals were also checked.

  15. Low activation brazing materials and techniques for SiC f/SiC composites

    Science.gov (United States)

    Riccardi, B.; Nannetti, C. A.; Petrisor, T.; Sacchetti, M.

    2002-12-01

    A low activation brazing technique for silicon carbide fiber reinforced silicon carbide matrix composites (SiC f/SiC) is presented; the technique is based on the use of the 78Si-22Ti (wt%) eutectic alloy. The joints obtained benefit from a melting point low enough to avoid degradation of the composite fibre interface. All the joints showed an absence of discontinuities and defects at the interface and a fine eutectic structure. Moreover, the joint layer adhered well both to the matrix and to the fibre interphase, and the infiltration of the brazing alloy appeared sufficiently controlled. The joints of SiC f/SiC composites showed an almost pure shear strength of 71±10 MPa at room temperature and up to 70 MPa at 600 °C.

  16. Comparison of Selective Culturing and Biochemical Techniques for Measuring Biological Activity in Geothermal Process Fluids

    Energy Technology Data Exchange (ETDEWEB)

    Pryfogle, Peter Albert

    2000-09-01

    For the past three years, scientists at the Idaho National Engineering and Environmental Laboratory have been conducting studies aimed at determining the presence and influence of bacteria found in geothermal plant cooling water systems. In particular, the efforts have been directed at understanding the conditions that lead to the growth and accumulation of biomass within these systems, which reduce operational and thermal efficiency. Initially, the methods selected were based upon current industry practice and included the collection of water quality parameters, the measurement of soluble carbon, and the use of selective media for determining the number density of various types of organisms. This data has been collected on a seasonal basis at six different facilities located at The Geysers in Northern California. While this data is valuable in establishing biological growth trends in the facilities and providing an initial determination of upset or off-normal conditions, more detailed information about the biological activity is needed to determine what is triggering or sustaining the growth in these facilities, in order to develop improved monitoring and treatment techniques. In recent years, new biochemical approaches, based upon the analysis of phospholipid fatty acids and DNA recovered from environmental samples, have been developed and commercialized. These techniques, in addition to allowing the determination of the quantity of biomass, also provide information on the community composition and the nutritional status of the organisms. During the past year, samples collected from the condenser effluents of four of the plants at The Geysers were analyzed using these methods and the results were compared with those obtained from selective culturing techniques. The purpose of this effort was to evaluate the cost-benefit of implementing these techniques for tracking microbial activity in the plant study, in place of the selective culturing techniques.

  17. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  18. The Delphi Technique in nursing research - Part 3: Data Analysis and Reporting

    Directory of Open Access Journals (Sweden)

    Dimitrios Kosmidis

    2013-07-01

    Full Text Available The Delphi technique is a research method with a multitude of literature regarding its application, yet there is limited guidance on methods of analysis and the presentation of results. Aim: To describe and critically analyze the main methods of qualitative and quantitative data analysis in studies using the Delphi technique. Materials and methods: The literature search included research and review articles of nursing interest within the following databases: IATROTEK, Medline, Cinahl and Scopus, from 2001 to 2011. The key-words used were Delphi technique, nursing and research methodology, in English and Greek. Results: The literature search revealed 285 articles of nursing interest (266 research articles and 19 reviews). Data analysis in surveys using the Delphi technique initially involves a qualitative analysis of the experts' views, which are gathered during the first round. Subsequently, various statistical analysis methods are employed in order to estimate the final level of consensus over the following rounds (iterations). Prescribing the desired degree of consensus is usually based on subjective assumptions, while the final identification is done mainly via descriptive and inductive statistical measures. In the presentation of results, simple tables are mainly used, based on descriptive data, statistical criteria or scatter charts, in order to illustrate the experts' opinions. Conclusions: The Delphi technique has infiltrated nursing research with great variability in data analysis methodology and presentation of results, depending on each study's aims and characteristics.
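    As an illustration of the quantitative side, one common descriptive approach computes the median, interquartile range, and percentage agreement per item. The thresholds below are illustrative assumptions only, since, as the review notes, studies choose their own cut-offs:

    ```python
    import statistics

    def delphi_consensus(ratings, agree_threshold=4, iqr_max=1.0, pct_min=0.75):
        """Descriptive consensus criteria for one Delphi item rated on a
        1-5 Likert scale: median, interquartile range (IQR), and the
        share of experts rating at or above `agree_threshold`.

        The cut-offs (IQR <= 1, >= 75% agreement) are illustrative.
        """
        q1, med, q3 = statistics.quantiles(ratings, n=4)
        pct_agree = sum(r >= agree_threshold for r in ratings) / len(ratings)
        reached = (q3 - q1) <= iqr_max and pct_agree >= pct_min
        return {"median": med, "iqr": q3 - q1,
                "pct_agree": pct_agree, "consensus": reached}

    # eight hypothetical expert ratings of one questionnaire item
    print(delphi_consensus([4, 5, 4, 4, 5, 3, 4, 5]))
    ```

    Items failing the consensus test are fed back to the panel in the next round, which is the iteration loop the abstract describes.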

  19. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial at different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  20. Probability-Based Diagnostic Imaging Technique Using Error Functions for Active Structural Health Monitoring

    Directory of Open Access Journals (Sweden)

    Rahim Gorgin

    2014-07-01

    Full Text Available This study presents a novel probability-based diagnostic imaging (PDI) technique using error functions for active structural health monitoring (SHM). To achieve this, the changes between baseline and current signals of each sensing path are first measured, and by taking the root mean square of these changes, the energy of the scattered signal at different times can be calculated. Then, for different pairs of signal acquisition paths, an error function based on the energy of the scattered signals is introduced. Finally, the resultant error functions are fused into the final estimate of the probability of damage presence in the monitoring area. The developed methods were applied to various damage identification cases, including cracks located in regions within an active sensor network with different configurations (pulse-echo and pitch-catch), and holes located in regions outside the active sensor network with a pitch-catch configuration. The results, obtained using experimental Lamb wave signals at different central frequencies, corroborated that the developed PDI technique using error functions is capable of monitoring structural damage regardless of its shape, size and location. The developed method does not need direct interpretation of overlaid and dispersed Lamb wave components for damage identification and can monitor damage located anywhere in the structure. These advantages qualify the presented PDI method for online structural health monitoring.
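    The first two steps translate directly into code: a per-path damage index from the RMS energy of the scattered (current minus baseline) signal, then fusion of the path indices into a probability map over the monitored area. The sketch below uses a simple elliptical spatial distribution as the weighting; the paper's specific error functions differ, so treat this as an assumed stand-in:

    ```python
    import numpy as np

    def damage_index(baseline, current):
        """RMS energy of the scattered signal for one sensing path."""
        d = np.asarray(current, float) - np.asarray(baseline, float)
        return float(np.sqrt(np.mean(d ** 2)))

    def probability_map(paths, grid, beta=1.05):
        """Fuse per-path indices into a normalized damage-probability map.

        paths: list of (tx_xy, rx_xy, index); grid: list of (x, y) points.
        Each path spreads its index over points whose tx-point-rx detour
        ratio is below beta (an elliptical zone around the path)."""
        img = np.zeros(len(grid))
        for tx, rx, idx in paths:
            tx, rx = np.asarray(tx, float), np.asarray(rx, float)
            d_path = np.linalg.norm(rx - tx)
            for i, p in enumerate(grid):
                p = np.asarray(p, float)
                r = (np.linalg.norm(p - tx) + np.linalg.norm(p - rx)) / d_path
                img[i] += idx * max(0.0, (beta - r) / (beta - 1.0))
        return img / img.max() if img.max() > 0 else img

    # two crossed sensing paths over a unit plate; hypothetical signals
    t = np.linspace(0, 1, 200)
    base = np.sin(40 * t)
    scattered = base + 0.1 * np.sin(55 * t)  # damage-modified signal
    idx = damage_index(base, scattered)
    img = probability_map([((0, 0), (1, 1), idx), ((0, 1), (1, 0), idx)],
                          [(0.5, 0.5), (0.1, 0.9), (0.0, 0.5)])
    print(img)  # highest at the path crossing (0.5, 0.5)
    ```

    The grid point where both paths intersect accumulates both indices and dominates the map (here img is about [1.0, 0.5, 0.0]), which is the geometric intuition behind fusing per-path error functions.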

  1. Designed nanostructured pt film for electrocatalytic activities by underpotential deposition combined chemical replacement techniques.

    Science.gov (United States)

    Huang, Minghua; Jin, Yongdong; Jiang, Heqing; Sun, Xuping; Chen, Hongjun; Liu, Baifeng; Wang, Erkang; Dong, Shaojun

    2005-08-18

    Multiple-deposited Pt overlayer modified Pt nanoparticle (MD-Pt overlayer/PtNPs) films were deliberately constructed on glassy carbon electrodes through multiple alternating cycles of underpotential deposition (UPD) of Ag followed by redox replacement with Pt(II) cations. Linear and regular growth of the films was observed by cyclic voltammetry. Atomic force microscopy (AFM) provides the surface morphology of the nanostructured Pt films. Rotating disk electrode (RDE) voltammetry and rotating ring-disk electrode (RRDE) voltammetry demonstrate that the MD-Pt overlayer/PtNPs films can catalyze an almost four-electron reduction of O2 to H2O in air-saturated 0.1 M H2SO4. The thus-prepared Pt films behave as novel nanostructured electrocatalysts for dioxygen reduction and the hydrogen evolution reaction (HER) with enhanced electrocatalytic activities, in terms of both reduction peak potential and peak current, compared with a bulk polycrystalline Pt electrode. Additionally, after multiple replacement cycles the electrocatalytic activities improved remarkably, although the added amount of Pt is very low in comparison with that of the pre-modified PtNPs, owing to the intrinsic features of the UPD-redox replacement technique. In other words, the electrocatalytic activities could be improved markedly without using much Pt, by tailoring the catalytic surface. These features may provide an interesting way to produce Pt catalysts with reliable catalytic performance as well as reduced cost.

  2. THE 'HYBRID' TECHNIQUE FOR RISK ANALYSIS OF SOME DISEASES

    Institute of Scientific and Technical Information of China (English)

    SHANGHANJI; LUYUCHU; XUXUEMEI; CHENQIAN

    2001-01-01

    Based on the data obtained from a survey recently made in Shanghai, this paper presents the hybrid technique for risk analysis and evaluation of some diseases. After determination of the main risk factors of these diseases by analysis of variance, the authors introduce a new concept, 'Illness Fuzzy Set', and use fuzzy comprehensive evaluation to evaluate the risk of suffering from a disease for residents. An optimization technique is used to determine the weights wi in the fuzzy comprehensive evaluation, and a new method, 'Improved Information Distribution', is also introduced for the treatment of the small-sample problem. It is shown that the results obtained by using the hybrid technique are better than those obtained using a single fuzzy technique or a single statistical method.
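
    Fuzzy comprehensive evaluation of the kind described can be sketched as a weighted composition of a factor-weight vector with a membership matrix. The weighted-average M(*, +) operator used here is one common choice; the abstract does not specify the paper's exact operator:

```python
import numpy as np

def fuzzy_comprehensive_evaluation(weights, membership):
    """Fuzzy comprehensive evaluation b = w o R with the M(*, +) operator.

    weights: (m,) factor weights wi, summing to 1.
    membership: (m, n) matrix R; R[i, j] is the degree of membership of
    risk factor i in risk grade j.
    Returns a normalized (n,) fuzzy risk distribution over the grades.
    """
    w = np.asarray(weights, float)
    R = np.asarray(membership, float)
    b = w @ R                 # weighted-average composition
    return b / b.sum()        # normalize to a fuzzy risk distribution
```

    The resident is then typically assigned the risk grade with the maximum membership degree in the resulting vector.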

  4. DOCTRINAL BASICS OF THE LEGAL TECHNIQUE: COMPARATIVE ANALYSIS WITHIN THE EUROPEAN LEGAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alexander Malko

    2014-07-01

    Full Text Available The legal technique was initially developed as a kind of “interpreter” of the legislative will into legal language, using the specific ingenuity of legal engineering. Historically, the theoretical base of the legal technique was formed on a phased basis, essentially stimulated by state reforms, social transformations, and active legislation systematization. It should be mentioned here that the legal technique is a distinctive category reflecting the political, economic, and legal situation in the historical period of a certain state's development, while being extra-national in itself. The harmonization of legal technique resources within the European legal framework involves norm-setting regulations, coordination, and the elaboration of common recommendations for the European countries. Cooperation in harmonizing legal technique standards will raise all-European cooperation to a new level as far as legal standards, human rights, democratic development, legitimacy and cultural cooperation are concerned.

  5. Design and Performance Analysis of Various Adders and Multipliers Using GDI Technique

    OpenAIRE

    Simran kaur; Balwinder Singh; Jain, D.K.

    2015-01-01

    With the active development of portable electronic devices, the need for low power dissipation, high speed and compact implementation gives rise to several research intentions. There are several design techniques used for circuit configuration in VLSI systems, but there are very few design techniques that give the required extensibility. This paper describes the implementation of various adders and multipliers. The design approach proposed in the article is based on the GDI (G...

  6. Análise biomecânica e histológica de tendões flexores reparados em coelhos usando três técnicas de sutura (quatro e seis passadas com mobilização ativa precoce Biomechanics and histological analysis in rabbit flexor tendons repaired using three suture techniques (four and six strands with early active mobilization

    Directory of Open Access Journals (Sweden)

    Antônio Lourenço Severo

    2012-02-01

    mobilization. METHODS: the right calcaneal tendons of 36 rabbits of the New Zealand breed were used in the analysis. This sample presents similar size to human flexor tendon that has approximately 4.5 mm (varying from 2mm. The selected sample showed the same mass (2.5 to 3kg and were male or female adults (from 8 ½ months. RESULTS: in the biomechanical analysis, there was no statistically significant difference (p>0.01. There was no statistical difference in relation to surgical time in all three suture techniques (p>0.01. With the early active mobility, there was qualitative and quantitative evidence of thickening of collagen in 38.9% on the 15th day and in 66.7% on the 30th day, making the biological tissue stronger and more resistant (p=0.095. CONCLUSION: this study demonstrated that there was no histological difference between the results achieved with an inside or outside end knot with respect to the repaired tendon and the number of strands did not affect healing, vascularization or sliding of the tendon in the osteofibrous tunnel, which are associated with early active mobility, with the repair techniques applied.

  7. A Comparative Analysis of Techniques for PAPR Reduction of OFDM Signals

    Directory of Open Access Journals (Sweden)

    M. Janjić

    2014-06-01

    Full Text Available In this paper the problem of high Peak-to-Average Power Ratio (PAPR) in Orthogonal Frequency-Division Multiplexing (OFDM) signals is studied. Besides describing three techniques for PAPR reduction, SeLective Mapping (SLM), Partial Transmit Sequence (PTS) and Interleaving, a detailed analysis of the performance of these techniques for various values of the relevant parameters (number of phase sequences, number of interleavers, number of phase factors and number of subblocks, depending on the applied technique) is carried out. Simulation of these techniques is run in Matlab software. Results are presented in the form of Complementary Cumulative Distribution Function (CCDF) curves for the PAPR of 30000 randomly generated OFDM symbols. Simulations are performed for OFDM signals with 32 and 256 subcarriers, oversampled by a factor of 4. A detailed comparison of these techniques is made based on the Matlab simulation results.
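
    The PAPR metric and the SLM idea can be illustrated in a short simulation. The ±1 phase alphabet and eight candidate sequences below are illustrative parameter choices, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a time-domain signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def ofdm_symbol(symbols, oversample=4):
    """Time-domain OFDM symbol via zero-padded IFFT (oversampling)."""
    n = len(symbols)
    padded = np.concatenate([symbols[:n // 2],
                             np.zeros((oversample - 1) * n, complex),
                             symbols[n // 2:]])
    return np.fft.ifft(padded)

def slm(symbols, n_sequences=8, oversample=4):
    """SeLective Mapping: try random +/-1 phase sequences on the
    subcarriers and keep the candidate symbol with the lowest PAPR."""
    best = ofdm_symbol(symbols, oversample)
    for _ in range(n_sequences - 1):
        phases = rng.choice([1, -1], size=len(symbols))
        cand = ofdm_symbol(symbols * phases, oversample)
        if papr_db(cand) < papr_db(best):
            best = cand
    return best

# QPSK data on 32 subcarriers, oversampled by 4 as in the paper's setup
data = rng.choice([1+1j, 1-1j, -1+1j, -1-1j], size=32) / np.sqrt(2)
```

    By construction SLM can never do worse than the unmodified symbol, since the original phase sequence is one of the candidates; the CCDF curves in the paper quantify the typical gain over many random symbols.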

  8. Elemental Analysis of Lapis Lazuli sample, using complementary techniques of IBIL and MicroPIXE

    Directory of Open Access Journals (Sweden)

    T Nikbakht

    2015-07-01

    Full Text Available Ion Beam Induced Luminescence (IBIL) is a useful IBA technique which can be utilized to obtain information about the nature of chemical bonds in materials. Regarding the probed area, this non-destructive and fast technique is a suitable complement to MicroPIXE. Since most minerals are luminescent, IBIL is an applicable analytical technique in mineralogy. In this research work, a 2.7 MeV proton beam was utilized to characterize a Lapis lazuli sample. After data collection and analysis of the results obtained from both the IBIL and MicroPIXE techniques, elemental maps of the sample were developed. Comparison of the results with others available in the literature indicates the capability and accuracy of the combination of the two complementary techniques for the characterization of minerals as well as precious historical objects.

  9. Analysis of Far-Field Radiation from Apertures Using Monte Carlo Integration Technique

    Directory of Open Access Journals (Sweden)

    Mohammad Mehdi Fakharian

    2014-12-01

    Full Text Available An integration technique based on the use of Monte Carlo Integration (MCI) is proposed for the analysis of electromagnetic radiation from apertures. The technique that can be applied to the calculation of aperture antenna radiation patterns is the equivalence principle followed by physical optics, which can then be used to compute far-field antenna radiation patterns. However, this technique is often mathematically complex, because it requires integration over a closed surface. This paper presents an extremely simple formulation to calculate the far fields from some types of aperture radiators by using the MCI technique. The accuracy and effectiveness of this technique are demonstrated in three cases of radiation from apertures, and the results are compared with solutions using FE simulation and Gaussian quadrature rules.
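
    The core idea, replacing the aperture radiation integral with a Monte Carlo average over random sample points, can be sketched for a uniform rectangular aperture, where the integral is known analytically (this is a generic illustration, not the paper's formulation):

```python
import numpy as np

def far_field_mc(k_sin_theta, a=1.0, b=1.0, n_samples=200_000, seed=1):
    """Monte Carlo estimate of the radiation integral of a uniform
    rectangular aperture (a x b) in the principal plane:

        E = integral over aperture of exp(j * k * sin(theta) * x) dx dy

    The integral is approximated by (area) * mean of the integrand at
    uniformly random sample points.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-a / 2, a / 2, n_samples)
    # in this cut the integrand does not depend on y, so the y-integral
    # contributes just the factor b
    return b * a * np.mean(np.exp(1j * k_sin_theta * x))

def far_field_exact(k_sin_theta, a=1.0, b=1.0):
    """Closed-form result: a*b*sin(u)/u with u = k*sin(theta)*a/2."""
    u = k_sin_theta * a / 2
    return a * b * (np.sin(u) / u if u != 0 else 1.0)
```

    The Monte Carlo error shrinks as 1/sqrt(N) regardless of the aperture shape, which is what makes the approach attractive for irregular apertures where quadrature rules become awkward.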

  10. Fault detection in digital and analog circuits using an i(DD) temporal analysis technique

    Science.gov (United States)

    Beasley, J.; Magallanes, D.; Vridhagiri, A.; Ramamurthy, Hema; Deyong, Mark

    1993-01-01

    An i(sub DD) temporal analysis technique which is used to detect defects (faults) and fabrication variations in both digital and analog ICs by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents is presented. A simple bias voltage is required for all the inputs to excite the defects. Data from hardware tests supporting this technique are presented.

  11. The use of selected theatre rehearsal technique activities with African-American adolescents labeled "behavior disordered".

    Science.gov (United States)

    Anderson, M G

    1992-01-01

    The extensive literature on the overrepresentation of adolescent African-American male learners in classes for students identified as behavior disordered has essentially not addressed the problems caused by teacher reactions to adolescent conversational language use, the qualitative differences in language choices, or the impact of the conversational choices of adolescents on their educational treatment. This article explores how the dramaturgical perspective of selected Theatre Rehearsal Technique (TRT) activities can be used as learning experiences in communication with this student population. If these students gain quantifiable success in their social communication interactions, reassessment of their special education placement might facilitate their entrance into less restrictive educational environments.

  12. Low Temperature Irradiation Applied to Neutron Activation Analysis of Mercury In Human Whole Blood

    Energy Technology Data Exchange (ETDEWEB)

    Brune, D.

    1966-02-15

    The distribution of mercury in human whole blood has been studied by means of neutron activation analysis. During the irradiation procedure the samples were kept at low temperature by freezing them in a cooling device in order to prevent interferences caused by volatilization and contamination. The mercury activity was separated by means of distillation and ion exchange techniques.

  13. A novel integrated active capping technique for the remediation of nitrobenzene-contaminated sediment.

    Science.gov (United States)

    Sun, Hongwen; Xu, Xiaoyang; Gao, Guandao; Zhang, Zizhong; Yin, Peijie

    2010-10-15

    The objective of this study was to develop a novel integrated active capping system and to investigate its efficiency in the remediation of nitrobenzene-contaminated sediment. An integrated Fe(0)-sorbent-microorganism remediation system was proposed as an in situ active capping technique to remediate nitrobenzene-contaminated sediment. In this system, nitrobenzene is reduced by Fe(0) to aniline, which has much better biodegradability. The sorption capacity and structural properties of cinder were measured to examine its applicability as the sorbent and matrix for this integrated capping system. Indigenous microorganisms from Songhuajiang River sediment, which was contaminated by nitrobenzene and aniline in the Jilin chemical plant explosion in China, were acquired one month after the explosion and used in this active capping system to degrade nitrobenzene and its reduction product, aniline. A bench-scale remediation experiment was conducted on a mimicked nitrobenzene-contaminated sediment to investigate the efficiency of the integrated capping system and the synergistic effects of the combined components in the active capping system. The results show that this integrated active capping system can effectively block the release of the target pollutants into the upper-layer water and remove the compounds from the environment.

  14. Refolding Techniques for Recovering Biologically Active Recombinant Proteins from Inclusion Bodies

    Directory of Open Access Journals (Sweden)

    Hiroshi Yamaguchi

    2014-02-01

    Full Text Available Biologically active proteins are useful for studying the biological functions of genes and for the development of therapeutic drugs and biomaterials in the biotechnology industry. Overexpression of recombinant proteins in bacteria, such as Escherichia coli, often results in the formation of inclusion bodies, which are protein aggregates with non-native conformations. As inclusion bodies contain relatively pure and intact proteins, protein refolding is an important process for obtaining active recombinant proteins from inclusion bodies. However, conventional refolding methods, such as dialysis and dilution, are time consuming; often the recovered yields of active proteins are low, and a trial-and-error process is required to achieve success. Recently, several approaches have been reported to refold these aggregated proteins into an active form. The strategies largely aim at reducing protein aggregation during the refolding procedure. This review focuses on protein refolding techniques using chemical additives and laminar flow in microfluidic chips for the efficient recovery of active proteins from inclusion bodies.

  15. Enhancing Student Self-Study Attitude and Activity with Motivational Techniques

    Directory of Open Access Journals (Sweden)

    Kent Rhoads

    2013-09-01

    Full Text Available Research has shown that students will exhibit a positive attitude towards self-study, but that they will often fail to complete self-study activities. The purpose of this paper is to investigate positive instructor interactions and motivation of students to complete self- study activities and students’ attitudes towards self-study. Six English instructors at the University of Shizuoka created a one-semester self-access study log for use in the university self-access language laboratory in order to find out how many students would complete the log. One of the six instructors applied motivational techniques in the classroom in an effort to engender greater student self-study. Later a questionnaire was administered to 465 student participants to determine their self-study attitudes and activities. The data collected from the questionnaire and the high participation in the self- study activities suggest the positive impact the motivational actions employed by the instructor had on his students' attitudes towards self-study activities.

  16. Development of Distinction Method of Production Area of Ginsengs by Using a Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chung, Yong Sam; Sun, Gwang Min; Lee, Yu Na; Yoo, Sang Ho [KAERI, Daejeon (Korea, Republic of)

    2010-05-15

    Distinction of the production area of Korean ginsengs has been attempted by using neutron activation techniques such as instrumental neutron activation analysis (INAA) and prompt gamma activation analysis (PGAA). The distribution of elements varies according to the part of the plant, due to differences in enrichment effects and influence from the soil where the plants have grown, so the correlation between plants and soil has been an issue. In this study, the distribution of trace elements within a Korean ginseng was investigated by using instrumental neutron activation analysis.

  18. Flow analysis techniques as effective tools for the improved environmental analysis of organic compounds expressed as total indices.

    Science.gov (United States)

    Maya, Fernando; Estela, José Manuel; Cerdà, Víctor

    2010-04-15

    The scope of this work is to provide an overview of the current state of the art in flow analysis techniques applied to the environmental determination of organic compounds expressed as total indices. Flow analysis techniques are proposed as effective tools for quickly obtaining preliminary chemical information about the occurrence of organic compounds in the environment prior to the use of more complex, time-consuming and expensive instrumental techniques. Recently improved flow-based methodologies for the determination of chemical oxygen demand, halogenated organic compounds and phenols are presented and discussed in detail. The aim of the present work is to demonstrate the value of flow-based techniques as vanguard tools in the determination of organic compounds in environmental water samples.

  19. Relationships between eigen and complex network techniques for the statistical analysis of climate data

    CERN Document Server

    Donges, Jonathan F; Loew, Alexander; Marwan, Norbert; Kurths, Jürgen

    2013-01-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP) analysis have frequently been used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships between the eigen and network approaches are derived and illustrated using exemplary data sets. These results show that climate network analysis can complement classical eigen techniques and provides substantial additional information on the higher-order structure of statistical interrelationships in climatological data sets. Hence, climate networks are a valuable su...
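
    A minimal sketch of the classical EOF side of this comparison, computed via singular value decomposition of the anomaly matrix (a standard formulation, not code from the study):

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """Empirical orthogonal function (EOF) analysis of a (time, space)
    data matrix via singular value decomposition.

    Returns the leading spatial patterns (EOFs), their principal-component
    time series, and the fraction of total variance each mode explains.
    """
    anomalies = field - field.mean(axis=0)           # remove the time mean
    U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    eofs = Vt[:n_modes]                     # (n_modes, space) patterns
    pcs = U[:, :n_modes] * s[:n_modes]      # (time, n_modes) time series
    return eofs, pcs, explained[:n_modes]
```

    Climate network analysis starts from the same correlation matrix but thresholds it into an adjacency matrix, which is why the two approaches can be related formally as the study describes.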

  20. THE GAME TECHNIQUE STIMULATING LEARNING ACTIVITY OF JUNIOR STUDENTS SPECIALIZING IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Juri S. Ezrokh

    2014-01-01

    Full Text Available The research is aimed at specifying and developing a modern system for monitoring the current academic achievements of junior university students; the main task is to find adequate ways of stimulating junior students’ learning activities and estimating their individual achievements. Methods: The author applies his own assessment method for estimating and stimulating students’ learning outcomes, based on a rating-point system in which gradually obtained points build up a student’s integrated learning outcome. Results: The research findings prove that implementation of the given method can increase the motivational, multiplicative and controlling components of the learning process. Scientific novelty: The method in question is based on a new, original game approach to controlling procedures and the stimulation of learning motivation of economics students. Practical significance: The recommended technique can intensify incentive-based training activities both in and outside the classroom, thereby developing students’ professional and personal qualities.

  1. Analyzing Activity Behavior and Movement in a Naturalistic Environment Using Smart Home Techniques.

    Science.gov (United States)

    Cook, Diane J; Schmitter-Edgecombe, Maureen; Dawadi, Prafulla

    2015-11-01

    One of the many services that intelligent systems can provide is the ability to analyze the impact of different medical conditions on daily behavior. In this study, we use smart home and wearable sensors to collect data while older adults (n = 84) perform complex activities of daily living. We analyze the data using machine learning techniques and reveal that differences between healthy older adults and adults with Parkinson disease not only exist in their activity patterns, but that these differences can be automatically recognized. Our machine learning classifiers reach an accuracy of 0.97 with an area under the ROC curve value of 0.97 in distinguishing these groups. Our permutation-based testing confirms that the sensor-based differences between these groups are statistically significant.
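
    The reported area under the ROC curve can be computed from classifier scores with the rank-sum formulation; this is a generic sketch of the metric, not the study's pipeline:

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank-sum statistic:
    the probability that a randomly chosen positive example is scored
    higher than a randomly chosen negative one (ties count as half).
    """
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))
```

    An AUC of 0.97, as reported in the study, means a randomly chosen participant with Parkinson disease would outrank a randomly chosen healthy participant 97% of the time under the classifier's score.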

  2. The active layer morphology of organic solar cells probed with grazing incidence scattering techniques.

    Science.gov (United States)

    Müller-Buschbaum, Peter

    2014-12-10

    Grazing incidence X-ray scattering (GIXS) provides unique insights into the morphology of active materials and thin film layers used in organic photovoltaic devices. With grazing incidence wide angle X-ray scattering (GIWAXS) the molecular arrangement of the material is probed. GIWAXS is sensitive to the crystalline parts and allows for the determination of the crystal structure and the orientation of the crystalline regions with respect to the electrodes. With grazing incidence small angle X-ray scattering (GISAXS) the nano-scale structure inside the films is probed. As GISAXS is sensitive to length scales from nanometers to several hundred nanometers, all relevant length scales of organic solar cells are detectable. After an introduction to GISAXS and GIWAXS, selected examples for application of both techniques to active layer materials are reviewed. The particular focus is on conjugated polymers, such as poly(3-hexylthiophene) (P3HT).

  3. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pannek, Kerstin [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, School of Medicine, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); Guzzetta, Andrea [IRCCS Stella Maris, Department of Developmental Neuroscience, Calambrone Pisa (Italy); Colditz, Paul B. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Perinatal Research Centre, Brisbane (Australia); Rose, Stephen E. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); University of Queensland Centre for Clinical Research, Royal Brisbane and Women' s Hospital, Brisbane (Australia)

    2012-10-15

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. (orig.)
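
    One of the anisotropy metrics commonly analyzed in such pipelines (e.g., in TBSS), fractional anisotropy, is a simple function of the diffusion-tensor eigenvalues; this is the standard formula, not something specific to this review:

```python
import numpy as np

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy (FA) from the three eigenvalues of the
    diffusion tensor:

        FA = sqrt(3/2) * ||lambda - MD|| / ||lambda||

    where MD is the mean diffusivity.  FA is 0 for isotropic diffusion
    and approaches 1 for diffusion restricted to a single direction.
    """
    lam = np.array([l1, l2, l3], float)
    md = lam.mean()
    num = np.sqrt(((lam - md) ** 2).sum())
    den = np.sqrt((lam ** 2).sum())
    return np.sqrt(1.5) * num / den
```
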

  4. Survival of the Fittest: An Active Queue Management Technique for Noisy Packet Flows

    Directory of Open Access Journals (Sweden)

    Shirish S. Karande

    2007-01-01

    Full Text Available We present a novel active queue management (AQM) technique to demonstrate the efficacy of practically harnessing the predictive utility of SSR indications for improved video communication. We consider a network within which corrupted packets are relayed over multiple hops, but a certain percentage of packets needs to be dropped at an intermediate node due to congestion. We propose an AQM technique, survival of the fittest (SOTF), to be employed at the relay node, within which we use packet state information, available from SSR indications and checksums, to drop the packets with the highest corruption levels. On the basis of actual 802.11b measurements we show that such side-information (SI) aware processing within the network can provide significant performance benefits over an SI-unaware scheme, random queue management (RQM), which is forced to randomly discard packets. With trace-based simulations, we show the utility of the proposed AQM technique in improving the error recovery performance of cross-layer FEC schemes. Finally, with the help of H.264-based video simulations these improvements are shown to translate into a significant improvement in video quality.
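
    The SOTF dropping policy, discarding the most corrupted packets first when the queue overflows, can be sketched as follows; the packet representation as (corruption_level, payload) tuples is illustrative, not the paper's data structure:

```python
import heapq

def sotf_drop(queue, capacity):
    """Survival of the fittest: when the queue exceeds `capacity`,
    discard the packets with the highest corruption levels.

    queue: list of (corruption_level, payload) tuples, where the
    corruption level would come from SSR indications and checksums.
    Returns (kept, dropped).
    """
    if len(queue) <= capacity:
        return list(queue), []
    # keep the `capacity` least-corrupted packets ("the fittest")
    kept = heapq.nsmallest(capacity, queue, key=lambda pkt: pkt[0])
    kept_ids = {id(pkt) for pkt in kept}
    dropped = [pkt for pkt in queue if id(pkt) not in kept_ids]
    return kept, dropped
```

    The SI-unaware baseline (RQM) would instead pick the dropped packets uniformly at random, which is why it forwards more heavily corrupted data on average.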

  5. Diametral tensile strength of composite resins submitted to different activation techniques.

    Science.gov (United States)

    Casselli, Denise Sá Maia; Worschech, Claudia Cia; Paulillo, Luis Alexandre Maffei Sartini; Dias, Carlos Tadeu Dos Santos

    2006-01-01

    The aim of this study was to evaluate the diametral tensile strength (DTS) of composite resins submitted to different curing techniques. Four composite resins were tested in this study: Targis (Ivoclar), Solidex (Shofu), Charisma (Heraeus-Kulzer) and Filtek Z250 (3M Espe). Sixty-four cylindrical specimens were prepared and divided into eight groups according to each polymerization technique (n = 8). The indirect composite resins (Targis and Solidex) were polymerized with their respective curing systems (Targis Power and EDG-lux); Charisma and Filtek Z250 were light-cured with conventional polymerization (halogen light) and, additionally, with post-curing systems. Specimens were stored in artificial saliva at 37 degrees C for one week. DTS tests were performed in a universal testing machine (0.5 mm/min). The data were statistically analyzed by ANOVA and Duncan tests. The results were (MPa): Z250/EDG-lux: 69.04a; Z250/Targis Power: 68.57a; Z250/conventional polymerization: 60.75b; Charisma/Targis Power: 52.34c; Charisma/conventional polymerization: 49.17c; Charisma/EDG-lux: 47.98c; Solidex: 36.62d; Targis: 32.86d (groups sharing a letter do not differ significantly). The results reveal that the post-cured Z250 composite resin showed the highest DTS means. Charisma composite presented no significant differences when activation techniques were compared. Direct composite resins presented higher DTS values than indirect resins.
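
    DTS values like those above follow from the standard diametral (Brazilian) compression-test formula DTS = 2P/(πDt). The load and specimen dimensions below are illustrative, not the study's data:

```python
import math

def diametral_tensile_strength(load_n, diameter_mm, thickness_mm):
    """Diametral tensile strength in MPa from the standard Brazilian-test
    formula DTS = 2P / (pi * D * t), with the fracture load P in newtons
    and the cylindrical specimen's diameter D and thickness t in mm
    (N/mm^2 == MPa).
    """
    return 2 * load_n / (math.pi * diameter_mm * thickness_mm)

# Illustrative example: a 4 mm x 8 mm cylinder failing at 1000 N
dts = diametral_tensile_strength(1000.0, 4.0, 8.0)
```
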

  6. Neutron activation analysis applied to nutritional and foodstuff studies

    Energy Technology Data Exchange (ETDEWEB)

    Maihara, Vera A.; Santos, Paola S.; Moura, Patricia L.C.; Castro, Lilian P. de, E-mail: vmaihara@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Avegliano, Roseane P., E-mail: pagliaro@usp.b [Universidade de Sao Paulo (USP), SP (Brazil). Coordenadoria de Assistencia Social. Div. de Alimentacao

    2009-07-01

    Neutron Activation Analysis (NAA) has been successfully used on a regular basis in several areas of nutrition and foodstuffs. NAA has become an important and useful research tool due to the methodology's advantages. These include high accuracy, small sample quantities and no chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition which have been carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP are presented: a Brazilian total diet study of the nutritional element dietary intakes of the Sao Paulo state population; a study of trace elements in maternal milk; and the determination of essential trace elements in some edible mushrooms. (author)

  7. Quantitative neutron capture resonance analysis verified with instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blaauw, M. E-mail: blaauw@iri.tudelft.nl; Postma, H.; Mutti, P

    2003-06-01

    The newly developed elemental analysis technique Neutron Resonance Capture Analysis (NRCA) was verified by analyzing a prehistoric bronze arrowhead with both NRCA and Instrumental Neutron Activation Analysis (INAA). In NRCA, elements are identified through their neutron resonance capture energies, as determined through detection of prompt capture gamma-rays as a function of time of flight. The quantification is obtained from the resonance peak areas. Corrections are required for neutron-energy-dependent dead time and self-shielding, the latter also depending on Doppler broadening. The analysis program REFIT, whose intended use is the determination of resonance parameters, was used to this end. The agreement observed between the INAA and NRCA results indicates that the NRCA results obtained are accurate.

  8. Analysis of a Reflectarray by Using an Iterative Domain Decomposition Technique

    Directory of Open Access Journals (Sweden)

    Carlos Delgado

    2012-01-01

    Full Text Available We present an efficient method for the analysis of different objects that may contain a complex feeding system and a reflector structure. The approach is based on a domain decomposition technique that divides the geometry into several parts to minimize the vast computational resources required when applying a full wave method. This technique is also parallelized by using the Message Passing Interface to minimize the memory and time requirements of the simulation. A reflectarray analysis serves as an example of the proposed approach.

  9. Dictionary learning and sparse recovery for electrodermal activity analysis

    Science.gov (United States)

    Kelsey, Malia; Dallal, Ahmed; Eldeeb, Safaa; Akcakaya, Murat; Kleckner, Ian; Gerard, Christophe; Quigley, Karen S.; Goodwin, Matthew S.

    2016-05-01

    Measures of electrodermal activity (EDA) have advanced research in a wide variety of areas including psychophysiology; however, the majority of this research is typically undertaken in laboratory settings. To extend the ecological validity of laboratory assessments, researchers are taking advantage of advances in wireless biosensors to gather EDA data in ambulatory settings, such as school classrooms. While measuring EDA in naturalistic contexts may enhance ecological validity, it also introduces analytical challenges that current techniques cannot address. One limitation is the limited efficiency and automation of analysis techniques. Many groups either analyze their data by hand, reviewing each individual record, or use computationally inefficient software that limits timely analysis of large data sets. To address this limitation, we developed a method to accurately and automatically identify skin conductance responses (SCRs) using curve fitting methods. Curve fitting has been shown to improve the accuracy of SCR amplitude and location estimation, but has not yet been used to reduce computational complexity. In this paper, sparse recovery and dictionary learning methods are combined to improve the computational efficiency of analysis and decrease run time, while maintaining a high degree of accuracy in detecting SCRs. Here, a dictionary is first created using curve fitting methods for a standard SCR shape. Then, orthogonal matching pursuit (OMP) is used to detect SCRs within a dataset, using the dictionary to complete sparse recovery. Evaluation of our method, including a comparison with existing software for speed and accuracy, showed an accuracy of 80% and a reduced run time.
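
    The OMP step can be sketched in a few lines. The dictionary below is a trivial identity matrix for demonstration only; the paper instead builds its dictionary from curve-fitted SCR shapes:

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Orthogonal matching pursuit: greedily select the dictionary atom
    most correlated with the current residual, then re-fit all selected
    atoms to the signal by least squares.

    D: (n_samples, n_atoms) dictionary with (ideally unit-norm) columns.
    y: (n_samples,) signal to decompose.
    Returns the (n_atoms,) sparse coefficient vector.
    """
    residual = y.copy()
    support = []
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        # atom with the largest absolute correlation to the residual
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        # least-squares re-fit over the whole support so far
        x, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ x
    coeffs[support] = x
    return coeffs
```

    With an SCR-shaped dictionary, each recovered coefficient marks the location and amplitude of one candidate response, which is what makes this far cheaper than fitting every record by hand.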

  10. Dry etching techniques for active devices based on hexagonal boron nitride epilayers

    Energy Technology Data Exchange (ETDEWEB)

    Grenadier, Samuel; Li, Jing; Lin, Jingyu; Jiang, Hongxing [Department of Electrical and Computer Engineering, Texas Tech University, Lubbock, Texas 79409 (United States)

    2013-11-15

    Hexagonal boron nitride (hBN) has emerged as a fundamentally and technologically important material system owing to its unique physical properties including layered structure, wide energy bandgap, large optical absorption, and neutron capture cross section. As for any materials under development, it is necessary to establish device processing techniques to realize active devices based on hBN. The authors report on the advancements in dry etching techniques for active devices based on hBN epilayers via inductively coupled plasma (ICP). The effect of ICP radio frequency (RF) power on the etch rate and vertical side wall profile was studied. The etching depth and angle with respect to the surface were measured using atomic force microscopy showing that an etching rate ∼1.25 μm/min and etching angles >80° were obtained. Profilometer data and scanning electron microscope images confirmed these results. This work demonstrates that SF{sub 6} is very suitable for etching hBN epilayers in RF plasma environments and can serve as a guide for future hBN device processing.

  11. IMPROVING STUDENTS’ LOW CLASS PARTICIPATION IN SPEAKING ACTIVITIES BY USING DRAMA TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Erly Wahyuni

    2013-04-01

    Abstract: The teaching of the English language often falls short of fulfilling its goals. Even after years of English teaching, learners do not gain the confidence to use the language in and outside the class. Real communication involves ideas, emotions, feelings, appropriateness and adaptability. The conventional English class hardly gives the learners an opportunity to use language in this manner and develop fluency in it. Thus, the main purpose of the language teaching course, i.e., developing skills in communication, is unfortunately neglected. An attractive alternative is teaching language through drama, because drama provides practical knowledge of the expressive and communicative powers of a language. In other words, it integrates verbal and non-verbal aspects of communication, thus bringing together both mind and body and restoring the balance between the physical and intellectual aspects of learning. Furthermore, it fosters self-awareness (and awareness of others), self-esteem and confidence, and through this, motivation is developed. This article looks at drama techniques and the activities that can motivate students to speak. Keywords: class participation, speaking activities, drama technique

  12. Surface deformation of active volcanic areas retrieved with the SBAS-DInSAR technique: an overview

    Directory of Open Access Journals (Sweden)

    G. Zeni

    2008-06-01

    This paper presents a comprehensive overview of the surface deformation retrieval capability of the Differential Synthetic Aperture Radar Interferometry (DInSAR) algorithm referred to as the Small BAseline Subset (SBAS) technique, in the context of active volcanic areas. In particular, after a brief description of the algorithm, some experiments relevant to three selected case-study areas are presented. First, we concentrate on the application of the SBAS algorithm to a single-orbit scenario, considering a set of SAR data composed of images acquired on descending orbits by the European Remote Sensing (ERS) radar sensors and relevant to the Long Valley caldera (eastern California) area. Subsequently, we address the capability of the SBAS technique in a multiple-orbit context by referring to the Mt. Etna volcano (southern Italy) test site, for which two different ERS data sets, composed of images acquired on both ascending and descending orbits, are available. Finally, we take advantage of the capability of the algorithm to work in a multi-platform scenario by jointly exploiting two different sets of SAR images collected by the ERS and Environment Satellite (ENVISAT) radar sensors in the Campi Flegrei caldera (southern Italy) area. The presented results demonstrate the effectiveness of the algorithm for investigating the deformation field in active volcanic areas and the potential of DInSAR methodologies within a routine surveillance scenario.

  13. Collection and analysis of active particles

    Energy Technology Data Exchange (ETDEWEB)

    DeLong, C.W.

    1950-01-27

    When it became apparent that particles were emanating from the stacks of the separations plants, it became important that the source, size, activity, and composition of the particles be determined in order to evaluate the hazard to persons working in and near the stack areas. The present report gives the results of radiochemical analysis of particles collected by electrostatic precipitation from "B" plant canyon ventilation air, not from the off-gas ventilation line. Of importance is the fact that the particles analyzed consist not only of particles from the ventilation air but also, unavoidably, of rust from the iron manifold used to conduct the gases to the precipitator. This makes a determination of the activity-versus-weight ratio impossible, but should not invalidate the radiochemical data.

  14. LEGO bricks used as chemotactic chambers: evaluation by a computer-assisted image analysis technique.

    Science.gov (United States)

    Azzarà, A; Chimenti, M

    2004-01-01

    One of the main techniques used to explore neutrophil motility employs micropore filters in chemotactic chambers. Many new models have been proposed in order to perform multiple microassays in a rapid, inexpensive and reproducible way. In this work, LEGO bricks were used as chemotactic chambers in the evaluation of neutrophil random motility and chemotaxis, and compared with conventional Boyden chambers in a "time-response" experiment. Neutrophil motility throughout the filters was evaluated by means of an image-processing workstation, in which a dedicated algorithm recognizes and counts the cells in several fields and focal planes throughout the whole filter; correlates counts and depth values; performs a statistical analysis of the data; calculates the true value of neutrophil migration; determines the distribution of cells; and displays the migration pattern. By this method, we found that the distances travelled by the cells in conventional chambers and in LEGO bricks were identical, both in random migration and under chemotactic conditions. Moreover, no interference with the physiological behaviour of neutrophils was detectable. In fact, the kinetics of migration was identical both in random migration (characterized by a Gaussian pattern) and in chemotaxis (characterized by a typical stimulation peak, previously identified by our workstation). In conclusion, LEGO bricks are extremely precise devices. They are simple to use and require only small amounts of chemoattractant solution and cell suspension, each assembly in itself supplying a triplicate test. LEGO bricks are inexpensive, fast and suitable for routine diagnostic activity or for research investigations in any laboratory.

  15. An electromagnetic signals monitoring and analysis wireless platform employing personal digital assistants and pattern analysis techniques

    Science.gov (United States)

    Ninos, K.; Georgiadis, P.; Cavouras, D.; Nomicos, C.

    2010-05-01

    This study presents the design and development of a mobile wireless platform, employing Personal Digital Assistants (PDAs), for monitoring and analysis of seismic events and related electromagnetic (EM) signals. A prototype custom-developed application was deployed on a 3G-enabled PDA that could connect to the FTP server of the Institute of Geodynamics of the National Observatory of Athens and receive and display EM signals at 4 receiver frequencies (3 kHz (E-W, N-S), 10 kHz (E-W, N-S), 41 MHz and 46 MHz). Signals may originate from any one of the 16 field stations located around the Greek territory. Employing continuous recordings of EM signals gathered from January 2003 till December 2007, a Support Vector Machines (SVM)-based classification system was designed to distinguish EM precursor signals within a noisy background. EM signals corresponding to recordings preceding major seismic events (Ms≥5R) were segmented by an experienced scientist, and five features (mean, variance, skewness, kurtosis, and a wavelet-based feature) derived from the EM signals were calculated. These features were used to train the SVM-based classification scheme. The performance of the system was evaluated by the exhaustive search and leave-one-out methods, giving 87.2% overall classification accuracy in correctly identifying EM precursor signals within a noisy background when employing all calculated features. Due to the insufficient processing power of the PDAs, this task was performed on a typical desktop computer. The optimally trained SVM classifier was then integrated into the PDA-based application, rendering the platform capable of discriminating between EM precursor signals and noise. The system's efficiency was evaluated by an expert who reviewed (1) multiple EM signals, up to 18 days prior to corresponding past seismic events, and (2) the possible EM activity of a specific region employing the trained SVM classifier. Additionally, the proposed architecture can form a
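    The feature-extraction-plus-SVM scheme described above can be sketched as follows. This is a hedged stand-in, not the observatory's pipeline: the "noise" and "precursor" segments are synthetic, the wavelet-based feature is omitted, and the kernel and regularization settings are assumptions.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

def features(segment):
    # Four of the paper's five features (the wavelet-based feature is omitted here)
    return [segment.mean(), segment.var(), skew(segment), kurtosis(segment)]

# Hypothetical stand-ins: symmetric "noise" segments vs. skewed, burst-like "precursor" segments
noise = [rng.normal(0, 1, 256) for _ in range(40)]
precursor = [rng.normal(0, 1, 256) + rng.exponential(2, 256) for _ in range(40)]
X = np.array([features(s) for s in noise + precursor])
y = np.array([0] * 40 + [1] * 40)

# Leave-one-out evaluation, as in the study
acc = cross_val_score(SVC(kernel="rbf", C=1.0), X, y, cv=LeaveOneOut()).mean()
print(f"LOO accuracy: {acc:.2f}")
```

    With only low-order statistics as inputs, the classifier stays cheap enough that the trained model (though not the training itself) can run on a resource-limited handheld device.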

  16. New Techniques in Time-Frequency Analysis: Adaptive Band, Ultra-Wide Band and Multi-Rate Signal Processing

    Science.gov (United States)

    2016-03-02

    There are numerous motivations for extending signal processing, and in particular sampling theory, to non-Euclidean spaces, and in particular... The project led to the development of new techniques and theories in the analysis of signals. These techniques and theories were extensions of known techniques -- sampling, Fourier, Gabor and wavelet analysis, and

  17. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.
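    As an illustration of the kind of computer-aided qualitative analysis advocated above, the sketch below locates a coexistence equilibrium of a textbook Rosenzweig-MacArthur predator-prey model (an assumed stand-in, not the paper's ratio-dependent model) and checks its local stability from the eigenvalues of a numerically estimated Jacobian.

```python
import numpy as np
from scipy.optimize import fsolve

# Illustrative Rosenzweig-MacArthur predator-prey model (assumed parameter values)
r, K, a, h, e, m = 1.0, 10.0, 1.0, 0.5, 0.5, 0.4

def rhs(z):
    n, p = z
    f = a * n / (1 + a * h * n)            # Holling type-II functional response
    return np.array([r * n * (1 - n / K) - f * p, e * f * p - m * p])

def jacobian(z, eps=1e-6):
    # Central-difference Jacobian for local stability analysis at an equilibrium
    J = np.zeros((2, 2))
    for j in range(2):
        dz = np.zeros(2); dz[j] = eps
        J[:, j] = (rhs(z + dz) - rhs(z - dz)) / (2 * eps)
    return J

eq = fsolve(rhs, [1.5, 1.5])               # coexistence equilibrium
eigs = np.linalg.eigvals(jacobian(eq))
print(eq, eigs.real)                        # locally stable only if both real parts are negative
```

    Sweeping a parameter such as K and watching the eigenvalues cross the imaginary axis is a simple numerical way to detect the Hopf bifurcations that the dynamical-systems packages in the paper characterize more rigorously.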

  18. Advanced spatio-temporal filtering techniques for photogrammetric image sequence analysis in civil engineering material testing

    Science.gov (United States)

    Liebold, F.; Maas, H.-G.

    2016-01-01

    The paper shows advanced spatial, temporal and spatio-temporal filtering techniques which may be used to reduce noise effects in photogrammetric image sequence analysis tasks and tools. As a practical example, the techniques are validated in a photogrammetric spatio-temporal crack detection and analysis tool applied in load tests in civil engineering material testing. The load test technique is based on monocular image sequences of a test object under varying load conditions. The first image of a sequence is defined as a reference image under zero load, wherein interest points are determined and connected in a triangular irregular network structure. For each epoch, these triangles are compared to the reference image triangles to search for deformations. The result of the feature point tracking and triangle comparison process is a spatio-temporally resolved strain value field, wherein cracks can be detected, located and measured via local discrepancies. The strains can be visualized as a color-coded map. In order to improve the measuring system and to reduce noise, the strain values of each triangle must be treated in a filtering process. The paper shows the results of various filter techniques in the spatial and in the temporal domain as well as spatio-temporal filtering techniques applied to these data. The best results were obtained by a bilateral filter in the spatial domain and by a spatio-temporal EOF (empirical orthogonal function) filtering technique.
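    The EOF filtering mentioned above amounts to a truncated singular value decomposition of the strain field arranged as an epochs × triangles matrix. The sketch below is a minimal illustration on a synthetic field (one smooth deformation mode plus noise, an assumption for demonstration), not the authors' crack-detection tool.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic strain field: n_epochs x n_triangles, one smooth deformation mode + noise
epochs, tris = 60, 200
mode = np.outer(np.linspace(0, 1, epochs) ** 2, rng.normal(0, 1, tris))
noisy = mode + 0.3 * rng.normal(0, 1, (epochs, tris))

def eof_filter(field, n_modes):
    # EOF filtering: truncate the SVD to the leading spatio-temporal modes
    mean = field.mean(axis=0)
    U, s, Vt = np.linalg.svd(field - mean, full_matrices=False)
    return mean + U[:, :n_modes] * s[:n_modes] @ Vt[:n_modes]

filtered = eof_filter(noisy, n_modes=1)
err_before = np.sqrt(np.mean((noisy - mode) ** 2))
err_after = np.sqrt(np.mean((filtered - mode) ** 2))
print(err_before, err_after)  # RMS error drops after EOF truncation
```

    Because the physical deformation is concentrated in a few leading modes while the noise is spread over all of them, discarding the trailing modes suppresses noise without smearing the strain signal in space or time.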

  19. Real-time flight test analysis and display techniques for the X-29A aircraft

    Science.gov (United States)

    Hicks, John W.; Petersen, Kevin L.

    1989-01-01

    The X-29A advanced technology demonstrator flight envelope expansion program and the subsequent flight research phase gave impetus to the development of several innovative real-time analysis and display techniques. These new techniques produced significant improvements in flight test productivity, flight research capabilities, and flight safety. These techniques include real-time measurement and display of in-flight structural loads, dynamic structural mode frequency and damping, flight control system dynamic stability and control response, aeroperformance drag polars, and aircraft specific excess power. Several of these analysis techniques also provided for direct comparisons of flight-measured results with analytical predictions. The aeroperformance technique was made possible by the concurrent development of a new simplified in-flight net thrust computation method. To achieve these levels of on-line flight test analysis, integration of ground and airborne systems was required. The capability of NASA Ames Research Center, Dryden Flight Research Facility's Western Aeronautical Test Range was a key factor to enable implementation of these methods.

  20. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    Science.gov (United States)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
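    The traditional Modal Assurance Criterion indicator that the tool builds on can be sketched directly; the random mode shapes below are hypothetical stand-ins for Finite Element results, and the tool's adaptive tracking and strain/kinetic energy indicators are not reproduced.

```python
import numpy as np

def mac(phi_a, phi_b):
    # Modal Assurance Criterion matrix between two sets of mode shapes (dof x modes)
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0), np.sum(phi_b * phi_b, axis=0))
    return num / den

rng = np.random.default_rng(2)
phi = rng.normal(size=(30, 4))             # reference model: 30 DOFs, 4 modes
perturbed = phi + 0.05 * rng.normal(size=(30, 4))
perturbed = perturbed[:, [1, 0, 2, 3]]     # modes 0 and 1 swapped, as if reordered

M = mac(phi, perturbed)
pairing = np.argmax(M, axis=1)             # track each reference mode to its best match
print(pairing)  # → [1 0 2 3]
```

    Taking the row-wise maximum of the MAC matrix recovers the mode pairing even when the second model's eigensolution returns the modes in a different order, which is the core of any mode-tracking scheme.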

  1. Spatio-temporal analysis of discharge regimes based on hydrograph classification techniques in an agricultural catchment

    Science.gov (United States)

    Chen, Xiaofei; Bloeschl, Guenter; Blaschke, Alfred Paul; Silasari, Rasmiaditya; Exner-Kittridge, Mike

    2016-04-01

    Stream discharge and groundwater hydrographs integrate the spatial and temporal variations of small-scale hydrological responses. Characterizing the discharge response regime of drained farmland is essential for irrigation strategies and hydrologic modeling. For agricultural basins in particular, diurnal hydrographs from drainage discharges have been investigated to infer drainage processes across varying magnitudes. To explore the variability of discharge responses, we developed an objective method to characterize and classify discharge hydrographs based on features of magnitude and time series. Cluster analysis (hierarchical k-means) and principal component analysis techniques are applied to discharge time series and groundwater level hydrographs to analyze their event characteristics, tested on 8 different discharge and 18 groundwater level hydrographs. Reflecting the variability of rainfall activity, system location, discharge regime and pre-event soil moisture condition in the catchment, three main clusters of discharge hydrographs are identified from the test. The results show that: (1) the hydrographs from these drainage discharges had similar shapes but different magnitudes for an individual rainstorm, and the similarity also held for the overland flow discharge and the spring system; (2) within each cluster the similarity of shape persisted, but the rising slopes differed owing to different antecedent wetness conditions and rain accumulation, while the differences in regression slope can be explained by system location and discharge area; and (3) surface water always has a closely proportional relationship with soil moisture throughout the year, while only after soil moisture exceeds a certain threshold does the outflow of tile drainage systems vary in direct proportion to soil moisture and inversely with the groundwater levels.
    Finally, we discussed the potential application of hydrograph classification in a wider range of
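    The classification pipeline (normalize hydrograph magnitude, reduce dimensionality with PCA, cluster with k-means) might be sketched as below. The two response regimes and the gamma-like unit-hydrograph shape are assumptions for illustration, not the catchment's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 100)

def hydrograph(peak, rise):
    # Hypothetical event hydrograph: fast rise to `peak` at t = rise, slower recession
    q = peak * (t / rise) * np.exp(1 - t / rise)
    return q + 0.02 * rng.normal(size=t.size)

# Two assumed response regimes: flashy (fast rise) vs. damped (slow rise), varying magnitude
flashy = [hydrograph(peak, 0.1) for peak in rng.uniform(0.5, 2.0, 10)]
damped = [hydrograph(peak, 0.4) for peak in rng.uniform(0.5, 2.0, 10)]
X = np.array(flashy + damped)

# Normalize magnitude so clustering reflects shape, reduce with PCA, then k-means
Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
scores = PCA(n_components=3).fit_transform(Xn)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)
```

    Normalizing each hydrograph before PCA is the step that makes the clusters reflect response shape (timing of rise and recession) rather than event magnitude, mirroring the magnitude/time-series feature split described in the abstract.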

  2. Comparative scanning electron microscopy evaluation of Canal Brushing technique, sonic activation, and master apical file for the removal of triple antibiotic paste from root canal (in vitro study

    Directory of Open Access Journals (Sweden)

    Deepa Ashoksingh Thakur

    2015-01-01

    Aims: To compare and evaluate the effectiveness of the Canal Brushing technique, sonic activation, and master apical file (MAF) for the removal of triple antibiotic paste (TAP) from the root canal using scanning electron microscopy (SEM). Materials and Methods: Twenty-two single-rooted teeth were instrumented with ProTaper up to size F2 and dressed with TAP. TAP was removed with the Canal Brush technique (Group I, n = 6), sonic activation (EndoActivator) (Group II, n = 6), or MAF (Group III, n = 6). Four teeth served as positive (n = 2) and negative (n = 2) controls. The roots were split in the buccolingual direction and prepared for SEM examination (×1000) at the coronal, middle, and apical thirds. Three examiners evaluated wall cleanliness. Statistical Analysis: Statistical analysis was performed with the Kruskal–Wallis test and the Wilcoxon rank sum test. Results: The difference in cleanliness between the three groups was statistically significant in the cervical region only. In pairwise comparison in the cervical region, Canal Brush and sonic activation showed more removal of TAP than MAF. Conclusions: The Canal Brush and sonic activation systems showed better results than MAF in the cervical and middle thirds of the canal. In the apical third, none of the techniques showed a better result, and none of the techniques achieved complete removal of TAP from the canal.

  3. Behaviour Change Techniques embedded in health and lifestyle apps: coding and analysis.

    Directory of Open Access Journals (Sweden)

    Gaston Antezana

    2015-09-01

    Background: There is evidence showing that commercially available health and lifestyle apps can be used as co-adjuvants to clinical interventions and for the prevention of chronic and non-communicable diseases. This can be particularly significant for supporting and improving the wellbeing of young people, given their familiarity with these resources. However, it is important to understand the content and consistency of the Behaviour Change Techniques (BCTs) embedded in the apps to maximise their potential benefits. Objectives: This study explores the BCT content of a selected list of health and lifestyle tracking apps in three behavioural dimensions: physical activity, sleep and diet. We identified BCT commonalities within and between categories to detect the most frequently used, and arguably more effective, techniques in the context of wellbeing and the promotion of health behaviours. Methods: Apps were selected by using keywords and by reviewing the "health and fitness" category of GooglePlay (477 apps). The selection criteria included free apps (even if they also offered paid versions) that were common to GooglePlay and the AppStore. A background review of each app was also completed. Selected apps were classified according to user ratings in GooglePlay (apps with less than 4+ star ratings were disregarded). The top ten apps in each category were selected, for a total of 30 in the analysis. Three coders used the apps for two months and were trained to use a comprehensive 93-item taxonomy (BCTv1) to complete the analysis. Results: Strong BCT similarities were found across all three categories, suggesting a consistent basic content composition. Of all 93 BCTs, 8 were identified as present in at least 50% of the apps; 6 of these BCTs are concentrated in the categories "1. Goals and Planning" and "2. Feedback and Monitoring". The BCT "Social support (unspecified)" was coded in 63% of the apps, as it was present through different features in

  4. Comparative Analysis of Various Image Fusion Techniques For Biomedical Images: A Review

    Directory of Open Access Journals (Sweden)

    Nayera Nahvi,

    2014-05-01

    Image fusion is a process of combining the relevant information from a set of images into a single image, wherein the resultant fused image is more informative and complete than any of the input images. This paper discusses the implementation of the discrete wavelet transform (DWT) technique on different images to produce a fused image with greater information content. Because DWT is a more recent technique for image fusion than simple image fusion and pyramid-based image fusion, we adopt DWT as the image fusion technique in this paper. Other methods such as Principal Component Analysis (PCA) based fusion, Intensity Hue Saturation (IHS) transform based fusion and high-pass filtering methods are also discussed. A new algorithm is proposed using the discrete wavelet transform and different fusion rules, including pixel averaging, min-max and max-min methods, for medical image fusion. KEYWORDS:
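    A common DWT-domain fusion rule — average the approximation subband, keep the larger-magnitude detail coefficients — can be sketched with a hand-rolled one-level Haar transform so that no wavelet library is assumed. This is an illustrative variant, not the paper's specific algorithm.

```python
import numpy as np

def haar2(x):
    # One-level 2-D Haar DWT: returns approximation (ll) and detail (lh, hl, hh) subbands
    a = (x[0::2, :] + x[1::2, :]) / 2; d0 = (x[0::2, :] - x[1::2, :]) / 2
    ll = (a[:, 0::2] + a[:, 1::2]) / 2; lh = (a[:, 0::2] - a[:, 1::2]) / 2
    hl = (d0[:, 0::2] + d0[:, 1::2]) / 2; hh = (d0[:, 0::2] - d0[:, 1::2]) / 2
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    # Exact inverse of haar2
    a = np.zeros((ll.shape[0], ll.shape[1] * 2)); d0 = np.zeros_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d0[:, 0::2], d0[:, 1::2] = hl + hh, hl - hh
    x = np.zeros((a.shape[0] * 2, a.shape[1]))
    x[0::2, :], x[1::2, :] = a + d0, a - d0
    return x

def dwt_fuse(img1, img2):
    # DWT fusion: average the approximations, keep the larger-magnitude details
    c1, c2 = haar2(img1), haar2(img2)
    fused = [(c1[0] + c2[0]) / 2]
    for d1, d2 in zip(c1[1:], c2[1:]):
        fused.append(np.where(np.abs(d1) >= np.abs(d2), d1, d2))
    return ihaar2(*fused)

rng = np.random.default_rng(4)
x = rng.random((8, 8))
print(np.allclose(dwt_fuse(x, x), x))  # → True  (fusing an image with itself is lossless)
```

    Selecting the larger detail coefficient preserves edges from whichever source image is sharper at each location, which is the main advantage of transform-domain fusion over plain pixel averaging.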

  5. A hybrid fringe analysis technique for the elimination of random noise in interferometric wrapped phase maps

    Science.gov (United States)

    Bhat, Gopalakrishna K.

    1994-10-01

    A fringe analysis technique is presented that makes use of the spatial filtering property of the Fourier transform method to eliminate random impulsive noise in wrapped phase maps obtained with the phase stepping technique. Phase noise is converted into intensity noise by transforming the wrapped phase map into a continuous fringe pattern inside the digital image processor. The Fourier transform method is employed to filter out the intensity noise and recover a clean wrapped phase map. Computer-generated carrier fringes are used to preserve the sign information. This technique makes the two-dimensional phase unwrapping process less involved, because it eliminates local phase fluctuations, which act as pseudo 2π discontinuities. The technique is applied to the elimination of noise in a phase map obtained using electro-optic holography.
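    A simplified numerical sketch of the idea follows: the wrapped phase is mapped to a complex fringe field (turning phase noise into intensity noise), low-pass filtered via the 2-D FFT, and the cleaned wrapped phase recovered. The carrier fringes of the original method are omitted here, and the quadratic test phase and spike model are assumptions.

```python
import numpy as np

def filter_wrapped_phase(phi, keep=0.08):
    # Form a complex fringe field, low-pass filter it in the Fourier domain,
    # and recover a cleaned wrapped phase map (simplified: no carrier fringes).
    field = np.exp(1j * phi)
    F = np.fft.fftshift(np.fft.fft2(field))
    ny, nx = phi.shape
    Y, X = np.ogrid[:ny, :nx]
    mask = ((Y - ny / 2) ** 2 + (X - nx / 2) ** 2) <= (keep * min(ny, nx)) ** 2
    return np.angle(np.fft.ifft2(np.fft.ifftshift(F * mask)))

# Smooth quadratic phase, wrapped, with impulsive phase noise added
ny, nx = 64, 64
y, x = np.mgrid[:ny, :nx]
true = np.angle(np.exp(1j * 2e-3 * ((x - 32) ** 2 + (y - 32) ** 2)))
rng = np.random.default_rng(5)
noisy = true.copy()
spikes = rng.random(true.shape) < 0.05
noisy[spikes] = rng.uniform(-np.pi, np.pi, spikes.sum())

cleaned = filter_wrapped_phase(noisy)
err = lambda a: np.mean(np.abs(np.angle(np.exp(1j * (a - true)))))
print(err(noisy), err(cleaned))  # mean wrapped-phase error drops after filtering
```

    Working on the complex field rather than on the phase itself is what makes the filtering legitimate: the field is continuous across 2π wrap lines, so the low-pass filter removes impulsive noise without smearing the genuine discontinuities of the wrapped map.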

  6. Rhinoplasty - analysis of the techniques used in a service in the south of Brazil

    Directory of Open Access Journals (Sweden)

    Pasinato, Rogério C

    2008-09-01

    Introduction: In rhinoplasty, as in other surgeries, adequate exposure of the manipulated structures is essential for a good surgical result. Various techniques are used, and these may vary mainly because of the anatomical alterations found. Objective: To evaluate the most common surgical techniques and maneuvers used in our service. Method: Retrospective analysis of the surgical descriptions of patients submitted to rhinoplasty in the Otorhinolaryngology Department of the Clinical Hospital - UFPR in 2007. Results: 79 patients were evaluated; rhinoplasty with the basic technique was performed in 86% of them, while the delivery and external rhinoplasty approaches were used in 6.4% and 7.6% of cases, respectively. Conclusion: In our service, basic technique rhinoplasty was performed in the great majority of patients.

  7. Exploring the potential of data mining techniques for the analysis of accident patterns

    DEFF Research Database (Denmark)

    Prato, Carlo Giacomo; Bekhor, Shlomo; Galtzur, Ayelet

    2010-01-01

    Research in road safety faces major challenges: individuation of the most significant determinants of traffic accidents, recognition of the most recurrent accident patterns, and allocation of resources necessary to address the most relevant issues. This paper intends to comprehend which data mining...... and association rules) data mining techniques are implemented for the analysis of traffic accidents occurred in Israel between 2001 and 2004. Results show that descriptive techniques are useful to classify the large amount of analyzed accidents, even though introduce problems with respect to the clear...... importance of input and intermediate neurons, and the relative importance of hundreds of association rules. Further research should investigate whether limiting the analysis to fatal accidents would simplify the task of data mining techniques in recognizing accident patterns without the “noise” probably...

  8. THE RESEARCH TECHNIQUES FOR ANALYSIS OF MECHANICAL AND TRIBOLOGICAL PROPERTIES OF COATING-SUBSTRATE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kinga CHRONOWSKA-PRZYWARA

    2014-06-01

    The article presents research techniques for the analysis of both the mechanical and the tribological properties of thin coatings applied to highly loaded machine elements. At the Institute of Machine Design and Exploitation, AGH University of Science and Technology, second-cycle Mechanical Engineering students study tribology in laboratory classes. Students learn techniques for the mechanical and tribological testing of thin, hard coatings deposited by PVD and CVD technologies. The laboratory program includes micro- and nanohardness and Young's modulus measurements by instrumented indentation, and analysis of coating-to-substrate adhesion by scratch testing. The tribological properties of the coating-substrate systems are studied using various techniques, mainly under point-contact load conditions with ball-on-disc and block-on-ring tribometers, as well as by the ball cratering method in strongly abrasive suspensions.

  9. Advanced analysis technique for the evaluation of linear alternators and linear motors

    Science.gov (United States)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  10. Standard Test Method for Oxygen Content Using a 14-MeV Neutron Activation and Direct-Counting Technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This test method covers the measurement of oxygen concentration in almost any matrix by using a 14-MeV neutron activation and direct-counting technique. Essentially the same system may be used to determine oxygen concentrations ranging from over 50 % to about 10 μg/g, or less, depending on the sample size and available 14-MeV neutron fluence rates. Note 1 - The range of analysis may be extended by using higher neutron fluence rates, larger samples, and higher counting efficiency detectors. 1.2 This test method may be used on either solid or liquid samples, provided that they can be made to conform in size, shape, and macroscopic density during irradiation and counting to a standard sample of known oxygen content. Several variants of this method have been described in the technical literature. A monograph is available which provides a comprehensive description of the principles of activation analysis using a neutron generator (1). 1.3 The values stated in either SI or inch-pound units are to be regarded...

  11. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    Directory of Open Access Journals (Sweden)

    Peeyush Sahay

    2009-10-01

    Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving it from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have the high sensitivity and high selectivity equivalently offered by the MS-based techniques, but also the advantageous features of near real-time response, low instrument cost, and POC operation. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions, and the detection limits achieved by the laser techniques range from parts-per-million to parts-per-billion levels. Sensors using laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide and nitric oxide, are commercially available. This review presents an update on the latest developments in laser-based breath analysis.

  12. Analysis of umayyad islamic silver coins (Dirhams) by using instrumental neutron activation analysis

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

Islamic silver coins (Dirhams) spanning the period 107 to 126 Hijri (726-743 AD), which belongs to the Umayyad Empire period, 41-132 Hijri (661-750 AD), were selected for analysis using instrumental neutron activation analysis techniques. During this period (105-126 H, 724-743 AD), the Caliph Hisham Eben Abdlemalek ruled the Umayyad Empire. Dirhams were irradiated in a reactor neutron activation facility. Levels of various elements, viz. Cu, Ag, and Au, were estimated. It was found that the average silver concentration, the base constituent of the Dirham, was about 88 wt%. The correlation between the composition of the Dirhams and its historical implications is discussed.
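In instrumental neutron activation analysis, element concentrations are commonly obtained by the relative (comparator) method: the gamma-peak counts of the coin are compared with those of a co-irradiated standard of known composition. The sketch below assumes this simplified relation (ignoring decay and timing corrections), and all counts and masses are made-up illustrative numbers.

```python
# Comparator method for INAA: mass fraction in the sample from the ratio of
# its gamma-peak counts to those of a co-irradiated standard.
def naa_concentration(c_std, counts_sample, counts_std, m_sample, m_std):
    """Mass fraction in the sample via the comparator relation
    c_sample = c_std * (counts_sample / counts_std) * (m_std / m_sample)."""
    return c_std * (counts_sample / counts_std) * (m_std / m_sample)

# Illustrative example: Ag gamma-peak counts for a dirham vs. a
# pure-silver standard of the same mass.
c_ag = naa_concentration(c_std=1.0, counts_sample=8.8e5, counts_std=1.0e6,
                         m_sample=2.9, m_std=2.9)
print(f"Ag mass fraction = {c_ag:.2f}")  # 0.88, i.e. ~88 wt%
```

With equal sample and standard masses, the mass fraction reduces to the ratio of the peak counts, here consistent with the ~88 wt% silver reported in the abstract.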

  13. Analysis techniques for multivariate root loci. [a tool in linear control systems

    Science.gov (United States)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
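A multivariable root locus traces the closed-loop poles of a multi-input, multi-output plant as a scalar loop gain varies. The sketch below illustrates the idea with a made-up two-state plant under output feedback u = -k*y, where the closed-loop poles are the eigenvalues of A - k*B*C; it is a toy example, not the paper's generalized-eigenvalue algorithm.

```python
# Multivariable root locus sketch: closed-loop poles of a 2x2 state-space
# plant under output feedback u = -k*y, traced as the gain k increases.
import numpy as np

# Illustrative plant matrices (open-loop poles at -1 and -2).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.eye(2)
C = np.eye(2)

def closed_loop_poles(k):
    """Eigenvalues of A - k*B*C for scalar loop gain k."""
    return np.linalg.eigvals(A - k * B @ C)

for k in (0.0, 1.0, 10.0):
    poles = np.sort(closed_loop_poles(k).real)
    print(f"k = {k:5.1f}: poles = {poles}")
```

At k = 0 the locus starts at the open-loop poles (-2, -1); as k grows the poles migrate left, here to (-3, -2) at k = 1 and (-12, -11) at k = 10.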

  14. Multiscale analysis of damage using dual and primal domain decomposition techniques

    NARCIS (Netherlands)

    Lloberas-Valls, O.; Everdij, F.P.X.; Rixen, D.J.; Simone, A.; Sluys, L.J.

    2014-01-01

In this contribution, dual and primal domain decomposition techniques are studied for the multiscale analysis of failure in quasi-brittle materials. The multiscale strategy essentially consists in decomposing the structure into a number of nonoverlapping domains and considering a refined spatial resolution …

  15. A borax fusion technique for quantitative X-ray fluorescence analysis

    NARCIS (Netherlands)

    Willigen, van J.H.H.G.; Kruidhof, H.; Dahmen, E.A.M.F.

    1971-01-01

A borax fusion technique to cast glass discs for quantitative X-ray analysis is described in detail. The method is based on the "nonwetting" properties of a Pt/Au alloy towards molten borax, on the favourable composition of the flux, and finally on the favourable form of the casting mould. The critical …

  16. An evaluation of directional analysis techniques for multidirectional, partially reflected waves .1. numerical investigations

    DEFF Research Database (Denmark)

    Ilic, C; Chadwick, A; Helm-Petersen, Jacob

    2000-01-01

Recent studies of advanced directional analysis techniques have mainly centred on incident wave fields. In the study of coastal structures, however, partially reflective wave fields are commonly present. In the near-structure field, phase-locked methods can be successfully applied. In the far field …

  17. Two-Stage MAS Technique for Analysis of DRA Elements and Arrays on Finite Ground Planes

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal; Breinbjerg, Olav

    2007-01-01

A two-stage Method of Auxiliary Sources (MAS) technique is proposed for analysis of dielectric resonator antenna (DRA) elements and arrays on finite ground planes (FGPs). The problem is solved by first analysing the DRA on an infinite ground plane (IGP) and then using this solution to model the FGP problem.

  18. Applications of Modern Analysis Techniques in Searching back Ancient Art Ceramic Technologies

    Directory of Open Access Journals (Sweden)

    Nguyen Quang Liem

    2011-12-01

This report highlights promising applications of modern analysis techniques, such as scanning electron microscopy, X-ray fluorescence, X-ray diffraction, Raman scattering spectroscopy, and thermal expansion measurement, in tracing ancient art ceramic technologies.

  19. Depletive stripping chronopotentiometry : a major step forward in electrochemical stripping techniques for metal ion speciation analysis

    NARCIS (Netherlands)

    Town, R.M.; Leeuwen, van H.P.

    2004-01-01

A comparative evaluation of the utility of the various modes of stripping chronopotentiometry (SCP) for trace metal speciation analysis is presented in the broad context of stripping voltammetric techniques. The remarkable fundamental advantages of depletive SCP at scanned deposition potential (SSCP) …

  20. Evaluation of syngas production unit cost of bio-gasification facility using regression analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Yangyang; Parajuli, Prem B.

    2011-08-10

Evaluation of the economic feasibility of a bio-gasification facility requires an understanding of its unit cost under different production capacities. The objective of this study was to evaluate the unit cost of syngas production at capacities from 60 through 1800 Nm³/h using an economic model with three regression analysis techniques (simple regression, reciprocal regression, and log-log regression). The preliminary results of this study showed that the reciprocal regression technique gave the best-fit curve between unit cost and production capacity, with a sum of error squares (SES) lower than 0.001 and a coefficient of determination (R²) of 0.996. The regression analysis techniques determined a minimum unit cost of syngas production for micro-scale bio-gasification facilities of $0.052/Nm³, at a capacity of 2,880 Nm³/h. The results of this study suggest that, to reduce cost, facilities should run at high production capacity. In addition, this technique could serve as a new categorical criterion for evaluating micro-scale bio-gasification facilities from the perspective of economic analysis.
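Reciprocal regression fits unit cost against capacity with the model cost = a + b/capacity, which can be estimated by ordinary least squares after transforming capacity to its reciprocal. The sketch below illustrates this with made-up data points shaped like the study's cost-capacity relationship, not the study's actual data.

```python
# Reciprocal regression of syngas unit cost vs. production capacity:
# fit cost = a + b / capacity by OLS on the transformed regressor 1/capacity,
# then compute the coefficient of determination R^2.
import numpy as np

# Illustrative data (capacity in Nm^3/h, unit cost in $/Nm^3).
capacity = np.array([60.0, 120.0, 300.0, 600.0, 1200.0, 1800.0])
unit_cost = np.array([0.50, 0.28, 0.15, 0.10, 0.075, 0.067])

# Design matrix: intercept column and 1/capacity column.
X = np.column_stack([np.ones_like(capacity), 1.0 / capacity])
(a, b), *_ = np.linalg.lstsq(X, unit_cost, rcond=None)

# Goodness of fit.
pred = X @ np.array([a, b])
ss_res = np.sum((unit_cost - pred) ** 2)
ss_tot = np.sum((unit_cost - unit_cost.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"cost = {a:.4f} + {b:.2f}/capacity, R^2 = {r2:.3f}")
```

The fitted model captures the study's qualitative finding: unit cost falls steeply at small capacities and flattens toward the intercept a as capacity grows, which is why high production capacity minimizes cost.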